Downloads a website for offline viewing, preserving its structure and converting links to work locally
Website Downloader is a powerful tool that allows you to download complete websites for offline access. It preserves the original website structure, converts links to work locally, and includes all necessary resources like CSS, images, and scripts. Built on top of the reliable wget utility, this tool provides a simple interface to recursively download websites with customizable depth settings. It's perfect for creating local archives, offline documentation, or preserving web content that might change or disappear.
Website Downloader is an MCP server that provides a straightforward way to download entire websites for offline viewing. It leverages the powerful wget utility to recursively download web pages while maintaining their structure and functionality.
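Under the hood, each download request is handed off to wget in recursive mirroring mode. The exact flags the server passes are an implementation detail, but a command along the following lines (shown purely as an illustration, with example.com and the output directory as placeholders) produces the same kind of offline copy:

wget --recursive --page-requisites --convert-links --no-clobber --directory-prefix=./example-site https://example.com

Here --recursive follows links, --page-requisites pulls in CSS, images, and scripts, and --convert-links rewrites URLs so the saved pages work locally.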
Before using this MCP server, you need to have wget installed on your system:

macOS (using Homebrew):
brew install wget

Debian/Ubuntu:
sudo apt-get update
sudo apt-get install wget

Fedora/RHEL:
sudo dnf install wget

Windows (using Chocolatey):
choco install wget
Or download the binary from https://eternallybored.org/misc/wget/ and place it in a directory that's in your PATH.
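To confirm that wget is installed and reachable from your PATH, run:

wget --version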
To install the Website Downloader server itself, clone the repository and build it:

git clone https://github.com/pskill9/website-downloader.git
cd website-downloader
npm install
npm run build
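Once built, the server is registered with an MCP client so the client can launch it on demand. The exact settings file and key names depend on your client; the entry below is a sketch that assumes the compiled entry point is build/index.js inside the cloned repository (adjust the path to match your setup):

{
  "mcpServers": {
    "website-downloader": {
      "command": "node",
      "args": ["/path/to/website-downloader/build/index.js"]
    }
  }
}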
The Website Downloader provides a download_website tool that accepts the following parameters:
url (required): The URL of the website to download
outputPath (optional): The directory where the website should be downloaded (defaults to current directory)
depth (optional): Maximum depth level for recursive downloading (defaults to infinite)

Setting the depth parameter can be useful to limit the scope of the download:

0: Download only the specified page
1: Download the specified page and all directly linked pages
2: Download two levels deep

When the depth parameter is omitted, the tool will download the entire website structure.
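As a concrete illustration, a call to the download_website tool with a limited depth would carry arguments like the following (the URL and output path are placeholders; the parameter names are the ones listed above):

{
  "url": "https://example.com",
  "outputPath": "./example-archive",
  "depth": 1
}

This downloads the page at https://example.com and every page it links to directly, along with the resources needed to render them offline.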