
Website Downloader MCP Server

Tags: Command Line, JavaScript
Download entire websites with preserved structure for offline viewing
Available Tools

download_website

Downloads a website for offline viewing, preserving its structure and converting links to work locally

Parameters: url, outputPath, depth

Website Downloader is a powerful tool that allows you to download complete websites for offline access. It preserves the original website structure, converts links to work locally, and includes all necessary resources like CSS, images, and scripts. Built on top of the reliable wget utility, this tool provides a simple interface to recursively download websites with customizable depth settings. It's perfect for creating local archives, offline documentation, or preserving web content that might change or disappear.

Overview

Website Downloader is an MCP server that provides a straightforward way to download entire websites for offline viewing. It leverages the powerful wget utility to recursively download web pages while maintaining their structure and functionality.

Prerequisites

Before using this MCP server, you need to have wget installed on your system:

macOS

brew install wget

Linux (Debian/Ubuntu)

sudo apt-get update
sudo apt-get install wget

Linux (Red Hat/Fedora)

sudo dnf install wget

Windows

Using Chocolatey:

choco install wget

Or download the binary from https://eternallybored.org/misc/wget/ and place it in a directory that's in your PATH.
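Before moving on, it can help to confirm that wget is actually reachable from your PATH (a quick check that works in any POSIX shell):

```shell
# Print the installed wget version, or a hint if it is missing
if command -v wget >/dev/null 2>&1; then
  wget --version | head -n 1
else
  echo "wget not found on PATH -- install it first" >&2
fi
```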

Installation

  1. Clone the repository:
git clone https://github.com/pskill9/website-downloader.git
cd website-downloader
  2. Install dependencies and build the server:
npm install
npm run build
  3. Add the server to your MCP client configuration.
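A minimal client configuration sketch for the last step (the server name, and the assumption that the build step emits build/index.js, are illustrative; adjust both to your setup and to your MCP client's config format):

```json
{
  "mcpServers": {
    "website-downloader": {
      "command": "node",
      "args": ["/path/to/website-downloader/build/index.js"]
    }
  }
}
```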

Features

  • Recursive downloading: Automatically follows links to download entire websites
  • Preserves website structure: Maintains the original directory structure
  • Converts links: Updates links to work locally
  • Includes all resources: Downloads CSS, images, JavaScript, and other required files
  • Domain restriction: Only downloads content from the specified domain
  • Customizable depth: Control how many levels deep the crawler should go

Usage

The Website Downloader provides a download_website tool that accepts the following parameters:

  • url (required): The URL of the website to download
  • outputPath (optional): The directory where the website should be downloaded (defaults to current directory)
  • depth (optional): Maximum depth level for recursive downloading (defaults to infinite)

Setting the depth parameter can be useful to limit the scope of the download:

  • 0: Download only the specified page
  • 1: Download the specified page and all directly linked pages
  • 2: Download two levels deep
  • etc.

When the depth parameter is omitted, the tool will download the entire website structure.
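Since the server wraps wget, the parameters above plausibly map onto wget's recursion flags. A minimal sketch of that mapping (buildWgetArgs is a hypothetical helper for illustration, not the server's actual code, and the real flag set may differ):

```javascript
// Hypothetical mapping from download_website parameters to wget arguments.
function buildWgetArgs({ url, outputPath = ".", depth }) {
  const args = [
    "--page-requisites",         // fetch CSS, images, and scripts each page needs
    "--convert-links",           // rewrite links so they work offline
    "--no-parent",               // stay inside the starting directory
    "--directory-prefix", outputPath,
  ];
  if (depth === undefined) {
    // No depth given: recurse without limit (wget treats --level=inf as unbounded).
    args.push("--recursive", "--level", "inf");
  } else if (depth > 0) {
    args.push("--recursive", "--level", String(depth));
  }
  // depth === 0: no --recursive flag, so only the specified page is fetched.
  args.push(url);
  return args;
}

console.log(buildWgetArgs({ url: "https://example.com", depth: 1 }).join(" "));
```

Note that wget itself interprets --level=0 as unlimited, which is why a depth of 0 is modeled here by dropping --recursive entirely.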

Related MCPs

iTerm Terminal Control
Tags: Command Line, TypeScript

Execute and interact with commands in your active iTerm terminal session

Command Runner
Tags: Command Line, TypeScript

Run shell commands directly from your AI assistant

CLI Command Executor
Tags: Command Line, Python

Secure command-line interface with customizable security policies

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
