
Apify Actors MCP Server

Cloud Platforms · TypeScript
Run Apify Actors through the Model Context Protocol
Available Tools

actor.list

Lists available Apify Actors that can be run

actor.run

Runs an Apify Actor with the specified parameters and returns the results

help

Provides information about available tools and how to use them

Apify Actors MCP Server provides a bridge between AI assistants and Apify's ecosystem of web scraping and automation tools. It implements the Model Context Protocol (MCP) standard, allowing AI models to discover and execute Apify Actors, cloud-based programs that can scrape websites, process data, and automate various web tasks. This integration enables AI assistants to perform complex web operations without requiring users to write code.


Prerequisites

  • Node.js 18 or newer
  • An Apify account and API token (sign up at apify.com if you don't have one)

Installation

You can install the Apify Actors MCP Server in one of two ways:

Option 1: Using npm

npm install @apify/actors-mcp-server
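
If you prefer to have the server command available globally on your PATH rather than as a project dependency, a global install also works:

npm install -g @apify/actors-mcp-server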

Option 2: Using Docker

docker pull apify/actors-mcp-server

Configuration

The server requires an Apify API token to function. You can provide it via an environment variable:

  1. Create a .env file in your project root (based on the .env.example template)
  2. Add your Apify token:
    APIFY_TOKEN=your_apify_token_here
    

Running the Server

Using npm

npx @apify/actors-mcp-server

Using Docker

docker run -p 3000:3000 -e APIFY_TOKEN=your_apify_token_here apify/actors-mcp-server

By default, the server runs on port 3000. You can customize this by setting the PORT environment variable.
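
For example, to start the server on a different port (in a Unix-like shell):

PORT=8080 npx @apify/actors-mcp-server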

Integrating with AI Assistants

To connect an AI assistant to the Apify Actors MCP Server, you'll need to configure the assistant to use the server's endpoint. The exact configuration depends on the AI platform you're using.

For example, to integrate with Claude or other compatible assistants, add the server to your configuration:

{
  "mcpServers": {
    "apify-actors": {
      "url": "http://localhost:3000"
    }
  }
}
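
Some MCP clients launch servers as local subprocesses rather than connecting to an already-running HTTP endpoint. In that case the entry typically points at a command instead of a URL; the exact schema depends on the client, so treat the following as an illustrative sketch rather than the definitive format:

{
  "mcpServers": {
    "apify-actors": {
      "command": "npx",
      "args": ["@apify/actors-mcp-server"],
      "env": {
        "APIFY_TOKEN": "your_apify_token_here"
      }
    }
  }
}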

Usage

Once integrated, AI assistants can discover available Apify Actors and execute them. The server provides tools for:

  1. Discovering available Actors
  2. Running Actors with specific parameters
  3. Retrieving results from Actor runs

The AI assistant can use these capabilities to perform web scraping, data processing, and other automation tasks based on user requests.
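
As a concrete illustration, here is a minimal sketch of how a programmatic MCP client could discover and call these tools using the MCP TypeScript SDK. The /sse path, the actorId and input argument names, and the example Actor are assumptions for illustration only; consult the server's tool schemas for the actual parameter names. The snippet assumes Node.js 18+ running as an ES module (for top-level await).

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Connect to the locally running server (the /sse path is an assumption;
// check which transport endpoint your server version exposes).
const transport = new SSEClientTransport(new URL("http://localhost:3000/sse"));
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Discover the tools the server exposes (e.g. actor.list, actor.run, help).
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Run an Actor. The Actor ID and input below are purely illustrative.
const result = await client.callTool({
  name: "actor.run",
  arguments: {
    actorId: "apify/website-content-crawler",
    input: { startUrls: [{ url: "https://example.com" }] },
  },
});
console.log(result);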

Advanced Configuration

The server supports several advanced configuration options through environment variables:

  • PORT: The port on which the server listens (default: 3000)
  • APIFY_API_BASE_URL: Custom Apify API URL (default: https://api.apify.com)
  • LOG_LEVEL: Controls verbosity of logs (options: debug, info, warn, error)
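
Combining these with the required token, a .env file might look like the following (all values shown are examples):

APIFY_TOKEN=your_apify_token_here
PORT=8080
APIFY_API_BASE_URL=https://api.apify.com
LOG_LEVEL=debug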

For more detailed information and advanced usage scenarios, visit the official documentation.


