- Search for Intercom conversations with advanced filtering options
- Find all conversations associated with a specific customer by email
- Search conversations using Intercom's search API with advanced filtering
The Intercom Integration MCP provides AI assistants with seamless access to customer support conversations and tickets stored in Intercom. It enables advanced searching and filtering capabilities, allowing you to find specific customer interactions by email, status, date range, and keywords. With server-side filtering via Intercom's search API, this integration delivers efficient performance even with large datasets. It's designed to help support teams, product managers, and customer success professionals gain deeper insights from their customer communications.
The Intercom Integration MCP server connects AI assistants to your Intercom customer support data, enabling powerful analysis and retrieval of conversations and tickets. This integration is particularly valuable for support teams, customer success managers, and product teams who need to analyze customer feedback and support interactions.
Before installing this MCP server, you'll need:
If you prefer to install directly via NPM:
```shell
# Install the package globally
npm install -g mcp-server-for-intercom

# Set your Intercom API token
export INTERCOM_ACCESS_TOKEN="your_token_here"

# Run the server
intercom-mcp
```
The Docker approach provides a more isolated and consistent environment:
```shell
# Build the Docker image
docker build -t mcp-intercom .

# Run the container with your API token
docker run --rm -it -p 3000:3000 -p 8080:8080 \
  -e INTERCOM_ACCESS_TOKEN="your_token_here" \
  mcp-intercom:latest
```
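If you manage services with Docker Compose, an equivalent service definition might look like the sketch below. This is an assumption-based example using the image name built above; adjust the ports and token to match your setup:

```yaml
services:
  mcp-intercom:
    image: mcp-intercom:latest
    ports:
      - "3000:3000"
      - "8080:8080"
    environment:
      INTERCOM_ACCESS_TOKEN: "your_token_here"
```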
If you prefer a version without Glama-specific dependencies:
```shell
# Build the standard image
docker build -t mcp-intercom-standard -f Dockerfile.standard .

# Run the standard container
docker run --rm -it -p 3000:3000 \
  -e INTERCOM_ACCESS_TOKEN="your_token_here" \
  mcp-intercom-standard:latest
```
The MCP server supports several environment variables for customization:
- `INTERCOM_ACCESS_TOKEN` (required): Your Intercom API token
- `PORT` (optional): The port on which to run the MCP server (default: 3000)
- `DEFAULT_DATE_RANGE_DAYS` (optional): Default number of days to look back when searching (default varies by endpoint)
- `MAX_RESULTS_LIMIT` (optional): Maximum number of results to return (default varies by endpoint)
- `KEYWORD_FILTERS` (optional): Default keyword filters to apply to searches

After starting the server, you can verify it's working correctly with:
```shell
# Test the server status (if using the Glama-compatible version)
curl -v http://localhost:8080/.well-known/glama.json

# Test the MCP endpoint
curl -X POST -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"mcp.capabilities"}' \
  http://localhost:3000
```
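Once the capabilities check succeeds, you can exercise a search tool directly. The payload below is a sketch: the tool name `search_conversations_by_customer` and the `customerEmail` argument are assumptions, so confirm the actual names against your server's capabilities response before sending:

```shell
# Write a hypothetical JSON-RPC tool-call payload to a file.
# Tool and argument names are assumptions -- verify them against
# the capabilities your server reports.
cat > /tmp/intercom_search.json <<'EOF'
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search_conversations_by_customer",
    "arguments": { "customerEmail": "customer@example.com" }
  }
}
EOF

# Confirm the payload is well-formed JSON before sending it.
python3 -m json.tool /tmp/intercom_search.json

# Then POST it to the running server:
# curl -X POST -H "Content-Type: application/json" \
#   -d @/tmp/intercom_search.json http://localhost:3000
```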
Once your server is running, you can connect it to MCP-compatible AI assistants. The server exposes several tools for searching and analyzing Intercom data, allowing the AI to retrieve relevant customer conversations based on various criteria.
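For clients that read a JSON configuration of MCP servers, registering the NPM-installed binary might look like this sketch (the exact file location and schema depend on your client, so treat the structure below as an assumption):

```json
{
  "mcpServers": {
    "intercom": {
      "command": "intercom-mcp",
      "env": {
        "INTERCOM_ACCESS_TOKEN": "your_token_here"
      }
    }
  }
}
```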
The integration is particularly useful for: