
LlamaCloud Index Connector MCP Server

Knowledge & Memory · TypeScript
Connect to and search multiple managed indexes on LlamaCloud
Available Tools

get_information_index_name

Searches a specific LlamaCloud index for information. The actual tool name is dynamically generated based on the index name provided in the configuration.

Parameters: query
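
As a rough illustration, the tool definition generated for a hypothetical index named "company-docs" might look something like the sketch below. The name, description, and schema shown here are assumptions for illustration; the actual definition is produced by the server.

// Illustrative sketch only: the real tool definition is generated by the server.
// "company-docs" is a hypothetical index name used for this example.
const exampleTool = {
  name: "get_information_company_docs",
  description: "Get information from the company-docs index",
  inputSchema: {
    type: "object",
    properties: {
      query: { type: "string", description: "The search query" },
    },
    required: ["query"],
  },
};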

LlamaCloud Index Connector provides a seamless way to access and search through multiple managed indexes hosted on LlamaCloud. Each index becomes a separate tool that can be queried independently, allowing AI assistants to retrieve specific information from different knowledge bases. This MCP server creates dynamic tools based on your configuration, with each tool connecting to a specific LlamaCloud index. This makes it easy to organize and access different datasets, such as company documents, technical documentation, or any other information stored in your LlamaCloud indexes.

Overview

LlamaCloud Index Connector allows you to connect AI assistants to multiple managed indexes on LlamaCloud. Each index becomes a separate tool that can be queried independently, making it easy to organize and access different knowledge bases.

Prerequisites

To use this MCP server, you'll need:

  1. A LlamaCloud account with one or more indexes created
  2. Your LlamaCloud project name
  3. Your LlamaCloud API key

Installation

To add the LlamaCloud Index Connector to your MCP client (such as Claude Desktop, Windsurf, or Cursor), you'll need to update your MCP client configuration.

Configuration Location

For Claude Desktop:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

Configuration Format

Add the following to your MCP client configuration:

{
  "mcpServers": {
    "llamacloud": {
      "command": "npx",
      "args": [
        "-y",
        "@llamaindex/mcp-server-llamacloud",
        "--index",
        "YOUR_INDEX_NAME_1",
        "--description",
        "Description of your first index",
        "--index",
        "YOUR_INDEX_NAME_2",
        "--description",
        "Description of your second index"
      ],
      "env": {
        "LLAMA_CLOUD_PROJECT_NAME": "YOUR_PROJECT_NAME",
        "LLAMA_CLOUD_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

Replace:

  • YOUR_INDEX_NAME_1, YOUR_INDEX_NAME_2: Names of your LlamaCloud indexes
  • YOUR_PROJECT_NAME: Your LlamaCloud project name
  • YOUR_API_KEY: Your LlamaCloud API key

You can add as many indexes as you need by adding additional pairs of --index and --description arguments.

Usage

Once configured, the MCP server will create a separate tool for each index you've defined. The tool names are automatically generated based on the index names, following the format get_information_index_name.

For example, if you defined an index named "10k-SEC-Tesla", the tool would be named get_information_10k_SEC_Tesla.
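
The exact name-sanitization rule is internal to the server, but a minimal sketch of the idea, assuming non-alphanumeric characters are replaced with underscores (the server's actual rule may differ), looks like this:

// Assumed sanitization for illustration only; the server's actual rule may differ.
function toolNameForIndex(indexName: string): string {
  return `get_information_${indexName.replace(/[^a-zA-Z0-9]/g, "_")}`;
}

toolNameForIndex("10k-SEC-Tesla"); // => "get_information_10k_SEC_Tesla"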

To use a tool, simply ask the AI assistant to search for information in a specific index. For example:

"Can you search the Tesla SEC documents for information about their revenue in 2023?"

The AI assistant will use the appropriate tool to query the index and return the relevant information.
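
Behind the scenes, the assistant issues a standard MCP tools/call request to the server. A hedged sketch of that request for the Tesla example above (the assistant constructs this automatically; you never write it by hand):

// Illustrative tools/call request for the example above.
// The tool name and query text depend on your indexes and your question.
const exampleRequest = {
  method: "tools/call",
  params: {
    name: "get_information_10k_SEC_Tesla",
    arguments: {
      query: "Tesla revenue in 2023",
    },
  },
};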

Troubleshooting

If you encounter issues:

  1. Verify your LlamaCloud API key and project name are correct (a quick check is sketched after this list)
  2. Ensure your indexes exist in your LlamaCloud project
  3. Check that the index names in your configuration match exactly with those in LlamaCloud
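
Normally the two required values are supplied through the env block of your MCP client configuration. If you run the server manually while debugging, a minimal sketch (assuming a Node.js environment) to confirm the variables the server expects are actually set:

// Minimal sanity check (illustrative): confirm the environment variables
// the server expects are visible in the environment that launches it.
for (const name of ["LLAMA_CLOUD_PROJECT_NAME", "LLAMA_CLOUD_API_KEY"]) {
  if (!process.env[name]) {
    console.error(`Missing environment variable: ${name}`);
  }
}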

For development and debugging purposes, you can use the MCP Inspector:

npx @modelcontextprotocol/inspector

This will provide a URL to access debugging tools in your browser.

Related MCPs

Knowledge Graph Memory
Knowledge & Memory · TypeScript

A persistent memory system using a local knowledge graph

MemoryMesh
Knowledge & Memory · TypeScript

A knowledge graph server for structured memory persistence in AI models

Cognee
Knowledge & Memory · Python

Knowledge management and retrieval system with code graph capabilities

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
