
Langflow Document Q&A MCP Server

Knowledge & Memory · TypeScript
Query documents through a Langflow backend
Available Tools

query_docs

Query the document Q&A system with a question. Input parameter: query.

Langflow Document Q&A is a TypeScript-based MCP server that implements a document question-answering system. It connects to a Langflow backend to provide a simple interface for querying documents, demonstrating core Model Context Protocol concepts in a practical application. This server allows you to leverage Langflow's document processing capabilities directly through Claude or other MCP-compatible clients.

Overview

The server acts as a bridge to a running Langflow instance: each question sent through the query_docs tool is forwarded to your Langflow Document Q&A flow, and the flow's answer is returned to the MCP client. This lets you query any documents that the flow has already processed.
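
To make the pieces concrete, here is a minimal sketch of how a server like this can be wired up with the MCP TypeScript SDK. It is illustrative rather than a copy of the actual implementation: the helper name queryLangflow, the Langflow request body fields, and the way the answer is extracted are assumptions.

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Langflow run endpoint, e.g. http://127.0.0.1:7860/api/v1/run/<flow-id>?stream=false
const API_ENDPOINT = process.env.API_ENDPOINT ?? "";
if (!API_ENDPOINT) {
  console.error("API_ENDPOINT environment variable is required");
  process.exit(1);
}

// Hypothetical helper: forward a question to the Langflow flow and return its answer.
async function queryLangflow(question: string): Promise<string> {
  const res = await fetch(API_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // Request body fields assumed from Langflow's run API; adjust to your flow.
    body: JSON.stringify({ input_value: question, input_type: "chat", output_type: "chat" }),
  });
  if (!res.ok) throw new Error(`Langflow request failed: ${res.status}`);
  // Where the answer lives in the response depends on the flow; raw JSON is a safe fallback.
  return JSON.stringify(await res.json());
}

const server = new McpServer({ name: "langflow-doc-qa-server", version: "0.1.0" });

// Expose a single query_docs tool that accepts a "query" string.
server.tool(
  "query_docs",
  "Query the document Q&A system with a question",
  { query: z.string() },
  async ({ query }) => ({
    content: [{ type: "text" as const, text: await queryLangflow(query) }],
  })
);

await server.connect(new StdioServerTransport());

The sketch reads API_ENDPOINT from the environment (matching the configuration below), registers the query_docs tool, and forwards each question to the Langflow run endpoint over HTTP.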

Prerequisites

Before using this MCP server, you need to set up a Langflow document Q&A flow:

  1. Open Langflow and create a new flow from the "Document Q&A" template
  2. Configure your flow with necessary components (ChatInput, File Upload, LLM, etc.)
  3. Save your flow
  4. Click the "API" button in the top right corner of Langflow
  5. Copy the API endpoint URL from the cURL command (format: http://127.0.0.1:7860/api/v1/run/<flow-id>?stream=false)
  6. Save this URL, as it will be needed for the API_ENDPOINT configuration (a quick way to check that the endpoint responds is sketched below)
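
Before wiring the endpoint into an MCP client, it can help to confirm that the flow answers requests at all. Below is a minimal sanity check, assuming Langflow's standard run request body (input_value / input_type / output_type); the exact fields your flow expects may differ.

// check-endpoint.ts -- run with: npx tsx check-endpoint.ts
const endpoint = "http://127.0.0.1:7860/api/v1/run/YOUR_FLOW_ID?stream=false"; // paste the URL copied from Langflow

const res = await fetch(endpoint, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  // Body fields assumed from Langflow's run API; adjust if your flow expects different inputs.
  body: JSON.stringify({
    input_value: "What is this document about?",
    input_type: "chat",
    output_type: "chat",
  }),
});

console.log(res.status, await res.text()); // expect 200 and a JSON payload containing the flow's answer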

Installation

Manual Installation

To use with Claude Desktop, add the server configuration to your Claude Desktop config file:

On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

{
  "mcpServers": {
    "langflow-doc-qa-server": {
      "command": "node",
      "args": [
        "/path/to/doc-qa-server/build/index.js"
      ],
      "env": {
        "API_ENDPOINT": "http://127.0.0.1:7860/api/v1/run/YOUR_FLOW_ID"
      }
    }
  }
}

Replace /path/to/doc-qa-server/build/index.js with the actual path to the built server and YOUR_FLOW_ID with your Langflow flow ID.

Installation via Smithery

For automatic installation via Smithery:

npx -y @smithery/cli install @GongRzhe/Langflow-DOC-QA-SERVER --client claude

During installation, you'll be prompted to provide your Langflow API endpoint.

Configuration

The server supports the following environment variables:

  • API_ENDPOINT (required): The URL of your Langflow flow's run API endpoint, for example http://127.0.0.1:7860/api/v1/run/<flow-id>?stream=false

Usage

Once installed, you can use the Document Q&A server in your conversations with Claude. To query documents, simply ask Claude to use the document Q&A tool to find information in your documents.

Example prompts:

  • "Can you use the document Q&A tool to find information about X in my documents?"
  • "Search my uploaded documents for information about Y"

Debugging

Since MCP servers communicate over stdio, debugging can be challenging. Use the MCP Inspector instead of attaching a debugger directly:

npm run inspector

The Inspector will provide a URL to access debugging tools in your browser.

Related MCPs

Knowledge Graph Memory (Knowledge & Memory · TypeScript)
A persistent memory system using a local knowledge graph

MemoryMesh (Knowledge & Memory · TypeScript)
A knowledge graph server for structured memory persistence in AI models

Cognee (Knowledge & Memory · Python)
Knowledge management and retrieval system with code graph capabilities

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
