LangChain Integration MCP Server

Developer Tools · Python
Integrate Model Context Protocol tools with LangChain

LangChain MCP Integration provides a bridge between the Model Context Protocol (MCP) and LangChain, allowing developers to use MCP tools within their LangChain applications. This integration enables AI models to interact with external tools and services through the standardized MCP interface while leveraging LangChain's powerful orchestration capabilities. The library offers a simple API to initialize MCP tools and convert them into LangChain-compatible tools that can be used in chains, agents, and other LangChain constructs. This makes it easy to extend AI applications with file system access, web browsing, database queries, and other capabilities provided by MCP servers.

Overview

LangChain MCP Integration allows you to use Model Context Protocol (MCP) tools within your LangChain applications. This integration bridges the gap between MCP's standardized tool interface and LangChain's agent framework.

Installation

Install the package using pip:

pip install langchain-mcp
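
The examples below also use the langchain and langchain-openai packages, plus Node.js (the filesystem server is launched through npx). If those are not already installed, something like this should cover the Python side:

pip install langchain langchain-openai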

Usage

To use LangChain MCP Integration, you need to:

  1. Set up an MCP server
  2. Create an MCP ClientSession
  3. Initialize the MCPToolkit
  4. Get the LangChain tools and use them in your application

Here's a basic example:

import pathlib

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

from langchain_mcp import MCPToolkit
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_openai_functions_agent

# Set up an MCP server (in this example, using the filesystem server)
server_params = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-filesystem", str(pathlib.Path(__file__).parent)]
)

async def run_with_mcp(prompt):
    # Create an MCP client session
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Initialize the MCPToolkit
            toolkit = MCPToolkit(session=session)
            await toolkit.initialize()
            
            # Get the LangChain tools
            tools = toolkit.get_tools()
            
            # Use the tools with your LangChain setup
            # For example, with a chat model and agent
            
            model = ChatOpenAI()  # requires OPENAI_API_KEY in the environment
            prompt_template = ChatPromptTemplate.from_messages([
                ("system", "You are a helpful assistant with access to tools."),
                ("human", "{input}")
            ])
            
            agent = create_openai_functions_agent(model, tools, prompt_template)
            agent_executor = AgentExecutor(agent=agent, tools=tools)
            
            response = await agent_executor.ainvoke({"input": prompt})
            return response["output"]
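
The snippet above only defines a coroutine; driving it from a script takes an event loop. A minimal sketch, assuming the hypothetical prompt below and an OPENAI_API_KEY in the environment:

import asyncio

if __name__ == "__main__":
    # Hypothetical prompt; any request the filesystem tools can satisfy works
    result = asyncio.run(run_with_mcp("List the files in this directory"))
    print(result)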

Example: File System Access

Here's a complete example that uses the MCP filesystem server to read and summarize a file:

import asyncio
import pathlib

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

from langchain_mcp import MCPToolkit
from langchain_openai import ChatOpenAI
from langchain.agents import create_openai_functions_agent, AgentExecutor
from langchain_core.prompts import ChatPromptTemplate

async def run(tools, prompt):
    model = ChatOpenAI(model="gpt-3.5-turbo")
    prompt_template = ChatPromptTemplate.from_messages([
        ("system", "You are a helpful assistant with access to tools."),
        ("human", "{input}")
    ])
    
    agent = create_openai_functions_agent(model, tools, prompt_template)
    agent_executor = AgentExecutor(agent=agent, tools=tools)
    
    response = await agent_executor.ainvoke({"input": prompt})
    return response["output"]

async def main():
    # Set up the MCP filesystem server
    server_params = StdioServerParameters(
        command="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", str(pathlib.Path(__file__).parent)]
    )
    
    # Create prompt
    prompt = "Read and summarize the file ./README.md"
    
    # Run with MCP
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            toolkit = MCPToolkit(session=session)
            await toolkit.initialize()
            response = await run(toolkit.get_tools(), prompt)
            print(response)

if __name__ == "__main__":
    asyncio.run(main())
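
Because the filesystem server is rooted at the script's own directory (str(pathlib.Path(__file__).parent)), the ./README.md being summarized must exist alongside the script, and OPENAI_API_KEY must be set in the environment.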

Notes

  • The library requires Python 3.10 or higher
  • You need to have the appropriate MCP servers installed for the tools you want to use
  • For production use, consider a more robust transport than the stdio-based approach shown in the examples; a sketch using the MCP SDK's SSE client follows this list
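
As an illustration of that last point, the MCP Python SDK also ships an SSE client for servers exposed over HTTP. The sketch below is an assumption built on that client, with a hypothetical endpoint URL; adapt it to wherever your server actually runs:

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client
from langchain_mcp import MCPToolkit

async def main():
    # Hypothetical endpoint; point this at a real SSE-based MCP server
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            toolkit = MCPToolkit(session=session)
            await toolkit.initialize()
            # Same toolkit API as in the stdio examples above
            print([tool.name for tool in toolkit.get_tools()])

if __name__ == "__main__":
    asyncio.run(main())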

Related MCPs

Apple Shortcuts
Developer Tools · JavaScript

Control Apple Shortcuts automations from AI assistants

Clojars Dependency Lookup
Developer Tools · JavaScript

Fetch dependency information from Clojars, the Clojure community's artifact repository

Simple Timeserver
Developer Tools · Python

Provides Claude with current time and timezone information

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
