
Lark Bitable MCP Server

Databases · Python
Access and query Lark Bitable tables through Model Context Protocol
Available Tools

list_table

List the tables in the current Bitable. Takes no parameters.

describe_table

Describe a table by its name, returning a list of the columns in that table.

Parameters:

  1. name: The name of the table to describe

read_query

Execute a SQL query to read data from the tables.

Parameters:

  1. sql: The SQL query to execute
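The three tools above are invoked through standard MCP `tools/call` requests. A minimal sketch of the JSON-RPC payloads a client would send for each tool (the table name "Tasks" and the SQL text are hypothetical placeholders):

```python
import json

# Example MCP `tools/call` payloads for each tool listed above.
# "Tasks" and the SQL query are hypothetical; substitute your own table.
calls = [
    {"name": "list_table", "arguments": {}},
    {"name": "describe_table", "arguments": {"name": "Tasks"}},
    {"name": "read_query", "arguments": {"sql": "SELECT * FROM Tasks LIMIT 5"}},
]

for i, call in enumerate(calls, start=1):
    request = {
        "jsonrpc": "2.0",
        "id": i,
        "method": "tools/call",
        "params": call,
    }
    print(json.dumps(request))
```

Your MCP client builds and sends these requests for you; the sketch only shows which arguments each tool expects.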

The Lark Bitable MCP server provides a seamless interface to interact with Lark Bitable databases through the Model Context Protocol. It enables users to list tables, describe table structures, and execute SQL queries against Bitable data. With this integration, AI assistants can directly access and manipulate data stored in Lark Bitable, making it possible to analyze information, generate reports, and perform data operations without leaving your conversation context.

Overview

The Lark Bitable MCP server allows AI assistants to interact with Lark Bitable databases. This integration enables you to query and analyze data stored in Bitable tables directly through your AI assistant.

Installation

One-Click Installation

The simplest way to install and configure the server is using the one-click installation command:

PERSONAL_BASE_TOKEN=your_personal_base_token APP_TOKEN=your_app_token uv run --with uv --with bitable-mcp bitable-mcp-install

Replace your_personal_base_token and your_app_token with your actual Lark Bitable tokens.

Manual Installation

Prerequisites

Make sure you have uv installed before proceeding; the uvx command ships with it.

For Claude

Add the following to your Claude settings:

Using uvx:

"mcpServers": {
  "bitable-mcp": {
    "command": "uvx",
    "args": ["bitable-mcp"],
    "env": {
        "PERSONAL_BASE_TOKEN": "your-personal-base-token",
        "APP_TOKEN": "your-app-token"
    }
  }
}

Using pip installation:

  1. First install the package:
pip install bitable-mcp
  2. Then add to your Claude settings:
"mcpServers": {
  "bitable-mcp": {
    "command": "python",
    "args": ["-m", "bitable_mcp"],
    "env": {
        "PERSONAL_BASE_TOKEN": "your-personal-base-token",
        "APP_TOKEN": "your-app-token"
    }
  }
}
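To avoid hand-editing mistakes in the JSON above, the settings entry can be generated programmatically. A minimal illustrative helper (not part of bitable-mcp) that emits either the uvx or the pip-based entry:

```python
import json

def bitable_server_entry(personal_base_token: str, app_token: str,
                         use_uvx: bool = True) -> dict:
    """Build the mcpServers entry for the Bitable MCP server."""
    if use_uvx:
        command, args = "uvx", ["bitable-mcp"]
    else:
        command, args = "python", ["-m", "bitable_mcp"]
    return {
        "bitable-mcp": {
            "command": command,
            "args": args,
            "env": {
                "PERSONAL_BASE_TOKEN": personal_base_token,
                "APP_TOKEN": app_token,
            },
        }
    }

# Print the full snippet, ready to merge into your Claude settings file.
config = {"mcpServers": bitable_server_entry("your-personal-base-token",
                                             "your-app-token")}
print(json.dumps(config, indent=2))
```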

For Zed

Add to your Zed settings.json:

Using uvx:

"context_servers": [
  "bitable-mcp": {
    "command": "uvx",
    "args": ["bitable-mcp"],
    "env": {
        "PERSONAL_BASE_TOKEN": "your-personal-base-token",
        "APP_TOKEN": "your-app-token"
    }
  }
],

Using pip installation:

"context_servers": {
  "bitable-mcp": {
    "command": "python",
    "args": ["-m", "bitable_mcp"],
    "env": {
        "PERSONAL_BASE_TOKEN": "your-personal-base-token",
        "APP_TOKEN": "your-app-token"
    }
  }
},

Debugging

You can use the MCP inspector to debug the server:

npx @modelcontextprotocol/inspector uvx bitable-mcp

Authentication

You'll need two tokens to use this MCP server:

  1. PERSONAL_BASE_TOKEN: Your personal base token for Lark Bitable
  2. APP_TOKEN: Your application token for Lark Bitable

These tokens should be provided in the environment variables when configuring the MCP server.
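A missing token is the most common cause of startup failures. An illustrative pre-flight check (not part of bitable-mcp) that reports which required variables are unset before you launch the server:

```python
import os

# Names of the environment variables the server requires.
REQUIRED_TOKENS = ("PERSONAL_BASE_TOKEN", "APP_TOKEN")

def missing_tokens(env=None):
    """Return the names of required tokens that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_TOKENS if not env.get(name)]

# Example: only one of the two tokens is set.
missing = missing_tokens({"PERSONAL_BASE_TOKEN": "pb-token"})
print(missing)  # ['APP_TOKEN']
```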

Related MCPs

Milvus Vector Database
Databases · Python

Connect to Milvus vector database for semantic search and vector operations

MotherDuck DuckDB
Databases · Python

SQL analytics with DuckDB and MotherDuck for AI assistants

Alibaba Cloud Tablestore
Databases · Java, Python

Connect to Alibaba Cloud Tablestore for vector search and RAG applications

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
