
Trino SQL Query Engine MCP Server

Databases · Python
Query and explore Trino databases using SQL
Available Tools

list_tables

Lists all available tables in the configured Trino catalog and schema

read_table

Reads and returns the contents of a specified Trino table

execute_sql

Executes arbitrary SQL queries against the Trino database

The Trino MCP Server provides a seamless interface for AI models to interact with Trino, a fast distributed SQL query engine for big data analytics. This integration allows you to list available tables, explore their contents, and execute arbitrary SQL queries against your Trino databases, making your data accessible for AI-powered analysis and insights.

Introduction

The Trino MCP Server enables AI models to interact with Trino databases through the Model Context Protocol (MCP). This server provides capabilities to list available tables, read table contents, and execute SQL queries against your Trino instance.
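At a high level, each tool corresponds to a simple operation against Trino. The sketch below is purely illustrative and assumes the standard trino Python client; the actual server code may be organized differently.

import trino

# Illustrative sketch only: how each MCP tool might translate to Trino SQL.
# Connection details here are placeholders; the real server reads them from
# the environment variables described under Configuration below.
conn = trino.dbapi.connect(
    host="localhost", port=8080, user="analyst",
    catalog="tpch", schema="tiny",
)

def list_tables():
    # Roughly what the list_tables tool returns: tables in the configured schema.
    cur = conn.cursor()
    cur.execute("SHOW TABLES")
    return [row[0] for row in cur.fetchall()]

def read_table(table, limit=50):
    # Roughly what the read_table tool returns: a sample of rows from one table.
    cur = conn.cursor()
    cur.execute(f"SELECT * FROM {table} LIMIT {limit}")
    return cur.fetchall()

def execute_sql(sql):
    # Roughly what the execute_sql tool does: run an arbitrary query.
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchall()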

Installation

To use the Trino MCP Server, you'll need to have Python 3.9 or later installed on your system. The server requires the trino Python driver and the mcp library.

  1. Clone the repository:
git clone https://github.com/Dataring-engineering/mcp-server-trino.git
cd mcp-server-trino
  2. Install the required dependencies:
pip install -r requirements.txt

Configuration

The Trino MCP Server requires configuration to connect to your Trino instance. You'll need to set the following environment variables:

  • TRINO_HOST: Hostname or IP address of your Trino server (defaults to localhost)
  • TRINO_PORT: Port number for your Trino server (defaults to 8080)
  • TRINO_USER: Username for authentication (required)
  • TRINO_PASSWORD: Password for authentication (optional, depends on your setup)
  • TRINO_CATALOG: Default catalog to use (required, e.g., hive, tpch, postgresql)
  • TRINO_SCHEMA: Default schema to use (required, e.g., default, public)
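As a rough sketch of how these variables might be consumed (assuming the standard trino Python client; the exact keyword arguments used by the server are an assumption):

import os
import trino
from trino.auth import BasicAuthentication

# Sketch: build a Trino connection from the environment variables above.
host = os.environ.get("TRINO_HOST", "localhost")
port = int(os.environ.get("TRINO_PORT", "8080"))
user = os.environ["TRINO_USER"]                # required
password = os.environ.get("TRINO_PASSWORD")    # optional
catalog = os.environ["TRINO_CATALOG"]          # required, e.g. hive
schema = os.environ["TRINO_SCHEMA"]            # required, e.g. default

conn_kwargs = dict(host=host, port=port, user=user, catalog=catalog, schema=schema)
if password:
    # Password authentication in Trino requires HTTPS.
    conn_kwargs.update(auth=BasicAuthentication(user, password), http_scheme="https")

conn = trino.dbapi.connect(**conn_kwargs)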

Usage

To use the Trino MCP Server with an AI assistant, you'll need to add it to your configuration. The server provides access to Trino tables as resources and offers tools for executing SQL queries.
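Many MCP clients are configured through a JSON file with an mcpServers map (Claude Desktop, for example). The snippet below sketches such an entry; the command, script path, and environment values are placeholders that depend on your client and on where you cloned the repository.

import json

# Hypothetical client configuration entry for this server; adjust the
# command, path, and environment values to your own setup.
config_entry = {
    "mcpServers": {
        "trino": {
            "command": "python",                              # or your venv's python
            "args": ["/path/to/mcp-server-trino/server.py"],  # placeholder path
            "env": {
                "TRINO_HOST": "trino.example.com",
                "TRINO_PORT": "8080",
                "TRINO_USER": "analyst",
                "TRINO_CATALOG": "hive",
                "TRINO_SCHEMA": "default",
            },
        }
    }
}

print(json.dumps(config_entry, indent=2))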

When interacting with the AI assistant, you can:

  1. Ask for a list of available tables in your configured catalog and schema
  2. Request information about specific tables, including their structure and sample data
  3. Execute SQL queries to analyze your data

The AI assistant will use the Trino MCP Server to fetch the requested information and present it to you in a readable format.

Examples

Here are some example prompts you can use with an AI assistant that has the Trino MCP Server configured:

  • "Show me all available tables in my Trino database"
  • "What columns are in the customers table?"
  • "Run a query to find the top 5 products by sales"
  • "Show me a sample of data from the orders table"

The AI assistant will execute these requests through the Trino MCP Server and return the results.
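For instance, a prompt like "find the top 5 products by sales" might be translated into a query along these lines; the orders table and its columns are hypothetical placeholders for whatever exists in your schema.

# Hypothetical query an assistant might issue via the execute_sql tool,
# reusing the connection from the earlier sketch.
sql = """
SELECT product_id,
       SUM(quantity * unit_price) AS total_sales
FROM orders
GROUP BY product_id
ORDER BY total_sales DESC
LIMIT 5
"""

cur = conn.cursor()
cur.execute(sql)
for product_id, total_sales in cur.fetchall():
    print(product_id, total_sales)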

Related MCPs

Milvus Vector Database
Databases · Python

Connect to Milvus vector database for semantic search and vector operations

MotherDuck DuckDB
Databases · Python

SQL analytics with DuckDB and MotherDuck for AI assistants

Alibaba Cloud Tablestore
Databases · Java, Python

Connect to Alibaba Cloud Tablestore for vector search and RAG applications

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
