
watsonx.ai Flows Engine MCP Server

Developer Tools · TypeScript
Access IBM watsonx.ai Flows Engine tools through MCP
Available Tools

google_books

Search for books and retrieve information about them

books · book

wikipedia

Search Wikipedia and retrieve page content

search · page

The watsonx.ai Flows Engine MCP server integrates IBM's AI tools ecosystem with the Model Context Protocol. It exposes tools such as Google Books and Wikipedia search directly to MCP clients, allowing AI assistants to retrieve real-time information and enrich their responses with external data. This integration bridges the gap between large language models and specialized data sources.

Overview

The watsonx.ai Flows Engine MCP server allows you to connect AI assistants to IBM's watsonx.ai Flows Engine tools. This integration enables AI models to access external data sources and services through a standardized protocol, enhancing their capabilities with real-time information retrieval.
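
As a client-side illustration of that standardized protocol, the sketch below uses the MCP TypeScript SDK (@modelcontextprotocol/sdk) to start the built server over stdio, list its tools, and call the wikipedia tool. The build path and the tool's argument shape are assumptions for illustration, not taken from the repository:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the built server as a child process and communicate over stdio.
// The path to build/index.js is a placeholder; point it at your checkout.
const transport = new StdioClientTransport({
  command: "node",
  args: ["wxflows/examples/mcp/javascript/build/index.js"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Discover the tools exposed by the deployed Flows Engine endpoint.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name)); // e.g. ["google_books", "wikipedia"]

// Call the wikipedia tool; the argument name "query" is a guess for illustration.
const result = await client.callTool({
  name: "wikipedia",
  arguments: { query: "IBM watsonx" },
});
console.log(result.content);

await client.close();

In practice an MCP client such as Claude Desktop performs these steps for you; the snippet only shows what happens under the hood.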

Prerequisites

Before setting up the watsonx.ai Flows Engine MCP server, you'll need:

  1. A free watsonx.ai Flows Engine account (sign up at https://ibm.biz/wxflows)
  2. Node.js installed on your system
  3. Git to clone the repository

Installation Steps

Step 1: Clone the Repository

git clone https://github.com/IBM/wxflows.git
cd wxflows/examples/mcp/javascript

Step 2: Set Up watsonx.ai Flows Engine

  1. Sign up for a free account at https://ibm.biz/wxflows
  2. Install the wxflows CLI (a Node.js package) following the instructions at https://wxflows.ibm.stepzen.com/docs/installation
  3. Authenticate your account using the CLI (a typical sequence is sketched after this list)
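
For reference, a typical install-and-login sequence looks roughly like the following; the exact command names and flags may differ, so treat this as a sketch and follow the linked installation guide:

# Install the wxflows CLI globally via npm (illustrative)
npm install -g wxflows
# Sign in with your Flows Engine account credentials
wxflows login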

Step 3: Deploy a Flows Engine Project

Navigate to the wxflows directory inside the example (you should already be in examples/mcp/javascript from Step 1):

cd wxflows

Deploy the pre-configured tools to a Flows Engine endpoint:

wxflows deploy

This will deploy the endpoint and tools that will be used by the wxflows SDK in your application.

Step 4: Configure Environment Variables

Back in the example's root directory (examples/mcp/javascript), create your environment file:

cp .env.sample .env

Edit the .env file to add your credentials, including your watsonx.ai Flows Engine API key and endpoint.
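
The variable names and values below are placeholders for illustration only; use the names that actually appear in .env.sample, your Flows Engine API key, and the endpoint URL of the project you deployed in Step 3:

WXFLOWS_APIKEY=<your Flows Engine API key>
WXFLOWS_ENDPOINT=<your deployed Flows Engine endpoint URL>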

Step 5: Install Dependencies

Install the necessary dependencies for the application:

npm i

Step 6: Build the MCP Server

Build the server by running:

npm run build

Step 7: Configure Your MCP Client

To use with Claude Desktop or other MCP clients, add the server configuration to your client's settings.
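
For Claude Desktop, this means adding an entry under mcpServers in claude_desktop_config.json. Below is a minimal sketch, assuming the build output from Step 6 lives at build/index.js; the server name "wxflows" and the absolute path are placeholders for your own setup:

{
  "mcpServers": {
    "wxflows": {
      "command": "node",
      "args": ["/absolute/path/to/wxflows/examples/mcp/javascript/build/index.js"]
    }
  }
}

If the server does not load the .env file on its own, Claude Desktop also accepts an env object inside each server entry, which can be used to pass the same API key and endpoint variables.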

Debugging

Since MCP servers communicate over stdio, debugging can be challenging. The repository includes the MCP Inspector for debugging purposes:

npm run inspector

This will provide a URL to access debugging tools in your browser.

Support

For questions or feedback, join the watsonx.ai Flows Engine Discord community at https://ibm.biz/wxflows-discord.

Related MCPs

Apple Shortcuts
Developer Tools · JavaScript

Control Apple Shortcuts automations from AI assistants

Clojars Dependency Lookup
Developer Tools · JavaScript

Fetch dependency information from Clojars, the Clojure community's artifact repository

Simple Timeserver
Developer Tools · Python

Provides Claude with current time and timezone information

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
