
Wanaku Router MCP Server

Developer Tools · Java
A router for AI-enabled applications powered by the Model Context Protocol (MCP)
Available Tools

router

Routes MCP requests to appropriate LLM providers based on configuration

generate

CLI command to generate connector projects and other components

Wanaku is a router that standardizes how applications provide context to Large Language Models (LLMs) using the Model Context Protocol (MCP). It serves as a central hub for connecting AI-enabled applications, allowing them to communicate effectively with various LLM services. Its name comes from "wanaku," the Quechua word from which "guanaco" derives, a camelid native to South America. Wanaku provides a robust infrastructure for building and deploying AI-powered applications.

Getting Started with Wanaku

Wanaku is a Model Context Protocol (MCP) router that enables seamless communication between AI applications and Large Language Models. This guide will help you set up and use Wanaku effectively.

Installation

There are several ways to install and run Wanaku:

Using Docker Compose

The simplest way to get started is using Docker Compose:

  1. Clone the repository:

    git clone https://github.com/wanaku-ai/wanaku.git
    cd wanaku
    
  2. Start Wanaku using Docker Compose:

    docker-compose up
    

This will start the Wanaku router and its dependencies.

Using JBang

Wanaku also supports JBang for quick execution:

  1. Install JBang if you haven't already
  2. Run Wanaku using:
    jbang wanaku@wanaku-ai
    

Building from Source

To build Wanaku from source:

  1. Clone the repository
  2. Build using Maven:
    mvn clean install
    
  3. Run the router:
    java -jar wanaku-router/target/wanaku-router-*.jar
    

Configuration

Wanaku can be configured through various methods:

  1. Environment Variables: Set configuration options using environment variables
  2. Configuration Files: Create a wanaku.properties file in your working directory
  3. Command Line Arguments: Pass options directly when starting Wanaku

Key configuration options include:

  • Server port (default: 8080)
  • Authentication settings
  • Model provider connections
  • Logging levels
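
As a sketch, a wanaku.properties file covering the options above might look like the following. The key names here are illustrative assumptions, not confirmed Wanaku keys; consult the docs directory in the repository for the actual configuration reference:

```properties
# Illustrative example only -- key names are placeholders, not verified Wanaku keys.

# Server port (default: 8080)
server.port=8080

# Logging level
logging.level=INFO
```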

Using Wanaku

Once Wanaku is running, you can:

  1. Connect AI Applications: Configure your applications to connect to Wanaku using the MCP protocol
  2. Route Requests: Wanaku routes incoming MCP requests to the appropriate LLM providers based on your configuration
  3. Monitor Activity: Use the built-in UI to monitor traffic and performance
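
Connecting an application means speaking JSON-RPC as defined by the MCP specification. As a reference point, the initialize request an MCP client sends when it first connects has this general shape (the clientInfo values below are placeholders):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {
      "name": "example-client",
      "version": "0.1.0"
    }
  }
}
```

How your application emits this message depends on its MCP client library; Wanaku handles the routing once the session is established.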

Creating Custom Connectors

Wanaku supports extending its functionality through custom connectors:

  1. Use the provided archetypes to generate a connector project
  2. Implement the connector interface
  3. Build and deploy your connector
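
To make step 2 concrete, here is a minimal sketch in Java of what implementing a connector might look like. The ToolInvoker interface and its method signature are hypothetical placeholders, not Wanaku's actual SPI; the archetype-generated project contains the real interface to implement:

```java
import java.util.Map;

// Hypothetical stand-in for the connector interface the archetype provides.
// Wanaku's real SPI will differ -- this only illustrates the overall shape.
interface ToolInvoker {
    String invoke(String toolName, Map<String, String> arguments);
}

// A trivial connector that echoes its input back to the router.
class EchoConnector implements ToolInvoker {
    @Override
    public String invoke(String toolName, Map<String, String> arguments) {
        return "echo:" + toolName + ":" + arguments.getOrDefault("message", "");
    }
}

public class ConnectorDemo {
    public static void main(String[] args) {
        ToolInvoker connector = new EchoConnector();
        System.out.println(connector.invoke("echo", Map.of("message", "hello")));
    }
}
```

Once the real interface is implemented, build the project and deploy the resulting artifact alongside the router as described in the repository docs.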

Troubleshooting

If you encounter issues:

  1. Check the logs for detailed error messages
  2. Verify your configuration settings
  3. Ensure your network allows connections to the required services
  4. Visit the project's GitHub repository for known issues and solutions

For more detailed documentation, visit the official documentation in the repository's docs directory.


