Searches the web for up-to-date information using DuckDuckGo and returns relevant context from top results
Local RAG Web Search provides a lightweight, privacy-focused solution for retrieving up-to-date information from the web without relying on external APIs. It uses DuckDuckGo to search the web, extracts relevant content from search results, and ranks them using Google's MediaPipe Text Embedder to provide the most relevant context to your LLM. This tool enables your AI assistant to access current information beyond its training cutoff, enhancing responses with fresh knowledge while maintaining privacy by running entirely on your local machine. The implementation is intentionally "primitive" but effective, making it accessible and easy to deploy.
Local RAG Web Search is a Model Context Protocol (MCP) server that enables your AI assistant to search the web for up-to-date information. It implements a Retrieval-Augmented Generation (RAG) approach that:

1. Searches the web through DuckDuckGo
2. Extracts relevant content from the top search results
3. Ranks the extracted content with Google's MediaPipe Text Embedder
4. Returns the most relevant context to your LLM

All of this happens locally on your machine, without requiring any external API keys.
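The ranking step can be sketched roughly as follows. This is a simplified illustration, not the project's actual code: the `embed` function below is a toy bag-of-words stand-in for MediaPipe's Text Embedder, and the chunk texts are placeholders for content extracted from search results.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a stand-in for a real text
    # embedder such as MediaPipe's Text Embedder.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_chunks(query: str, chunks: list[str], top_k: int = 3) -> list[str]:
    # Score every extracted chunk against the query and keep the best ones.
    q = embed(query)
    scored = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return scored[:top_k]

results = rank_chunks(
    "latest python release",
    [
        "Python 3.13 was released with a new interactive interpreter.",
        "A recipe for sourdough bread.",
        "Release notes for the latest Python version.",
    ],
    top_k=2,
)
```

A real embedder captures semantic similarity rather than word overlap, but the rank-and-truncate flow is the same: only the highest-scoring chunks are returned as context to the LLM.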
You can install Local RAG Web Search using one of the following methods:
Docker is the simplest approach if you have it installed. Add the following to your MCP configuration:
{
  "mcpServers": {
    "mcp-local-rag": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "--init",
        "-e",
        "DOCKER_CONTAINER=true",
        "ghcr.io/nkapila6/mcp-local-rag:latest"
      ]
    }
  }
}
This approach requires uv to be installed on your system.
Add the following to your MCP configuration:
{
  "mcpServers": {
    "mcp-local-rag": {
      "command": "uvx",
      "args": [
        "--python=3.10",
        "--from",
        "git+https://github.com/nkapila6/mcp-local-rag",
        "mcp-local-rag"
      ]
    }
  }
}
Alternatively, clone the repository and run the server from source:

git clone https://github.com/nkapila6/mcp-local-rag

Then add the following to your MCP configuration:
{
  "mcpServers": {
    "mcp-local-rag": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-local-rag/",
        "run",
        "src/mcp_local_rag/main.py"
      ]
    }
  }
}
Replace /path/to/mcp-local-rag/ with the actual path where you cloned the repository.
Once installed, your AI assistant will automatically use Local RAG Web Search when it needs to retrieve current information from the web.
When you ask a question that requires up-to-date information (beyond the model's training data), the assistant will recognize the need to search the web and will use this tool to fetch relevant information.
You don't need to invoke the tool explicitly; the AI assistant determines when a web search is necessary and uses the tool automatically.
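Under the hood, when the assistant decides to search, its MCP client sends a standard JSON-RPC tools/call request to the server. A request might look roughly like this (the tool name and argument names here are illustrative, not taken from the project):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "rag_search",
    "arguments": {
      "query": "latest developments in battery technology"
    }
  }
}
```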
The location of your MCP configuration file depends on your operating system:

Windows: %APPDATA%\mcp\config.json
macOS: ~/Library/Application Support/mcp/config.json
Linux: ~/.config/mcp/config.json
For more information about MCP configuration, visit https://modelcontextprotocol.io/quickstart/user.
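If you need to locate the configuration file programmatically, the per-OS paths above can be resolved with a small helper. This is a hypothetical convenience sketch, not part of the project:

```python
import os
import sys
from pathlib import Path

def mcp_config_path(platform: str = sys.platform) -> Path:
    # Hypothetical helper: maps a platform identifier to the
    # MCP config locations listed above.
    if platform.startswith("win"):
        return Path(os.environ.get("APPDATA", "")) / "mcp" / "config.json"
    if platform == "darwin":
        return Path.home() / "Library" / "Application Support" / "mcp" / "config.json"
    # Assume Linux or another Unix-like system.
    return Path.home() / ".config" / "mcp" / "config.json"
```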