Get the latest stable version of a package from a specified registry
Get the latest stable version of an npm package
Get the latest stable version of a Python package from PyPI
Get the latest stable version of a Java package from Maven Central
Get the latest stable version of a Go package
Get the latest stable version of a Swift package
Get the latest stable version of a Docker image from Docker Hub
Get the latest stable version of a container from GitHub Container Registry
Get the latest stable version of a GitHub Action
Get the latest stable version of an AWS Bedrock AI model
Package Version Checker is a specialized tool that helps LLMs recommend up-to-date package versions when writing code. It connects to multiple package registries, including npm, PyPI, Maven Central, Go Proxy, Swift Packages, AWS Bedrock, Docker Hub, GitHub Container Registry, and GitHub Actions, to retrieve the latest stable versions of packages. This addresses a common problem: LLMs often recommend outdated package versions in their code suggestions. By providing real-time access to current package information, it helps ensure that generated code uses compatible and secure dependencies.
Package Version Checker is an MCP server that provides real-time access to the latest stable package versions from multiple package registries. This helps ensure that when an LLM generates code, it uses current and secure dependencies rather than outdated versions.
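As a rough illustration of how an MCP client consumes these tools, a version lookup is an ordinary JSON-RPC tools/call request sent over the server's transport. The tool name and argument shape below are assumptions for illustration only; the server's actual tool schema may differ:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "check_npm_versions",
    "arguments": {
      "dependencies": { "express": "^4.18.0" }
    }
  }
}
```

The server replies with the latest stable version(s) it found in the corresponding registry, which the LLM can then use when generating dependency manifests.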
The easiest way to run the Package Version Checker is using Docker:
docker run -p 8080:8080 ghcr.io/sammcj/mcp-package-version:latest
You can also use the provided docker-compose file:
# Clone the repository
git clone https://github.com/sammcj/mcp-package-version.git
cd mcp-package-version
# Start the server
docker-compose up -d
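The docker-compose.yml in the repository is the authoritative definition; a minimal equivalent, assuming only the image and port mapping shown above, would look roughly like this:

```yaml
# Minimal sketch of a compose service for the server (illustrative only)
services:
  mcp-package-version:
    image: ghcr.io/sammcj/mcp-package-version:latest
    ports:
      - "8080:8080"   # expose the MCP server on localhost:8080
    restart: unless-stopped
```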
If you prefer to build from source, you'll need Go installed:
Clone the repository:
git clone https://github.com/sammcj/mcp-package-version.git
cd mcp-package-version
Build and run:
go build
./mcp-package-version
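If the main package lives at the module root (which the go build invocation above suggests), installing the binary directly with go install may also work; treat this as an assumption rather than a documented install path:

```sh
# Assumes the repository root is the main package; installs to $GOPATH/bin (or $GOBIN)
go install github.com/sammcj/mcp-package-version@latest
```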
The server runs on port 8080 by default. You can configure it using environment variables:
PORT: The port to run the server on (default: 8080)
LOG_LEVEL: Set the logging level (default: info)
CACHE_TTL: Time to live for the cache in seconds (default: 3600)
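For example, the variables can be set on the command line for a locally built binary or passed to Docker with -e; the values below are purely illustrative:

```sh
# Run the locally built binary on port 9090 with debug logging and a 10-minute cache
PORT=9090 LOG_LEVEL=debug CACHE_TTL=600 ./mcp-package-version

# The same settings passed to the Docker image
docker run -p 9090:9090 \
  -e PORT=9090 -e LOG_LEVEL=debug -e CACHE_TTL=600 \
  ghcr.io/sammcj/mcp-package-version:latest
```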
To use this MCP server with your LLM tools, add it to your configuration.

For Claude/Anthropic:
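The exact configuration format depends on the client; for clients that register HTTP/SSE MCP servers in a JSON settings file, an entry along these lines is typical. The server name key and URL field below are assumptions, so check your client's MCP documentation for the precise schema:

```json
{
  "mcpServers": {
    "package-version": {
      "url": "http://localhost:8080"
    }
  }
}
```

Adjust the host and port to match your deployment.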
For other LLM tools that support the Model Context Protocol, point them at http://localhost:8080 (or your custom host and port).

If you encounter issues, first confirm that the server is running and reachable on the configured host and port, then check the server logs for errors.