Massive Context Mcp

Handle 10M+ token contexts for LLMs with local or cloud inference.

About Massive Context Mcp

Massive Context MCP is an open-source Model Context Protocol (MCP) server that lets AI assistants such as Claude, Cursor, and Windsurf work efficiently with extremely large contexts (10 million+ tokens). It achieves this through advanced chunking and recursive sub-querying, running inference on local models via Ollama or on cloud models when needed. The server exposes a suite of tools for loading, inspecting, strategically chunking, and filtering context, analyzing chunks in parallel, and executing code programmatically against the loaded context. This makes it valuable for developers, researchers, data analysts, and anyone who needs deep contextual analysis of massive datasets or codebases, particularly in macOS and Python environments.
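The general chunk-and-sub-query pattern described above can be sketched in Python. This is a conceptual illustration only: the function names (`chunk_text`, `sub_query`, `query_large_context`) are hypothetical, and the line-matching stand-in below takes the place of a real call to a local (Ollama) or cloud model.

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_text(text, chunk_size=1000, overlap=100):
    """Split a large context into overlapping chunks (conceptual sketch)."""
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

def sub_query(chunk, query):
    # Stand-in for a model call: in a real server this would send the
    # chunk plus the query to a local or cloud LLM and return its answer.
    return [line for line in chunk.splitlines() if query in line]

def query_large_context(text, query, chunk_size=1000, overlap=100):
    """Chunk the context, query chunks in parallel, and merge the results."""
    chunks = chunk_text(text, chunk_size, overlap)
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(lambda c: sub_query(c, query), chunks))
    # Merge partial answers, de-duplicating hits repeated by the overlap.
    merged = []
    for part in partials:
        for item in part:
            if item not in merged:
                merged.append(item)
    return merged
```

The overlap between chunks reduces the chance that relevant content is lost at a chunk boundary; the de-duplication step then removes the repeated hits the overlap creates.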

Pricing
Open Source

Resources

Product Website

Visit Massive Context Mcp's official website for product details and getting started.
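MCP servers of this kind are typically registered in the client's configuration file (for example, Claude Desktop's `claude_desktop_config.json`). The entry below is a hypothetical sketch: the server name, launch command, and module name are placeholder assumptions, so consult the official website for the actual setup instructions.

```json
{
  "mcpServers": {
    "massive-context": {
      "command": "python",
      "args": ["-m", "massive_context_mcp"]
    }
  }
}
```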