plugged.in MCP Proxy Server
A unified interface for managing all your MCP servers, with a built-in playground for testing on any AI model.
Overview
The plugged.in MCP Proxy Server is a powerful middleware that aggregates multiple Model Context Protocol (MCP) servers into a single unified interface. It fetches tool, prompt, and resource configurations from the plugged.in App and intelligently routes requests to the appropriate underlying MCP servers.
This proxy enables seamless integration with any MCP client (Claude, Cline, Cursor, etc.) while providing advanced management capabilities through the plugged.in ecosystem.
If you find this project useful, please consider giving it a star on GitHub! It helps us reach more developers and motivates us to keep improving.
Key Features
Core Capabilities
Built-in AI Playground: Test your MCPs instantly with Claude, Gemini, OpenAI, and xAI without any client setup
Universal MCP Compatibility: Works with any MCP client including Claude Desktop, Cline, and Cursor
Multi-Server Support: Connect to STDIO, SSE, and Streamable HTTP MCP servers
Dual Transport Modes: Run proxy as STDIO (default) or Streamable HTTP server
Unified Document Search: Search across all connected servers with built-in RAG capabilities
AI Document Exchange (RAG v2): MCP servers can create and manage documents in your library with full attribution
Notifications from Any Model: Receive real-time notifications with optional email delivery
Multi-Workspace Layer: Switch between different sets of MCP configurations with one click
API-Driven Proxy: Fetches capabilities from plugged.in App APIs rather than direct discovery
Full MCP Support: Handles tools, resources, resource templates, and prompts
Custom Instructions: Supports server-specific instructions formatted as MCP prompts
New in v1.5.0 (RAG v2 - AI Document Exchange)
AI Document Creation: MCP servers can now create documents directly in your library
  - Full model attribution tracking (which AI created or updated the document)
  - Version history with change tracking
  - Content deduplication via SHA-256 hashing
  - Support for multiple formats: MD, TXT, JSON, HTML, PDF, and more
Advanced Document Search: Enhanced RAG queries with AI filtering
  - Filter by AI model, provider, date range, tags, and source type
  - Semantic search with relevance scoring
  - Automatic snippet generation with keyword highlighting
  - Support for filtering by source: ai_generated, upload, or api
Document Management via MCP:
  - Set document visibility: private, workspace, or public
Enhanced Notification System: Bidirectional notification support
  - Send notifications to the plugged.in App
  - Receive notifications from MCP servers
  - Mark notifications as read or unread
  - Delete notifications programmatically
Trending Analytics: Real-time activity tracking
  - Every tool call is logged and tracked
  - Contributes to trending server calculations
  - Usage metrics and popularity insights
Registry Integration: Full support for Registry v2 features
  - Automatic server discovery from the registry
  - Installation tracking and metrics
  - Community server support
Features from v1.1.0
Streamable HTTP Support: Full support for downstream MCP servers using Streamable HTTP transport
HTTP Server Mode: Run the proxy as an HTTP server with configurable ports
Flexible Authentication: Optional Bearer token authentication for HTTP endpoints (see the sketch after this list)
Session Management: Choose between stateful (session-based) or stateless operation modes
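For example, HTTP mode and authentication can be combined at launch. The sketch below reuses the locally built Docker image and the flags shown later in the Docker Usage and Authentication sections; adapt it to however you actually start the proxy:
# Run the proxy as a Streamable HTTP server on port 12006 and require a
# Bearer token on its HTTP endpoints (--require-api-auth)
docker run -d --rm \
  -e PLUGGEDIN_API_KEY="YOUR_API_KEY" \
  -p 12006:12006 \
  pluggedin-mcp-proxy:latest \
  --transport streamable-http --port 12006 --require-api-auth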
Core Features from v1.0.0
Real-Time Notifications: Track all MCP activities with comprehensive notification support
RAG Integration: Support for document-enhanced queries through the plugged.in App
Inspector Scripts: Automated testing tools for debugging and development
Health Monitoring: Built-in ping endpoint for connection monitoring
Tool Categories
The proxy provides two distinct categories of tools:
Static Built-in Tools (Always Available)
These tools are built into the proxy and work without any server configuration:
pluggedin_discover_tools - Smart discovery with caching for instant results
pluggedin_rag_query - RAG v2 search across your documents with AI filtering capabilities
pluggedin_send_notification - Send notifications with optional email delivery (see the example after this list)
pluggedin_create_document - (Coming Soon) Create AI-generated documents in your library
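For illustration, a notification call might look like the sketch below. The argument names are illustrative only, not the tool's confirmed schema; check the input schema reported by tools/list for the exact fields:
# Send a notification to the plugged.in App, optionally delivered by email
# (argument names are illustrative)
pluggedin_send_notification({
  "message": "Nightly data sync completed",
  "severity": "INFO",
  "sendEmail": true
})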
Dynamic MCP Tools (From Connected Servers)
These tools come from your configured MCP servers and can be turned on/off:
Database tools (PostgreSQL, SQLite, etc.)
File system tools
API integration tools
Custom tools from any MCP server
The discovery tool intelligently shows both categories, giving AI models immediate access to all available capabilities.
Discovery Tool Usage
# Default discovery - returns cached results instantly
pluggedin_discover_tools()
# Force refresh - shows current tools + runs background discovery
pluggedin_discover_tools({"force_refresh": true})
# Discover specific server
pluggedin_discover_tools({"server_uuid": "uuid-here"})
Example Response:
## Static Built-in Tools (4) - Always Available:
1. **pluggedin_discover_tools** - Smart discovery with caching
2. **pluggedin_rag_query** - RAG v2 search across documents with AI filtering
3. **pluggedin_send_notification** - Send notifications
4. **pluggedin_create_document** - (Coming Soon) Create AI-generated documents
## Dynamic MCP Tools (8) - From Connected Servers:
1. **query** - Run read-only SQL queries
2. **generate_random_integer** - Generate secure random integers
...
RAG v2 Usage Examples
The enhanced RAG v2 system allows MCP servers to create and search documents with full AI attribution:
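For example, in the same call style used in the Discovery Tool Usage section above (the filter field names here are illustrative rather than the tool's confirmed schema; check the input schema reported by tools/list for the exact names):
# Plain semantic search across your document library
pluggedin_rag_query({"query": "deployment checklist"})
# Filtered search using the RAG v2 attributes described above
# (field names are illustrative)
pluggedin_rag_query({
  "query": "deployment checklist",
  "source": "ai_generated",
  "provider": "anthropic",
  "tags": ["infrastructure"],
  "date_from": "2025-01-01"
})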
Session Management
In stateful mode (default), use the mcp-session-id header to maintain sessions:
curl -X POST http://localhost:12006/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","method":"tools/list","id":1}'
# In stateful mode, the server returns the session ID in the
# mcp-session-id response header
# Subsequent requests use the same session
curl -X POST http://localhost:12006/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "mcp-session-id: YOUR_SESSION_ID" \
  -d '{"jsonrpc":"2.0","method":"tools/call","params":{"name":"tool_name"},"id":2}'
Authentication
When using --require-api-auth, include your API key as a Bearer token:
curl -X POST http://localhost:12006/mcp \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","method":"ping","id":1}'
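If the key is accepted, the proxy should answer with an ordinary JSON-RPC result; per the MCP specification, ping returns an empty result object, so the response should look roughly like:
{"jsonrpc":"2.0","id":1,"result":{}}
A missing or invalid key causes the request to be rejected instead.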
Docker Usage
You can also build and run the proxy server using Docker.
Building the Image
Ensure you have Docker installed and running. Navigate to the pluggedin-mcp directory and run:
docker build -t pluggedin-mcp-proxy:latest .
A .dockerignore file is included to optimize the build context.
Running the Container
STDIO Mode (Default)
Run the container in STDIO mode for MCP Inspector testing:
docker run -it --rm \
  -e PLUGGEDIN_API_KEY="YOUR_API_KEY" \
  -e PLUGGEDIN_API_BASE_URL="YOUR_API_BASE_URL" \
  --name pluggedin-mcp-container \
  pluggedin-mcp-proxy:latest
Streamable HTTP Mode
Run the container as an HTTP server:
docker run -d --rm \
  -e PLUGGEDIN_API_KEY="YOUR_API_KEY" \
  -e PLUGGEDIN_API_BASE_URL="YOUR_API_BASE_URL" \
  -p 12006:12006 \
  --name pluggedin-mcp-http \
  pluggedin-mcp-proxy:latest \
  --transport streamable-http --port 12006
Replace YOUR_API_KEY and YOUR_API_BASE_URL (if not using the default https://plugged.in).
Testing with MCP Inspector
While the container is running, you can connect to it using the MCP Inspector:
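One approach (a sketch; the image name and environment variables should match your build) is to launch the Inspector and configure the connection in its UI:
# Launch the MCP Inspector UI
npx @modelcontextprotocol/inspector
For the STDIO image, configure the Inspector to start the server with the command docker run -i --rm -e PLUGGEDIN_API_KEY="YOUR_API_KEY" -e PLUGGEDIN_API_BASE_URL="YOUR_API_BASE_URL" pluggedin-mcp-proxy:latest. For a container running in Streamable HTTP mode, select the Streamable HTTP transport instead and point the Inspector at http://localhost:12006/mcp.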