
Lambda Capture


MCP implementation of the Semantic Search API for Macroeconomic Data.


What is Lambda Capture?

Macroeconomic Forecasts & Semantic Context from Federal Reserve, Bank of England, ECB.

Documentation

Lambda Capture MCP Server

Remote MCP Server (streamable HTTP)

Check the server status page for current availability.

OpenAI Responses API

from openai import OpenAI
client = OpenAI()
resp = client.responses.create(
    model="gpt-4.1",
    input="Key shifts in inflation expectations",
    tools=[
        {
            "type": "mcp",
            "server_label": "lambda-capture",
            "server_url": "https://mcp.lambda-capture.com/v1/mcp/",
            "headers": {
                "Authorization": "Bearer YOUR_ACCESS_TOKEN"
            }
        }
    ]
)
print(resp.output_text)

Curl

curl -X POST "https://mcp.lambda-capture.com/v1/mcp/" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
  -d '{ "jsonrpc": "2.0", "method": "tools/call", "id": 1, "params": { "name": "macroecon_semantic_search", "arguments": { "query_text": "inflation expectations", "max_results": 3 } } }'
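The same JSON-RPC call can be made from Python with only the standard library. This is a sketch mirroring the curl request above; YOUR_ACCESS_TOKEN is a placeholder for your real key.

```python
import json
import urllib.request

# JSON-RPC 2.0 payload, identical to the body of the curl example above.
payload = {
    "jsonrpc": "2.0",
    "method": "tools/call",
    "id": 1,
    "params": {
        "name": "macroecon_semantic_search",
        "arguments": {"query_text": "inflation expectations", "max_results": 3},
    },
}

req = urllib.request.Request(
    "https://mcp.lambda-capture.com/v1/mcp/",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
        "Authorization": "Bearer YOUR_ACCESS_TOKEN",
    },
    method="POST",
)

# Uncomment to actually send the request (requires a valid token):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```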

Configure your MCP Client (Claude Desktop App)

Go to Claude -> Settings -> Developer -> Edit Config, then add the following to your claude_desktop_config.json:

Node:

{
    "mcpServers": {
        "lambda-capture-mcp": {
            "command": "npx",
            "args": [ "mcp-remote", "https://mcp.lambda-capture.com/v1/mcp/", "--header", "Authorization: Bearer YOUR_ACCESS_TOKEN" ],
            "description": "RemoteMCP with Lambda Capture Macroeconomic Data API"
        }
    }
}

Local MCP Server

Pre-requisites

Installation

  1. Clone the repo

Node:

  1. Run npm install to install the dependencies
  2. Run npm run build to build the project

Python:

  1. Run python -m venv .venv to create a virtual environment
  2. Run source .venv/bin/activate to activate it
  3. Run pip install -r requirements.txt to install the dependencies

Context Window Size

Adjust the maxTokens (.ts) or max_tokens (.py) variable to match the context window size of your model (this budget counts content tokens only, not metadata).
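As an illustration, enforcing such a content-token budget might look like the sketch below. The trim_to_budget helper and the whitespace-based token estimate are hypothetical stand-ins, not the server's actual tokenizer or API.

```python
max_tokens = 2000  # set to fit your model's context window

def trim_to_budget(chunks, budget=max_tokens):
    """Keep whole result chunks until the (approximate) token budget is spent."""
    kept, used = [], 0
    for chunk in chunks:
        cost = len(chunk.split())  # crude whitespace-based token estimate
        if used + cost > budget:
            break
        kept.append(chunk)
        used += cost
    return kept

results = ["short result", "another result " * 1000, "never reached"]
print(trim_to_budget(results, budget=50))  # → ['short result']
```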

Server Config

{
  "mcpServers": {
    "lambda-capture-server": {
      "command": "npx",
      "args": [
        "lambda-capture"
      ]
    }
  }
}

Links & Status

Repository: github.com
Hosted: Yes
Global: Yes
Official: Yes

Project Info

Created At: Jul 02, 2025
Updated At: Aug 07, 2025
Author: Lambda Capture Limited
Category: official
License: © 2025 Lambda Capture Limited (Registration Number 15845351)
Tags: development, location, documentation