# MCP Server for Intercom
An MCP-compliant server that enables AI assistants to access and analyze customer support data from Intercom.
## Features

- Search conversations and tickets with advanced filtering
- Filter by customer, status, date range, and keywords
- Search by email content, even when no matching contact exists
- Efficient server-side filtering via Intercom's search API
- Seamless integration with MCP-compliant AI assistants
## Installation

### Prerequisites

- Node.js 18.0.0 or higher
- An Intercom account with API access
- Your Intercom API token (available in your Intercom account settings)
### Quick Setup

#### Using NPM

```bash
# Install the package globally
npm install -g mcp-server-for-intercom

# Set your Intercom API token
export INTERCOM_ACCESS_TOKEN="your_token_here"

# Run the server
intercom-mcp
```
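Once the server is running, you can point an MCP client at it. As a sketch, a Claude-Desktop-style client configuration might look like the following (the `mcpServers` key and the config file location depend on your client; treat this shape as an assumption, not documented behavior of this project):

```json
{
  "mcpServers": {
    "intercom": {
      "command": "intercom-mcp",
      "env": {
        "INTERCOM_ACCESS_TOKEN": "your_token_here"
      }
    }
  }
}
```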
#### Using Docker

The default Docker configuration is optimized for Glama compatibility:

```bash
# Start Docker if it is not already running
# On Windows: start the Docker Desktop application
# On Linux: sudo systemctl start docker

# Build the image
docker build -t mcp-intercom .

# Run the container with your API token and port mappings
docker run --rm -it -p 3000:3000 -p 8080:8080 \
  -e INTERCOM_ACCESS_TOKEN="your_token_here" mcp-intercom:latest
```
**Validation steps:**

```bash
# Test the server status
curl -v http://localhost:8080/.well-known/glama.json

# Test the MCP endpoint
curl -X POST -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"mcp.capabilities"}' \
  http://localhost:3000
```
#### Alternative Standard Version

If you prefer a lighter version without the Glama-specific dependencies:

```bash
# Build the standard image
docker build -t mcp-intercom-standard -f Dockerfile.standard .

# Run the standard container
docker run --rm -it -p 3000:3000 -p 8080:8080 \
  -e INTERCOM_ACCESS_TOKEN="your_token_here" mcp-intercom-standard:latest
```
The default image includes the dependencies and configuration required for integration with the Glama platform; the standard image omits them and is therefore more lightweight.
## Available MCP Tools

### 1. `list_conversations`

Retrieves all conversations within a date range, with optional content filtering.

**Parameters:**

- `startDate` (DD/MM/YYYY) – start date (required)
- `endDate` (DD/MM/YYYY) – end date (required)
- `keyword` (string) – only include conversations containing this text
- `exclude` (string) – exclude conversations containing this text

**Notes:**

- The date range must not exceed 7 days
- Uses efficient server-side filtering via Intercom's search API
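Because the tool rejects ranges longer than 7 days, it can be worth validating the dates client-side before issuing the call. A minimal shell sketch (assumes GNU `date`; the DD/MM/YYYY values are examples, not defaults of this server):

```shell
# Validate that a DD/MM/YYYY range spans at most 7 days
# before calling list_conversations.
start="01/03/2024"
end="06/03/2024"

to_epoch() {
  # Convert DD/MM/YYYY to a Unix timestamp (GNU date).
  IFS=/ read -r d m y <<< "$1"
  date -d "$y-$m-$d" +%s
}

days=$(( ( $(to_epoch "$end") - $(to_epoch "$start") ) / 86400 ))

if [ "$days" -ge 0 ] && [ "$days" -le 7 ]; then
  echo "range ok: $days days"
else
  echo "invalid range: $days days" >&2
fi
```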
For detailed technical information about how this server integrates with Intercom's API, see `src/services/INTERCOM_API_NOTES.md`. This document explains the parameter mapping, Intercom endpoint usage, and implementation details for developers.
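As an illustration, an MCP client would typically invoke the tool with a JSON-RPC `tools/call` request along these lines (a sketch based on general MCP conventions; the field values are made-up examples):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "list_conversations",
    "arguments": {
      "startDate": "01/03/2024",
      "endDate": "06/03/2024",
      "keyword": "refund"
    }
  }
}
```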
## Development

```bash
# Clone and install dependencies
git clone https://github.com/raoulbia-ai/mcp-server-for-intercom.git
cd mcp-server-for-intercom
npm install

# Build and run for development
npm run build
npm run dev

# Run tests
npm test
```
## Disclaimer
This project is an independent integration and is not affiliated with, officially connected to, or endorsed by Intercom Inc. "Intercom" is a registered trademark of Intercom Inc.
## License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.