TigerGraph-MCP
Custom Discoveries TigerGraph-MCP V3.0 is a community-based Python Model Context Protocol server that exposes TigerGraph operations (queries, schema, vertices, edges, UDFs) as structured tools, prompts, and URI-based resources for MCP agents. It lets you interact with your TigerGraph database using natural-language commands!
To execute the Admin Features, your database user must hold the 'superuser' database role. None of the admin tools will appear in the /tools list if the user is not assigned the 'superuser' role.
displayService_Status
This tool displays the database service status, showing which services are online.
displayDetailed_Service_Status
This tool provides all the detail of displayService_Status, plus additional information about the graph, including vertices, edges, and installed queries.
displayComponent_Version
This tool displays the versions of all the components that make up TigerGraph Server.
displayCPUMemory_Usage
This tool displays both the CPU and memory usage of all the components that make up TigerGraph Server.
displayDiskSpace_Usage
This tool displays the disk usage of the different components that make up the TigerGraph database server.
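As a rough illustration only (not the project's actual implementation), an admin tool such as displayService_Status can be thought of as a formatter over a service-status mapping, gated on the 'superuser' role; the names `require_role` and `display_service_status` below are hypothetical:

```python
# Hypothetical sketch of an admin tool and its role gate; the real tools
# query the TigerGraph server rather than take a dict.

def require_role(user_roles, required="superuser"):
    """Return True only if the database user holds the required role."""
    return required in user_roles

def display_service_status(services):
    """Format a {service_name: is_online} mapping into a status report."""
    lines = []
    for name, online in sorted(services.items()):
        state = "online" if online else "offline"
        lines.append(f"{name}: {state}")
    return "\n".join(lines)

if __name__ == "__main__":
    if require_role({"superuser"}):
        print(display_service_status({"GSQL": True, "RESTPP": True, "GPE": False}))
```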
Features++ (Kick-Ass Features)
We include an mcp_chatbot that allows you to "chat" with the database. You will need to configure
two files. First, configure the mcp_server/.env file with the LLM you want to use
(the default is an Anthropic LLM model). Second, configure the server_config.json file,
under the mcp_server/mcp_chatbot folder, specifying the full path to the main.py file.
The chatbot usage is as follows:
@listQueries to list the available query output files
@<query_file_name> to list the contents of that query file
/tools to list the available tools under the MCP server
/prompts to list the available prompts under the MCP server
/resources is a planned future enhancement
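The commands above can be pictured as a small dispatcher in front of the LLM; this is an illustrative sketch, not the actual mcp_chatbot.py code, and it omits the MCP wiring entirely:

```python
# Hypothetical command router for the chatbot prompt. The real chatbot
# forwards /tools and /prompts to the MCP server; here they are plain lists.

def dispatch(command, tools=(), prompts=(), query_files=()):
    """Route one chatbot input line to the matching handler."""
    if command == "/tools":
        return list(tools)
    if command == "/prompts":
        return list(prompts)
    if command == "/resources":
        return "planned future enhancement"
    if command == "@listQueries":
        return list(query_files)
    if command.startswith("@"):
        name = command[1:]
        if name in query_files:
            return f"contents of {name}"
        return f"unknown query file: {name}"
    # Anything else is a natural-language query handed to the LLM.
    return f"LLM handles: {command}"
```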
Custom Discoveries TigerGraph-MCP V2.0 now includes enhanced functionality for exporting
query results to CSV or JSON file formats within a designated output directory. This output
directory can be configured through the mcp_server/.env file, with the default location
set to ./Output.
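The export behavior described above can be sketched with the standard library alone; the function name `export_results` and its signature are assumptions for illustration, not the project's API:

```python
# Sketch: write query results (a list of dict rows) to CSV or JSON inside
# an output directory (default ./Output, matching the .env default).
import csv
import json
from pathlib import Path

def export_results(rows, name, fmt="csv", output_dir="./Output"):
    """Write rows to <output_dir>/<name>.<fmt>, creating the directory if needed."""
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    path = out / f"{name}.{fmt}"
    if fmt == "json":
        path.write_text(json.dumps(rows, indent=2))
    elif fmt == "csv":
        with path.open("w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
            writer.writeheader()
            writer.writerows(rows)
    else:
        raise ValueError(f"unsupported format: {fmt}")
    return path
```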
We have developed some session-management features that help the developer handle the
TigerGraph session's SECRET and TOKEN management. In short, TigerGraph session
initialization checks the following:
Determines whether the database is running (ping)
Determines whether the graph specified in the .env file exists; if not, it creates the graph
Checks whether a Secret is registered for the combination of database user name and graph name
(i.e., the Secret Alias); if there is no Secret Alias, the system creates a secret and token and
writes them out to your .env file
Once the session is authenticated, it sets up a TigerGraphConnection
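The initialization steps above can be sketched as follows. This is a simplified illustration against a stub connection; the real session.py uses pyTigerGraph's TigerGraphConnection, and every method name on the stub is hypothetical:

```python
# Simplified sketch of the session-initialization flow: ping, ensure graph,
# ensure secret/token, return a live connection.

def init_session(conn, graph_name):
    """Run the session checks described in the README against conn."""
    if not conn.ping():                          # step 1: is the database running?
        raise ConnectionError("TigerGraph database is not running")
    if graph_name not in conn.list_graphs():     # step 2: create a missing graph
        conn.create_graph(graph_name)
    alias = f"{conn.username}_{graph_name}"      # secret alias: user name + graph name
    if not conn.has_secret(alias):               # step 3: create secret and token
        secret = conn.create_secret(alias)
        conn.token = conn.request_token(secret)  # the real code also writes these to .env
    return conn

class StubConn:
    """Minimal in-memory stand-in for a TigerGraph connection (illustrative)."""
    def __init__(self):
        self.username, self.graphs, self.secrets, self.token = "dba", [], {}, None
    def ping(self): return True
    def list_graphs(self): return self.graphs
    def create_graph(self, g): self.graphs.append(g)
    def has_secret(self, a): return a in self.secrets
    def create_secret(self, a): self.secrets[a] = "s3cr3t"; return "s3cr3t"
    def request_token(self, s): return "t0k3n"
```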
Project Structure
TigerGraph-MCP/
├── installUVEnvironment.sh    # Initializes the virtual environment if one exists; if not, it
│                              # creates one and loads it with the project dependencies in the
│                              # requirements.txt file
├── LICENSE                    # MIT License
├── main.py                    # MCP app bootstrap (`run_server()`), used to start the TigerGraph-MCP server
├── chatBot_Main.py            # Main Python program to invoke mcp_chatbot.py
├── mcp_server/
│   ├── .env                   # TigerGraph (HOST, GRAPH, SECRET) & LLM configuration parameters
│   ├── config.py              # Reads the environment config file (.env) and defines system constants
│   ├── mcp_logger.py          # Sets up the log handler and sets the logging level to ERROR
│   ├── mcp_chatbot/
│   │   ├── mcp_chatbot.py     # Chatbot for the LLM to interact with the TigerGraph MCP server (uses .env file)
│   │   └── server_config.json # Configuration file that defines the TigerGraph MCP server
│   └── tigerGraph/
│       ├── interface.py       # Interface definitions of client methods
│       ├── mcp_Server.py      # `@mcp.tool` and `@mcp.prompts` definitions, exposing client methods & prompts
│       ├── prettyPrintDir.py  # Implements pretty-print directory functionality
│       ├── services.py        # Implements service calls to the TigerGraph database
│       └── session.py         # Encapsulates TigerGraphConnection and core session operations
├── Outputs/                   # Output directory where query outputs are written (.csv or .json format)
├── tests/                     # Test directory containing all the test cases
│   └── README.md              # ReadMe file describing the test cases in the tests directory
├── pyproject.toml             # Project metadata & dependencies
├── README.md                  # This markdown README file
├── requirements.txt           # Python package dependencies
├── runChatBot.sh              # Unix shell script to run the TigerGraph ChatBot
└── .gitignore                 # GitHub/OS/Python ignore rules
Installation
Clone the repo
git clone https://github.com/customdiscoveries/TigerGraph_MCP.git
cd TigerGraph-MCP
Create, Install Dependencies & Activate a virtual environment
installUVEnvironment.sh
Configuration Setup
MCP TigerGraph Server
The assumption is that the developer has an instance of TigerGraph
running either in the cloud or on a Linux desktop. Furthermore, it
is assumed that the TigerGraph instance has a defined dba user
(with valid roles and password).
Please edit your .env configuration file (copied from .env-example)
with the following required attributes to create a TigerGraph session:
TG_HOST=http://localhost
TG_GRAPH=tigerGraph_Graph_name
TG_USERNAME=tigerGraph_dba_user_name
TG_PASSWORD=tigerGraph_dba_user_password
TG_SECRET=(automatically captured if you don't provide one)
TG_TOKEN=(automatically captured if you don't provide one)
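For illustration, the TG_* settings above can be loaded with a minimal .env parser like the one below; whether the project uses python-dotenv or its own loader in config.py is an assumption here, and `load_env` is a hypothetical name:

```python
# Minimal .env parser sketch: KEY=VALUE lines, skipping blanks and comments,
# and stripping surrounding quotes from values.

def load_env(text):
    """Parse .env-style text into a dict of settings."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip().strip("'\"")
    return config
```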
TigerGraph MCP ChatBot
The MCP ChatBot currently supports two LLM providers: OpenAI and Anthropic Claude.
You will need to modify the .env configuration file (the same one used by the MCP server)
and set the LLM_MODEL_FAMILY attribute.
LLM_MODEL_FAMILY='ANTHROPIC' # Currently supports: 'ANTHROPIC' or 'OPENAI'
ANTHROPIC_LLM_MODEL='claude-3-7-sonnet-20250219'
ANTHROPIC_API_KEY="Your Anthropic Key goes here"
ANTHROPIC_TOKENS=2024
OPENAI_LLM_MODEL='gpt-4.1-mini'
OPENAI_TOKENS=2024
OPENAI_API_KEY="Your Open AI Key goes here"
You will need to update the server_config.json file (the copy you made) and
edit the path in the run command to your full path name:
"args": ["run", "/Your full path goes here/MCP-Repository/TigerGraph-MCP/main.py"]
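Put in context, a complete server_config.json might look like the following; the "command" value ("uv") and the server name key are assumptions inferred from the args shown above, and the placeholder path should be replaced with your own full path:

```json
{
  "mcpServers": {
    "TigerGraph-MCP": {
      "command": "uv",
      "args": ["run", "/Your full path goes here/MCP-Repository/TigerGraph-MCP/main.py"]
    }
  }
}
```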
Running Chat Bot
Once you have completed all of your configuration setup, it is time to see the fruit of your labor!
To run the Chat Bot simply invoke the script runChatBot.sh
. runChatBot.sh
You should see a Welcome Banner:
Welcome to Custom Discoveries TigerGraph MCP Chatbot!
Type your queries or type ['quit'|'exit'] to exit.
Use /tools to list available tools
Use /prompts to list available prompts
Query:
At the Query prompt, type /tools to get a list of available tools for interacting with the TigerGraph graph database.
Claude Desktop Integration
To register the server with Claude Desktop, edit the claude_desktop_config.json file:
Windows: %APPDATA%\Claude\claude_desktop_config.json
After configuring the claude_desktop_config.json file, restart Claude Desktop and you'll see your MCP tools available via the hammer 🛠 icon.
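A minimal claude_desktop_config.json entry for this server might look like the sketch below; the server key ("tigergraph") and the "uv" command are assumptions, and the placeholder path must be replaced with your own full path to main.py:

```json
{
  "mcpServers": {
    "tigergraph": {
      "command": "uv",
      "args": ["run", "/Your full path goes here/MCP-Repository/TigerGraph-MCP/main.py"]
    }
  }
}
```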
Contributing
Fork the repository
Create a feature branch
git checkout -b feature/YourFeature
Commit your changes
git commit -m "Add YourFeature"
Push to branch
git push origin feature/YourFeature
Open a Pull Request
Please ensure all new code is covered by tests and follows PEP-8 style.