# MCP Server for Vertex AI Search
This is an MCP server to search documents using Vertex AI Search.
## Architecture
This solution uses Gemini with Vertex AI grounding to search documents using your private data. Grounding improves the quality of search results by anchoring Gemini's responses in your data stored in a Vertex AI data store. One or more Vertex AI data stores can be integrated with the MCP server. For more details on grounding, refer to the [Vertex AI Grounding documentation](https://cloud.google.com/vertex-ai/generative-ai/docs/grounding/overview).
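For illustration, here is a minimal sketch of what a grounded Gemini call looks like with the Vertex AI Python SDK. The project ID, location, and datastore path are placeholders, and this is the general pattern the server builds on, not the server's actual code.

```python
import vertexai
from vertexai.generative_models import GenerativeModel, Tool, grounding

# Placeholders: substitute your own project, location, and data store ID.
PROJECT_ID = "your-project-id"
LOCATION = "us-central1"
DATASTORE = (
    f"projects/{PROJECT_ID}/locations/global/"
    "collections/default_collection/dataStores/your-datastore-id"
)

vertexai.init(project=PROJECT_ID, location=LOCATION)

# Ground the model's answers in the documents of a Vertex AI data store.
search_tool = Tool.from_retrieval(
    grounding.Retrieval(grounding.VertexAISearch(datastore=DATASTORE))
)

model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    "What does our internal handbook say about incident response?",
    tools=[search_tool],
)
print(response.text)
```

Integrating multiple data stores amounts to building one such retrieval tool per data store and exposing each as a separate MCP tool.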
## How to use
There are two ways to use this MCP server. If you want to run it on Docker, the first approach is a good fit, as a Dockerfile is provided in the project.
### 1. Clone the repository
```bash
# Clone the repository
git clone git@github.com:ubie-oss/mcp-vertexai-search.git

# Create a virtual environment
uv venv

# Install the dependencies
uv sync --all-extras

# Check the command
uv run mcp-vertexai-search
```
### 2. Install the Python package
The package isn't published to PyPI yet, but we can install it from the repository. A config file derived from config.yml.template is needed to run the MCP server, because the Python package doesn't include the config template. Please refer to Appendix A: Config file for the details of the config file; a rough sketch of what such a file might contain follows the install commands below.
```bash
# Install the package
pip install git+https://github.com/ubie-oss/mcp-vertexai-search.git

# Check the command
mcp-vertexai-search --help
```
### Run the MCP server

This supports two transports: SSE (Server-Sent Events) and stdio (standard input/output). We can control the transport by setting the --transport flag.

We can configure the MCP server with a YAML file. config.yml.template is a template for the config file. Please modify the config file to fit your needs; an example invocation is sketched below.
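Putting it together, launching the server might look like this. The serve subcommand and --config flag are assumptions for illustration, so run mcp-vertexai-search --help to confirm the exact interface.

```bash
# Assumed invocation: confirm the subcommand and flags with --help.
# --transport accepts stdio or sse, per the section above.
uv run mcp-vertexai-search serve \
    --config config.yml \
    --transport stdio
```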