
Minima


Minima is an open-source, on-premises RAG solution that runs as containers and integrates with ChatGPT and MCP.


What is Minima?

MCP server for RAG on local files

Documentation

Running as Containers

  1. Create a .env file in the project’s root directory (the same folder that contains env.sample) and copy all environment variables from env.sample into it.
  2. Ensure your .env file includes the following variables:
     - LOCAL_FILES_PATH
     - EMBEDDING_MODEL_ID
     - EMBEDDING_SIZE
     - OLLAMA_MODEL
     - RERANKER_MODEL
     - USER_ID (required for ChatGPT integration; just use your email)
     - PASSWORD (required for ChatGPT integration; just use any password)
  3. For a fully local installation, run: docker compose -f docker-compose-ollama.yml --env-file .env up --build
  4. For a ChatGPT-enabled installation, run: docker compose -f docker-compose-chatgpt.yml --env-file .env up --build
  5. For MCP integration (Anthropic Claude Desktop app usage), run: docker compose -f docker-compose-mcp.yml --env-file .env up --build
  6. For the ChatGPT-enabled installation, copy the OTP from the terminal where you launched Docker and use it with Minima GPT.
  7. If you use Anthropic Claude, add the following to /Library/Application\ Support/Claude/claude_desktop_config.json (a sketch that automates this edit follows the list):

{
  "mcpServers": {
    "minima": {
      "command": "uv",
      "args": [
        "--directory",
        "/path_to_cloned_minima_project/mcp-server",
        "run",
        "minima"
      ]
    }
  }
}

  8. To use the fully local installation, run cd electron, then npm install and npm start, which will launch the Minima Electron app.
  9. Ask anything, and you'll get answers based on the local files in the {LOCAL_FILES_PATH} folder.
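As referenced in step 7, here is a minimal Python sketch (standard library only) that merges the "minima" entry into claude_desktop_config.json. The config path is the one quoted above and the project path is a placeholder; adjust both to your machine.

import json
from pathlib import Path

# Path as written in step 7; on many setups the file lives under
# ~/Library/Application Support/Claude/ instead.
config_path = Path("/Library/Application Support/Claude/claude_desktop_config.json")

# Entry from step 7; replace the placeholder with your cloned Minima project path.
entry = {
    "command": "uv",
    "args": ["--directory", "/path_to_cloned_minima_project/mcp-server", "run", "minima"],
}

config = json.loads(config_path.read_text()) if config_path.exists() else {}
config.setdefault("mcpServers", {})["minima"] = entry
config_path.write_text(json.dumps(config, indent=2))
print("updated", config_path)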

Variables Explained

- LOCAL_FILES_PATH: Specify the root folder for indexing (on your cloud or local PC). Indexing is recursive, so all documents within subfolders of this root folder will also be indexed. Supported file types: .pdf, .xls, .docx, .txt, .md, .csv.
- EMBEDDING_MODEL_ID: Specify the embedding model to use. Currently, only Sentence Transformer models are supported. Testing has been done with sentence-transformers/all-mpnet-base-v2, but other Sentence Transformer models can be used.
- EMBEDDING_SIZE: Define the embedding dimension provided by the model, which is needed to configure Qdrant vector storage. Ensure this value matches the actual embedding size of the specified EMBEDDING_MODEL_ID.
- OLLAMA_MODEL: Set the Ollama model, using an ID available on the Ollama site. Please use an LLM model ID here, not an embedding model.
- RERANKER_MODEL: Specify the reranker model. Currently, testing has been done with BAAI rerankers; other rerankers can also be used.
- USER_ID: Just use your email here; this is needed to authenticate the custom GPT to search your data.
- PASSWORD: Put any password here; this is used to create a Firebase account for the email specified above.
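To confirm that EMBEDDING_SIZE matches the dimension actually produced by EMBEDDING_MODEL_ID before starting the containers, a minimal Python sketch can help; it assumes the sentence-transformers package is installed locally and uses the model tested above.

from sentence_transformers import SentenceTransformer

# Value of EMBEDDING_MODEL_ID from your .env file.
model_id = "sentence-transformers/all-mpnet-base-v2"
model = SentenceTransformer(model_id)

# For all-mpnet-base-v2 this prints 768, the value to use for EMBEDDING_SIZE.
print(model.get_sentence_embedding_dimension())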

Examples

Example of .env file for on-premises/local usage:

LOCAL_FILES_PATH=/Users/davidmayboroda/Downloads/PDFs/
EMBEDDING_MODEL_ID=sentence-transformers/all-mpnet-base-v2
EMBEDDING_SIZE=768
OLLAMA_MODEL=qwen2:0.5b # must be an LLM model id from the Ollama models page
RERANKER_MODEL=BAAI/bge-reranker-base # please choose any BAAI reranker model

To use the chat UI, navigate to http://localhost:3000.

Example of .env file for the Claude app:

LOCAL_FILES_PATH=/Users/davidmayboroda/Downloads/PDFs/
EMBEDDING_MODEL_ID=sentence-transformers/all-mpnet-base-v2
EMBEDDING_SIZE=768

For the Claude app, apply the changes to the claude_desktop_config.json file as outlined above.

To use MCP with GitHub Copilot:

  1. Create a .env file in the project’s root directory (the same folder that contains env.sample) and copy all environment variables from env.sample into it.
  2. Ensure your .env file includes the following variables:
     - LOCAL_FILES_PATH
     - EMBEDDING_MODEL_ID
     - EMBEDDING_SIZE
  3. Create or update .vscode/mcp.json with the following configuration:

{
  "servers": {
    "minima": {
      "type": "stdio",
      "command": "path_to_cloned_minima_project/run_in_copilot.sh",
      "args": [
        "path_to_cloned_minima_project"
      ]
    }
  }
}

Example of .env file for ChatGPT custom GPT usage:

LOCAL_FILES_PATH=/Users/davidmayboroda/Downloads/PDFs/
EMBEDDING_MODEL_ID=sentence-transformers/all-mpnet-base-v2
EMBEDDING_SIZE=768
[email protected] # your real email
PASSWORD=password # any password you want

Also, you can run Minima using run.sh.
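Before starting the containers, a quick check that the .env file actually contains the variables listed above can save a failed startup. This is a minimal Python sketch assuming the python-dotenv package is installed; trim the required list to match the installation type you chose.

import os

from dotenv import load_dotenv

# Read .env from the project root (run the script from that directory).
load_dotenv(".env")

# Variables for the fully local installation; for MCP-only usage, trim to
# LOCAL_FILES_PATH, EMBEDDING_MODEL_ID, and EMBEDDING_SIZE.
required = [
    "LOCAL_FILES_PATH",
    "EMBEDDING_MODEL_ID",
    "EMBEDDING_SIZE",
    "OLLAMA_MODEL",
    "RERANKER_MODEL",
]
missing = [name for name in required if not os.getenv(name)]
print("missing variables:", ", ".join(missing) if missing else "none")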

Installing via Smithery (MCP usage)

To install Minima for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install minima --client claude

For MCP usage, please make sure your local machine's Python is >= 3.10 and 'uv' is installed.
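A minimal Python sketch (standard library only) to confirm those two MCP prerequisites on your machine:

import shutil
import sys

# MCP usage requires Python >= 3.10 and the 'uv' binary on PATH (see above).
assert sys.version_info >= (3, 10), "Python >= 3.10 is required"
uv_path = shutil.which("uv")
assert uv_path is not None, "'uv' was not found on PATH"
print("Python", sys.version.split()[0], "OK; uv found at", uv_path)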

Server Config

{
  "mcpServers": {
    "minima-server": {
      "command": "npx",
      "args": [
        "minima"
      ]
    }
  }
}
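To verify the MCP server from a script rather than through Claude Desktop, the sketch below uses the MCP Python SDK (the mcp package) to start Minima over stdio with the same uv command shown earlier and list the tools it exposes. The project path is a placeholder, and the tool names printed are simply whatever the server reports.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Same command/args as the claude_desktop_config.json entry above;
# replace the placeholder path with your cloned Minima project.
server_params = StdioServerParameters(
    command="uv",
    args=["--directory", "/path_to_cloned_minima_project/mcp-server", "run", "minima"],
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())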

Links & Status

Repository: github.com
Hosted: No
Global: No
Official: Yes

Project Info

Created At: May 23, 2025
Updated At: Aug 07, 2025
Author: dmayboroda
Category: community
License: Mozilla Public License v2.0 (MPLv2)
Tags: development, location, documentation