Search the Bible reliably and repeatably. [ai-Bible Labs](https://ai-bible.com)
# ai-Bible
## mcp-server for Claude Desktop and other MCP clients
The mcp-server directory contains the current implementation of a server for reliably and repeatably retrieving Bible verses when working with LLMs. Claude Desktop can be configured to use the mcp-server.stdio.js file, built into the build folder of this project, as an mcp-server. See the README.md in that subfolder for detailed information.
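As an illustration, a Claude Desktop configuration entry for this server might look like the following sketch. The server name and the path are assumptions; adjust the path to the build folder of your checkout:

```json
{
  "mcpServers": {
    "ai-bible": {
      "command": "node",
      "args": ["/path/to/ai-bible/mcp-server/build/mcp-server.stdio.js"]
    }
  }
}
```

This entry goes in Claude Desktop's claude_desktop_config.json; restart Claude Desktop after editing it so the server is picked up.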
## docker-container for completions
The Docker container wraps the mcp-server using mcpo, turning it into a server that supports the OpenAI completions API. Run these commands from the project root after building the mcp-server.
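The exact commands are not reproduced here; a minimal sketch, assuming a Dockerfile at the project root, an image name of your choosing, and mcpo listening on its default port 8000, might be:

```shell
# Build the image from the project root (image name is an assumption)
docker build -t ai-bible-mcpo .

# Run the container, exposing the API on the host
docker run --rm -p 8000:8000 ai-bible-mcpo
```

Check the project's own Dockerfile and scripts for the authoritative image name and port.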
One way to access the completions API is via Open WebUI; you can then run everything locally with an LLM served by Ollama, using a model such as Llama 3.1 8B, see:
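To show what a call against such an OpenAI-compatible endpoint looks like, here is a hedged Python sketch. The base URL, port, and model name are assumptions for a local Ollama/mcpo setup; substitute whatever your deployment actually exposes:

```python
import json

# Assumption: an OpenAI-compatible chat completions endpoint on localhost
BASE_URL = "http://localhost:8000/v1/chat/completions"

def build_request(question: str, model: str = "llama3.1:8b") -> dict:
    """Build an OpenAI-style chat completions payload asking for a verse."""
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": question},
        ],
    }

payload = build_request("Quote John 3:16 exactly.")
body = json.dumps(payload)

# To actually send the request (requires the container to be running):
# import urllib.request
# req = urllib.request.Request(
#     BASE_URL,
#     data=body.encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
print(body)
```

The commented-out send step uses only the standard library, so no extra client package is needed to try it against a running container.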