Inbox Zero: an open source email app to reach inbox zero fast
Inbox Zero is an AI email assistant and an open source AI email client that help you spend less time on email. The project encourages contributions, especially to the email client.
Documentation
About
There are two parts to Inbox Zero:
An AI email assistant that helps you spend less time on email.
An open source AI email client.
If you're looking to contribute to the project, the email client is the best place to do this.
Features
AI Personal Assistant: Manages your email for you based on a plain text prompt file. It can take any action a human assistant can take on your behalf (Draft reply, Label, Archive, Reply, Forward, Mark Spam, and even call a webhook).
Reply Zero: Track emails that need your reply and those awaiting responses.
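For illustration, the plain text prompt file for the assistant might contain rules like these (a hypothetical example, not the exact syntax the app expects):

```
* If a newsletter arrives, label it "Newsletter" and archive it.
* If someone asks to schedule a call, draft a reply with my calendar link.
* If an email looks like cold outreach, mark it as spam.
```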
We offer a hosted version of Inbox Zero at https://getinboxzero.com. To self-host, follow the steps below.
Setup
Here's a video on how to set up the project. It covers the same steps mentioned in this document, but goes into greater detail on setting up the external services.
In the Azure Portal, go to "App registrations" in the left sidebar or search for it in the search bar
Click "New registration"
Choose a name for your application
Under "Supported account types" select "Accounts in any organizational directory (Any Azure AD directory - Multitenant) and personal Microsoft accounts (e.g. Skype, Xbox)"
Set the Redirect URI:
Platform: Web
URL: http://localhost:3000/api/auth/callback/microsoft-entra-id
Click "Register"
In the "Manage" menu click "Authentication (Preview)"
Add the Redirect URI: http://localhost:3000/api/outlook/linking/callback
Get your credentials:
The "Application (client) ID" shown is your MICROSOFT_CLIENT_ID
To get your client secret:
Click "Certificates & secrets" in the left sidebar
Click "New client secret"
Add a description and choose an expiry
Click "Add"
Copy the secret Value (not the ID) - this is your MICROSOFT_CLIENT_SECRET
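The two values map to environment variables in your .env file (the values below are placeholders):

```
MICROSOFT_CLIENT_ID=your-application-client-id
MICROSOFT_CLIENT_SECRET=your-client-secret-value
```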
Configure API permissions:
In the "Manage" menu click "API permissions" in the left sidebar
Note: If you need to access Ollama hosted locally and the application is running on Docker setup, you can use http://host.docker.internal:11434/api as the base URL. You might also need to set OLLAMA_HOST to 0.0.0.0 in the Ollama configuration file.
You can select the model you wish to use in the app on the /settings page of the app.
If you are using local Ollama, you can set it as the default:
DEFAULT_LLM_PROVIDER=ollama
In that case, you must also set the ECONOMY_LLM_PROVIDER environment variable.
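Putting the Ollama settings together, a .env fragment might look like the following. This assumes the base URL is configured via an OLLAMA_BASE_URL variable (check the project's .env.example for the exact name); pointing ECONOMY_LLM_PROVIDER at Ollama too is just one option:

```
OLLAMA_BASE_URL=http://localhost:11434/api
DEFAULT_LLM_PROVIDER=ollama
ECONOMY_LLM_PROVIDER=ollama
```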
Redis and Postgres
We use Postgres for the database.
For Redis, you can use Upstash Redis or set up your own Redis instance.
You can run Postgres & Redis locally using docker-compose:
docker-compose up -d # -d will run the services in the background
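For reference, a minimal compose file for the two services looks roughly like this. The repository ships its own docker-compose.yml, which takes precedence; the images, ports, and credentials below are illustrative only:

```yaml
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password   # illustrative; use your own credentials
      POSTGRES_DB: inboxzero
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```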
Running the app
To run the migrations:
pnpm prisma migrate dev
To run the app locally for development (slower):
pnpm run dev
Or from the project root:
turbo dev
To build and run the app locally in production mode (faster):
pnpm run build
pnpm start
Many features are available only to premium users. To upgrade yourself, make yourself an admin in the .env file (e.g. ADMINS=you@example.com).
Then upgrade yourself at: http://localhost:3000/admin.
Set up push notifications via Google PubSub to handle emails in real time
Set env var GOOGLE_PUBSUB_TOPIC_NAME.
When creating the subscription, select Push; the URL should look something like https://www.getinboxzero.com/api/google/webhook?token=TOKEN or https://abc.ngrok-free.app/api/google/webhook?token=TOKEN, where the domain is your own. Set GOOGLE_PUBSUB_VERIFICATION_TOKEN in your .env file to the value of TOKEN.
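The token check on the webhook can be sketched roughly as below. This is a hypothetical helper, not the actual Inbox Zero code; it assumes the handler compares the token query parameter of the push URL against the configured verification token:

```typescript
// Hypothetical sketch of validating a PubSub push request's token.
// The real handler lives at /api/google/webhook in the app.
function isValidPubSubRequest(requestUrl: string, expectedToken: string): boolean {
  // The subscription's push URL carries ?token=TOKEN; compare it to our secret.
  const token = new URL(requestUrl).searchParams.get("token");
  return token !== null && token === expectedToken;
}
```

Requests whose token is missing or mismatched should be rejected before any email processing happens.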
When running in development, ngrok can be helpful:
ngrok http 3000

# or with an ngrok domain to keep your endpoint stable (set `XYZ`):
ngrok http --domain=XYZ.ngrok-free.app 3000
Here are some easy ways to run cron jobs. Upstash is a free, easy option. I could never get the Vercel vercel.json approach to work. Open to PRs if you find a fix for that.
Contributing to the project
You can view open tasks in our GitHub Issues.
Join our Discord to discuss tasks and check what's being worked on.
ARCHITECTURE.md explains the architecture of the project (LLM generated).