Local AI, zero compromise
Your AI. Your hardware.
No one else.
Open Accountant runs small language models on your machine via Ollama. Your bank data never leaves your laptop. No API keys. No subscriptions. No cloud.
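Under the hood, Ollama serves models over a local HTTP API, listening on `localhost:11434` by default, so inference requests never leave your machine. A minimal sketch in Python; the prompt wording and `categorize` helper are illustrative, not Open Accountant's actual internals:

```python
import json
from urllib import request

# Ollama's default local endpoint -- no remote servers involved.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(description, model="llama3.2"):
    """Build a categorization request for Ollama's /api/generate endpoint.
    The prompt here is illustrative, not Open Accountant's actual prompt."""
    return {
        "model": model,
        "prompt": f"Categorize this bank transaction in one word: {description}",
        "stream": False,  # ask for a single JSON response instead of a stream
    }

def categorize(description, model="llama3.2"):
    """Send the request to the local Ollama server and return the reply text."""
    data = json.dumps(build_payload(description, model)).encode()
    req = request.Request(OLLAMA_URL, data=data,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()
```

With Ollama running (`ollama serve`), `categorize("STARBUCKS #4411")` returns whatever the local model produces; unplug your network and it still works.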
Why local AI?
Every cloud AI service processes your data on someone else's servers. We took a different path.
Privacy
Your transactions stay on your machine. No API calls, no third-party servers, no data sharing agreements to read. Your finances are nobody's business.
Speed
No network latency. Responses come as fast as your hardware can produce them. Categorize 200 transactions in seconds, not minutes.
Cost
Zero API fees, forever. Run as many queries as you want. No per-token billing. No surprise invoices. The model is free, the inference is free.
Tested & recommended
Models that work
Small language models that run on a laptop and handle financial tasks well.
| Model | Size | RAM | Quality | Best for |
|---|---|---|---|---|
| llama3.2:3b | 2.0 GB | 8 GB | Good | Daily categorization |
| llama3.1:8b | 4.7 GB | 16 GB | Great | Financial analysis |
| phi4-mini | 2.2 GB | 8 GB | Good | Low-RAM machines |
| mistral | 4.1 GB | 16 GB | Great | Balanced all-rounder |
| gemma3:4b | 3.3 GB | 8 GB | Good | Summaries & categorization |
| qwen2.5:7b | 4.4 GB | 16 GB | Great | Multilingual data |
| deepseek-r1:8b | 4.9 GB | 16 GB | Great | Complex reasoning |
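As a rough guide, the table above can be read as a picker: given your machine's RAM, choose from the largest tier that fits. A sketch whose tiers simply mirror the table (the 8 GB / 16 GB cut-offs are the table's own, not hard limits):

```python
# Model recommendations derived directly from the table above:
# (min_ram_gb, model, best_for)
MODELS = [
    (8,  "llama3.2:3b",    "Daily categorization"),
    (8,  "phi4-mini",      "Low-RAM machines"),
    (8,  "gemma3:4b",      "Summaries & categorization"),
    (16, "llama3.1:8b",    "Financial analysis"),
    (16, "mistral",        "Balanced all-rounder"),
    (16, "qwen2.5:7b",     "Multilingual data"),
    (16, "deepseek-r1:8b", "Complex reasoning"),
]

def models_for(ram_gb):
    """Return the models from the table that fit in `ram_gb` of RAM."""
    return [name for min_ram, name, _ in MODELS if ram_gb >= min_ram]
```

On an 8 GB machine, `models_for(8)` narrows the table to the three small models; 16 GB opens up all seven.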
3 commands
Running in 2 minutes
Install Ollama, pull a model, and point Open Accountant at it. That's it. No account setup. No API keys. No billing dashboard.
1. Install Ollama via Homebrew (or curl on Linux)
2. Pull a model — we recommend llama3.2 to start
3. Set DEFAULT_MODEL in your .env file
$ brew install ollama
> ollama 0.6.2 installed
$ ollama pull llama3.2
> pulling llama3.2:latest... 100%
> success
$ echo 'DEFAULT_MODEL=ollama:llama3.2' >> .env
$ oa categorize --auto
> Categorizing via Ollama (llama3.2)...
> 147/147 categorized. 94% confidence.
$
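Once the transcript above finishes, you can confirm the server really is local: Ollama answers on `localhost:11434` and nowhere else. A quick check, sketched in Python against Ollama's standard `/api/tags` model-listing endpoint:

```python
import json
from urllib import request, error

def ollama_base(host="localhost", port=11434):
    """Ollama's default listen address -- a local loopback server."""
    return f"http://{host}:{port}"

def list_local_models(base=None, timeout=2.0):
    """Return installed model names via Ollama's /api/tags endpoint,
    or None if no server is listening."""
    base = base or ollama_base()
    try:
        with request.urlopen(f"{base}/api/tags", timeout=timeout) as resp:
            tags = json.loads(resp.read())
        return [m["name"] for m in tags.get("models", [])]
    except (error.URLError, OSError):
        return None
```

If `list_local_models()` returns `None`, start the server with `ollama serve`; after `ollama pull llama3.2`, the pulled model should appear in the list.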
Your money. Your machine.
Install Open Accountant, pull an Ollama model, and take control of your finances — without giving anyone else access.