Local AI, zero compromise

Your AI. Your hardware.
No one else.

Open Accountant runs small language models on your machine via Ollama. Your bank data never leaves your laptop. No API keys. No subscriptions. No cloud.

Why local AI?

Every cloud AI service processes your data on someone else's servers. We took a different path.

Privacy

Your transactions stay on your machine. No API calls, no third-party servers, no data sharing agreements to read. Your finances are nobody's business.

Speed

No network latency. Responses come as fast as your hardware can produce them. Categorize 200 transactions in seconds, not minutes.

Cost

Zero API fees, forever. Run as many queries as you want. No per-token billing. No surprise invoices. The model is free, the inference is free.

Tested & recommended

Models that work

Small language models that run on a laptop and handle financial tasks well.

| Model          | Size   | RAM   | Quality | Best for                   |
|----------------|--------|-------|---------|----------------------------|
| llama3.2:3b    | 2.0 GB | 8 GB  | Good    | Daily categorization       |
| llama3.1:8b    | 4.7 GB | 16 GB | Great   | Financial analysis         |
| phi-4-mini     | 2.2 GB | 8 GB  | Good    | Low-RAM machines           |
| mistral        | 4.1 GB | 16 GB | Great   | Balanced all-rounder       |
| gemma3:4b      | 3.3 GB | 8 GB  | Good    | Summaries & categorization |
| qwen2.5:7b    | 4.4 GB | 16 GB | Great   | Multilingual data          |
| deepseek-r1:8b | 4.9 GB | 16 GB | Great   | Complex reasoning          |
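The rule of thumb in the table: 8 GB machines should stay with the ~3B models, while 16 GB machines can run the 7–8B tier. A minimal sketch of that choice in shell (`recommend_model` is a hypothetical helper for illustration, not part of Open Accountant or Ollama):

```shell
# Hypothetical helper: map available RAM (in GB) to a model tier
# from the table above. Not part of Open Accountant or Ollama.
recommend_model() {
  ram_gb="$1"
  if [ "$ram_gb" -ge 16 ]; then
    echo "llama3.1:8b"   # 16 GB+: the larger "Great" tier fits comfortably
  else
    echo "llama3.2:3b"   # 8 GB: stay with the small "Good" tier
  fi
}

recommend_model 16   # prints llama3.1:8b
recommend_model 8    # prints llama3.2:3b
```

Pair it with Ollama's real pull command, e.g. `ollama pull "$(recommend_model 16)"`, to fetch the matching model.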

3 commands

Running in 2 minutes

Install Ollama, pull a model, and point Open Accountant at it. That's it. No account setup. No API keys. No billing dashboard.

1. Install Ollama via Homebrew (or curl on Linux)

2. Pull a model — we recommend llama3.2 to start

3. Set DEFAULT_MODEL in your .env file

terminal

$ brew install ollama

> ollama 0.6.2 installed

$ ollama pull llama3.2

> pulling llama3.2:latest... 100%

> success

$ echo 'DEFAULT_MODEL=ollama:llama3.2' >> .env

$ oa categorize --auto

> Categorizing via Ollama (llama3.2)...

> 147/147 categorized. 94% confidence.

$

Your money. Your machine.

Install Open Accountant, pull an Ollama model, and take control of your finances — without giving anyone else access.