Vijayakanth Manoharan 3e6231b654 Teach local AI from user category corrections
- Add MerchantCorrection model: upsert by merchantName, Category enum
- Check corrections DB first in suggestCategoryForMerchant (source: "learned",
  no confirmation required); falls through to rules then Ollama if no match
- Inject recent corrections as few-shot examples in the Ollama prompt so the
  model improves even for merchants not yet explicitly corrected
- Add POST /categories/correct route to persist corrections
- Detect category override on form save (suggestedCategory !== chosen category)
  and silently fire a correction — no extra UX required
- Fix test isolation: beforeEach re-applies vi.fn() defaults after restoreAllMocks

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-23 17:28:26 -04:00
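The fallback chain described in the commit (learned corrections first, then static rules, then Ollama) can be sketched as follows. This is an illustrative stand-in, not the repository's code: the function and source names follow the commit message, but the in-memory map replaces the Prisma-backed MerchantCorrection table and the rule/Ollama layers are stubs.

```typescript
// Sketch of the suggestion fallback: learned → rules → Ollama.
type Suggestion = {
  category: string;
  source: "learned" | "rules" | "ollama";
  needsConfirmation: boolean;
};

// Stand-in for the MerchantCorrection table (keyed by merchantName).
const corrections = new Map<string, string>();

// Stand-in for the static rule layer.
const rules: Array<[RegExp, string]> = [[/coffee|cafe/i, "Dining"]];

async function askOllama(merchant: string): Promise<string> {
  // The real implementation would POST to OLLAMA_URL, injecting recent
  // corrections as few-shot examples in the prompt (per the commit).
  return "Other";
}

async function suggestCategoryForMerchant(merchant: string): Promise<Suggestion> {
  // 1. Learned corrections win and need no confirmation.
  const learned = corrections.get(merchant.toLowerCase());
  if (learned) return { category: learned, source: "learned", needsConfirmation: false };
  // 2. Fall through to static rules.
  for (const [pattern, category] of rules) {
    if (pattern.test(merchant)) return { category, source: "rules", needsConfirmation: false };
  }
  // 3. Last resort: ask the local model; the user should confirm.
  return { category: await askOllama(merchant), source: "ollama", needsConfirmation: true };
}

// What POST /categories/correct would persist (upsert by merchant name).
function recordCorrection(merchant: string, category: string): void {
  corrections.set(merchant.toLowerCase(), category);
}
```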

Monthly Tracker

Private monthly expense tracking with local-first storage, offline category suggestions, and offline monthly insights via Ollama.

Local app

  1. Install dependencies:
npm install
  2. Create a local env config from .env.example and adjust the runtime settings as needed:
cp .env.example .env
  3. Apply migrations and start the app:
npx prisma migrate deploy
npm run dev
  4. Keep Ollama running with the configured model:
ollama serve
ollama pull qwen3.5:9b

Docker Compose

Run the app in Docker while keeping Ollama on the host:

docker compose up --build

This compose stack will:

  • start only the Next.js app on http://localhost:3000
  • keep SQLite data in a named Docker volume
  • connect to host Ollama through host.docker.internal
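A minimal compose file matching the behavior above might look like this. It is a sketch under assumptions: the service and volume names, the container data path, and the host-gateway mapping are illustrative — consult the repository's docker-compose.yml for the actual definitions.

```yaml
services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      # Reach the host's Ollama from inside the container.
      OLLAMA_URL: http://host.docker.internal:11434/
      DATABASE_URL: file:/data/dev.db
    volumes:
      - sqlite-data:/data   # SQLite persists in a named volume
    extra_hosts:
      - "host.docker.internal:host-gateway"   # required on Linux hosts

volumes:
  sqlite-data:
```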

Before running Docker Compose, make sure host Ollama is already up:

ollama serve
ollama pull qwen3.5:9b

If you run the app outside Docker, keep using:

OLLAMA_URL=http://127.0.0.1:11434/

In-app helpers

  • Use the dashboard runtime panel to refresh Ollama status.
  • If the configured model is missing, use Pull configured model from the UI.
  • Use Download backup to export the current SQLite database file.

Environment

  • DATABASE_URL - Prisma SQLite connection string
  • OLLAMA_URL - Ollama base URL; in Docker Compose this defaults to http://host.docker.internal:11434/
  • OLLAMA_MODEL - selected model tag, default qwen3.5:9b
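Putting the three variables together, a typical .env for local (non-Docker) use could look like this; the SQLite file path is illustrative, so keep whatever .env.example ships with:

```
DATABASE_URL="file:./dev.db"
OLLAMA_URL=http://127.0.0.1:11434/
OLLAMA_MODEL=qwen3.5:9b
```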