feat: containerize with Docker Compose + Ollama

Add full Docker setup so the app runs with a single 'docker compose up':

- Dockerfile: multi-stage build (node:22-alpine) for the SvelteKit app
- docker-compose.yml: three services:
  1. ollama: runs Ollama server with persistent volume for models
  2. model-init: one-shot container that pulls the configured model
     after Ollama is healthy, then exits
  3. app: the SvelteKit app, starts only after model-init succeeds
- .env.docker: set OLLAMA_MODEL to control which model is pulled
- .dockerignore: keeps image lean
- Switched adapter-auto to adapter-node (required for Docker/Node hosting)
- Updated README with Docker and local dev instructions
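
The service ordering described above (ollama healthy → model-init pulls and exits → app starts) can be sketched in compose form. This is a sketch, not the committed file: image tags, ports, volume name, and the healthcheck command are assumptions.

```yaml
# Sketch of the three-service layout; details are assumptions, not the actual file.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama-data:/root/.ollama      # persists pulled models across restarts
    healthcheck:
      test: ["CMD", "ollama", "list"]  # succeeds once the server answers
      interval: 5s
      retries: 12

  model-init:
    image: ollama/ollama
    depends_on:
      ollama:
        condition: service_healthy     # wait for the healthcheck above
    environment:
      - OLLAMA_HOST=http://ollama:11434
    entrypoint: ["ollama", "pull"]
    command: ["${OLLAMA_MODEL:-llama3}"]  # one-shot: pull the model, then exit
    restart: "no"

  app:
    build: .
    depends_on:
      model-init:
        condition: service_completed_successfully  # only start after the pull
    ports:
      - "3000:3000"

volumes:
  ollama-data:
```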

Usage:
  docker compose up              # default: llama3
  OLLAMA_MODEL=gemma2 docker compose up  # any Ollama model
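
A multi-stage build for a SvelteKit app on adapter-node typically looks like the sketch below; stage names and paths are assumptions (adapter-node writes its output to `build/` by default), not the committed Dockerfile.

```dockerfile
# Sketch of the multi-stage build; paths and stage names are assumptions.
FROM node:22-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build            # adapter-node output lands in build/

FROM node:22-alpine
WORKDIR /app
COPY --from=build /app/build ./build
COPY --from=build /app/package*.json ./
RUN npm ci --omit=dev        # runtime deps only, keeps the image lean
EXPOSE 3000
CMD ["node", "build"]
```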
Commit: 792fafc661
Parent: 11bb42240a
Date:   2026-04-13 00:22:19 -04:00
10 changed files with 1191 additions and 422 deletions

.env.docker (new file)

@@ -0,0 +1,3 @@
# Model to use with Ollama — change this to any Ollama-compatible model
# Examples: llama3, llama3.1, llama3.2, gemma2, mistral, phi3, qwen2, codellama
OLLAMA_MODEL=llama3