feat: containerize with Docker Compose + Ollama
Add full Docker setup so the app runs with a single 'docker compose up':
- Dockerfile: multi-stage build (node:22-alpine) for the SvelteKit app
- docker-compose.yml: three services:
1. ollama: runs Ollama server with persistent volume for models
2. model-init: one-shot container that pulls the configured model
after Ollama is healthy, then exits
3. app: the SvelteKit app, starts only after model-init succeeds
- .env.docker: set OLLAMA_MODEL to control which model is pulled
- .dockerignore: keeps image lean
- Switched adapter-auto to adapter-node (required for Docker/Node hosting)
- Updated README with Docker and local dev instructions
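The three-service layout above can be sketched as a compose file. This is a hedged sketch, not the repo's actual docker-compose.yml: service names match the description, but the healthcheck command, ports, and volume name are assumptions.

```yaml
# Sketch of the service wiring described above (names/ports are assumptions)
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama-models:/root/.ollama        # persistent volume for pulled models
    healthcheck:
      test: ["CMD", "ollama", "list"]      # healthy once the server responds
      interval: 5s
      retries: 10

  model-init:                              # one-shot: pulls the model, then exits
    image: ollama/ollama
    environment:
      - OLLAMA_HOST=http://ollama:11434    # point the CLI at the server container
    entrypoint: ["ollama", "pull"]
    command: ["${OLLAMA_MODEL:-llama3}"]
    depends_on:
      ollama:
        condition: service_healthy

  app:                                     # SvelteKit app, waits for the pull
    build: .
    ports:
      - "3000:3000"
    depends_on:
      model-init:
        condition: service_completed_successfully

volumes:
  ollama-models:
```

The key mechanism is `depends_on` with `condition: service_completed_successfully`, which makes `app` start only after the one-shot `model-init` container exits with status 0.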
Usage:
docker compose up # default: llama3
OLLAMA_MODEL=gemma2 docker compose up # any Ollama model
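The adapter swap mentioned above amounts to a one-line change in the SvelteKit config. A minimal sketch, assuming default adapter options (the repo's actual options may differ):

```javascript
// svelte.config.js — adapter-node replaces adapter-auto so the build
// produces a standalone Node server that can run inside the container
import adapter from '@sveltejs/adapter-node';

export default {
  kit: {
    adapter: adapter()
  }
};
```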
 .env.docker | 3 +++ (new file)
@@ -0,0 +1,3 @@
+# Model to use with Ollama — change this to any Ollama-compatible model
+# Examples: llama3, llama3.1, llama3.2, gemma2, mistral, phi3, qwen2, codellama
+OLLAMA_MODEL=llama3