feat: containerize with Docker Compose + Ollama

Add full Docker setup so the app runs with a single `docker compose up`:

- Dockerfile: multi-stage build (node:22-alpine) for the SvelteKit app
- docker-compose.yml: three services:
  1. ollama: runs the Ollama server with a persistent volume for models
  2. model-init: one-shot container that pulls the configured model
     after Ollama is healthy, then exits
  3. app: the SvelteKit app, starts only after model-init succeeds
- .env.docker: set OLLAMA_MODEL to control which model is pulled
- .dockerignore: keeps the image lean
- Switched adapter-auto to adapter-node (required for Docker/Node hosting)
- Updated README with Docker and local dev instructions

Usage:

    docker compose up                      # default: llama3
    OLLAMA_MODEL=gemma2 docker compose up  # any Ollama model
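The three-service wiring described above could be sketched roughly as follows. This is an illustration, not the actual docker-compose.yml from the commit: the image tag, healthcheck command, volume name, and port mapping are assumptions.

```yaml
# Sketch only — service names match the commit message; details are assumed.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama-models:/root/.ollama        # persist pulled models across restarts
    healthcheck:
      test: ["CMD", "ollama", "list"]      # "healthy" once the server answers
      interval: 5s
      retries: 12

  model-init:
    image: ollama/ollama
    depends_on:
      ollama:
        condition: service_healthy         # wait for the server before pulling
    environment:
      - OLLAMA_HOST=http://ollama:11434
    entrypoint: ["ollama", "pull", "${OLLAMA_MODEL:-llama3}"]  # one-shot, then exits

  app:
    build: .
    ports:
      - "3000:3000"
    depends_on:
      model-init:
        condition: service_completed_successfully  # start only after the pull succeeds

volumes:
  ollama-models:
```

The `service_completed_successfully` condition is what gives the "starts only after model-init succeeds" ordering: Compose blocks the `app` container until the one-shot pull container exits with status 0.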
README.md
@@ -1,42 +1,61 @@
-# sv
+# English Style Converter
 
-Everything you need to build a Svelte project, powered by [`sv`](https://github.com/sveltejs/cli).
+A SvelteKit web app that converts English text into various styles and tones using an LLM.
 
-## Creating a project
+## Quick Start (Docker)
 
-If you're seeing this, you've probably already done this step. Congrats!
-
-```sh
-# create a new project
-npx sv create my-app
+```bash
+# Option 1: Use default model (llama3)
+docker compose up
+
+# Option 2: Choose a different model
+OLLAMA_MODEL=gemma2 docker compose up
 ```
 
-To recreate this project with the same configuration:
+- **App:** http://localhost:3000
+- **Ollama API:** http://localhost:11434
 
-```sh
-# recreate this project
-npx sv@0.15.1 create --template minimal --types ts --no-install .
-```
+First startup pulls the model from Ollama, which may take a few minutes depending on model size and your connection. Subsequent starts are instant (the model is cached in a Docker volume).
+
+To change the model later, edit `.env.docker` and run:
+
+```bash
+docker compose down
+docker compose up --build
+```
 
-## Developing
+## Local Development (without Docker)
 
-Once you've created a project and installed dependencies with `npm install` (or `pnpm install` or `yarn`), start a development server:
+Prerequisites: [Ollama](https://ollama.ai) running locally with a model pulled.
 
-```sh
+```bash
+# Install dependencies
+npm install
+
+# Copy env config (defaults to Ollama at localhost:11434)
+cp .env.example .env
+
+# Start dev server
 npm run dev
 
 # or start the server and open the app in a new browser tab
 npm run dev -- --open
 ```
 
-## Building
+## Configuration
 
-To create a production version of your app:
+| Variable | Default | Description |
+|----------|---------|-------------|
+| `OPENAI_BASE_URL` | `http://localhost:11434/v1` | OpenAI-compatible API endpoint |
+| `OPENAI_API_KEY` | `ollama` | API key (use `ollama` for local Ollama) |
+| `OPENAI_MODEL` | `llama3` | Model to use |
+| `PORT` | `3000` | App port (Docker/adapter-node only) |
 
-```sh
-npm run build
-```
+For Docker, set `OLLAMA_MODEL` in `.env.docker`: it controls both the model Ollama pulls and the model the app requests.
 
-You can preview the production build with `npm run preview`.
+## Styles
 
-> To deploy your app, you may need to install an [adapter](https://svelte.dev/docs/kit/adapters) for your target environment.
+6 categories, 25 styles: Sarcastic, Formal, British (Polite, Formal, Witty, Gentlemanly, Upper Class, Royal, Victorian, Downton Abbey), American (New Yorker, AAVE, Southern, Redneck), Pirate, Shakespearean, Gen Z, Game of Thrones (King's Landing, Wildlings, Winterfell), and Newspeak (Orwellian).
+
+## Testing
+
+```bash
+npm test
+```
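The Configuration table documents fallback defaults for each variable. The defaulting behavior can be sketched in shell (variable names come from the table; the `unset` and `echo` are just for the demonstration, not part of the app):

```shell
# Illustrative only: apply the documented defaults when the variables are unset.
unset OPENAI_BASE_URL OPENAI_API_KEY OPENAI_MODEL PORT  # clean slate for the demo

OPENAI_BASE_URL="${OPENAI_BASE_URL:-http://localhost:11434/v1}"
OPENAI_API_KEY="${OPENAI_API_KEY:-ollama}"
OPENAI_MODEL="${OPENAI_MODEL:-llama3}"
PORT="${PORT:-3000}"

echo "$OPENAI_BASE_URL $OPENAI_API_KEY $OPENAI_MODEL $PORT"
# → http://localhost:11434/v1 ollama llama3 3000
```

Note that `OPENAI_API_KEY=ollama` is a placeholder: a local Ollama server does not validate the key, but OpenAI-compatible clients require one to be set.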