Santhosh Janardhanan 11bb42240a feat: show model name below conversion result
Add muted 'Responded by {model}' line below the output text so the
user knows which LLM produced the result. The model name comes from
the server-side LLM config (OPENAI_MODEL env var, default: llama3)
and is passed through the API response.
2026-04-13 00:05:46 -04:00

sv

Everything you need to build a Svelte project, powered by sv.

Creating a project

If you're seeing this, you've probably already done this step. Congrats!

# create a new project
npx sv create my-app

To recreate this project with the same configuration:

# recreate this project
npx sv@0.15.1 create --template minimal --types ts --no-install .

Developing

Once you've created a project and installed dependencies with npm install (or pnpm install or yarn), start a development server:

npm run dev

# or start the server and open the app in a new browser tab
npm run dev -- --open

Building

To create a production version of your app:

npm run build

You can preview the production build with npm run preview.
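As a quick sketch, the preview server just serves the output of npm run build locally for a sanity check (it is not meant as a production server; the port flag below is a standard Vite option):

```shell
# build first, then serve the production output locally
npm run build
npm run preview

# or pass a specific port through to Vite
npm run preview -- --port 4173
```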

To deploy your app, you may need to install an adapter for your target environment.
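For example, deploying to a standalone Node server is usually done with the official Node adapter. The sketch below assumes @sveltejs/adapter-node; other adapters (static, Vercel, Cloudflare, etc.) follow the same install-and-configure pattern:

```shell
# install the Node adapter as a dev dependency
npm i -D @sveltejs/adapter-node

# then point svelte.config.js at it, roughly:
#   import adapter from '@sveltejs/adapter-node';
#   export default { kit: { adapter: adapter() } };
```

After rebuilding, the adapter emits a self-contained Node app (by default in a build/ directory) that can be started with node.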
