v0.2.0 – Latest Release
Llama Recipe Manager

Your llama-server configurations, saved as named recipes. Switch models and GPU layouts in one click, with no flag juggling.

Rust · Tauri 2 · Svelte 5 · SQLite
🦙 Mistral 7B
8k context · 4-bit
🦙 Llama 3.3 70B
128k context · Q4
🦙 CodeLlama 13B
16k context · 8-bit
🦙 Gemma 2 9B
8k context · 4-bit

Mistral 7B

▶ Run
--model ./models/mistral-7b.Q4_K_M.gguf --ctx-size 8192 --gpu-layers 35 --mlock --threads 8 --batch-size 512
GPU 72%
Memory 5.2 GB
Speed 42 tok/s
Temperature 0.7
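Under the hood, a recipe like the one above is just structured data that gets rendered back into llama-server flags. A minimal sketch in Rust of how that rendering could work (the `Recipe` struct and field names are hypothetical, not the app's actual API):

```rust
// Hypothetical sketch: struct and field names are illustrative only,
// not the app's actual internal types.
struct Recipe {
    model_path: String,
    ctx_size: u32,
    gpu_layers: u32,
    mlock: bool,
    threads: u32,
    batch_size: u32,
}

impl Recipe {
    /// Render the recipe as llama-server command-line arguments.
    fn to_args(&self) -> Vec<String> {
        let mut args = vec![
            "--model".into(), self.model_path.clone(),
            "--ctx-size".into(), self.ctx_size.to_string(),
            "--gpu-layers".into(), self.gpu_layers.to_string(),
            "--threads".into(), self.threads.to_string(),
            "--batch-size".into(), self.batch_size.to_string(),
        ];
        if self.mlock {
            args.push("--mlock".into());
        }
        args
    }
}

fn main() {
    let mistral = Recipe {
        model_path: "./models/mistral-7b.Q4_K_M.gguf".into(),
        ctx_size: 8192,
        gpu_layers: 35,
        mlock: true,
        threads: 8,
        batch_size: 512,
    };
    // Joined, this reproduces the flag string shown in the recipe card
    // (modulo where --mlock lands in the ordering).
    println!("{}", mistral.to_args().join(" "));
}
```

One-click switching then amounts to stopping the running server and respawning it with a different recipe's argument vector.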

Why this exists

Stop memorizing flags. Start managing recipes.

📦

Named Recipes

Save each llama-server invocation as a named recipe. Switch between Mistral, Llama, and CodeLlama setups with a single click.

🔒

Your Data Stays Local

No account. No telemetry. No network calls. Every recipe lives in a SQLite database on your machine: yours, and yours alone.
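The on-disk format can be as simple as a single table. This schema is a hypothetical sketch (table and column names are illustrative, not the app's actual layout):

```sql
-- Hypothetical sketch: table and column names are illustrative only.
CREATE TABLE IF NOT EXISTS recipes (
    id         INTEGER PRIMARY KEY,
    name       TEXT NOT NULL UNIQUE,  -- e.g. "Mistral 7B"
    model_path TEXT NOT NULL,         -- path to the .gguf file
    args       TEXT NOT NULL,         -- remaining llama-server flags
    created_at TEXT NOT NULL DEFAULT (datetime('now'))
);
```

Everything needed to relaunch a server fits in one row, which is what makes switching cheap and the data trivially portable.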

🛡️

Safety by Default

A flag deny-list blocks arguments that conflict with app-managed settings. Model paths are confined to your configured directory.
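Both checks are straightforward to sketch. A minimal Rust illustration of the idea (the specific denied flags and function names here are assumptions, not the app's real deny-list):

```rust
// Hypothetical sketch: the denied flags and function names are
// illustrative of the technique, not the app's actual implementation.
use std::path::Path;

/// Flags the app manages itself; user-supplied extras may not override them.
const DENIED_FLAGS: &[&str] = &["--model", "--ctx-size", "--gpu-layers"];

/// Reject any extra argument that collides with an app-managed flag,
/// matching both `--flag value` and `--flag=value` forms.
fn validate_extra_args(args: &[&str]) -> Result<(), String> {
    for arg in args {
        let name = arg.split('=').next().unwrap();
        if DENIED_FLAGS.contains(&name) {
            return Err(format!("flag {name} is managed by the app"));
        }
    }
    Ok(())
}

/// Reject model paths that resolve outside the configured models directory.
/// Canonicalizing first defeats `../` traversal and symlink escapes.
fn confine_model_path(models_dir: &Path, candidate: &Path) -> Result<(), String> {
    let dir = models_dir.canonicalize().map_err(|e| e.to_string())?;
    let path = candidate.canonicalize().map_err(|e| e.to_string())?;
    if path.starts_with(&dir) {
        Ok(())
    } else {
        Err("model path outside configured directory".into())
    }
}

fn main() {
    assert!(validate_extra_args(&["--temp", "0.7"]).is_ok());
    assert!(validate_extra_args(&["--model=./evil.gguf"]).is_err());
    let _ = confine_model_path; // exercised against real paths at runtime
}
```

Canonicalizing before comparing prefixes is the key design choice: a plain string prefix check would still let `models/../secrets/key.gguf` through.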

๐ŸŽ

macOS

.dmg .tar.gz
๐Ÿง

Linux

.deb .rpm .AppImage
🪟

Windows

.msi .exe