llama.cpp/tools/server/webui/src/lib/stores
Latest commit: e9f9483464 by Aleksander Grygier, 2025-12-03 20:49:09 +01:00

Use OpenAI-compatible `/v1/models` endpoint by default (#17689)

* refactor: Data fetching via stores
* chore: update webui build output
* refactor: Use OpenAI compat `/v1/models` endpoint by default to list models
* chore: update webui build output
* chore: update webui build output
| File | Last commit | Date |
|------|-------------|------|
| chat.svelte.ts | Add context info to server error (#17663) | 2025-12-02 09:20:57 +01:00 |
| conversations.svelte.ts | Use OpenAI-compatible `/v1/models` endpoint by default (#17689) | 2025-12-03 20:49:09 +01:00 |
| models.svelte.ts | server: introduce API for serving / loading / unloading multiple models (#17470) | 2025-12-01 19:41:04 +01:00 |
| persisted.svelte.ts | webui: introduce OpenAI-compatible model selector in JSON payload (#16562) | 2025-10-22 16:58:23 +02:00 |
| server.svelte.ts | server: introduce API for serving / loading / unloading multiple models (#17470) | 2025-12-01 19:41:04 +01:00 |
| settings.svelte.ts | Use OpenAI-compatible `/v1/models` endpoint by default (#17689) | 2025-12-03 20:49:09 +01:00 |
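
As a rough illustration of the default model-listing path referenced by the latest commit, here is a minimal TypeScript sketch of querying the OpenAI-compatible `/v1/models` endpoint. The `ModelEntry` shape and `listModels` helper are assumptions made for this example, not the actual code in these stores.

```ts
// Hypothetical sketch of listing models via the OpenAI-compatible
// /v1/models endpoint; the types and helper below are illustrative only.

interface ModelEntry {
  id: string;        // model identifier reported by the server
  object: string;    // "model" in the OpenAI-compatible schema
  created?: number;  // unix timestamp, when provided
  owned_by?: string;
}

interface ModelListResponse {
  object: string;    // "list"
  data: ModelEntry[];
}

// Fetch the available models from an OpenAI-compatible server.
export async function listModels(baseUrl = ''): Promise<ModelEntry[]> {
  const res = await fetch(`${baseUrl}/v1/models`);
  if (!res.ok) {
    throw new Error(`Failed to list models: ${res.status} ${res.statusText}`);
  }
  const body = (await res.json()) as ModelListResponse;
  return body.data;
}

// Example usage, e.g. from a store initializer:
// const models = await listModels();
// console.log(models.map((m) => m.id));
```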