llama.cpp/tools/server/webui/src/lib/services
Latest commit: ebfe545cf9 by Daniel Bevenius ("Merge remote-tracking branch 'upstream/master' into backend-sampling", 2025-12-30 07:59:02 +01:00)
File                      Last commit message                                                                        Date
chat.ts                   Merge remote-tracking branch 'upstream/master' into backend-sampling                       2025-12-30 07:59:02 +01:00
database.ts               webui: Per-conversation system message with UI displaying, edition & branching (#17275)    2025-12-06 13:19:05 +01:00
index.ts                  server: introduce API for serving / loading / unloading multiple models (#17470)           2025-12-01 19:41:04 +01:00
models.ts                 Use OpenAI-compatible `/v1/models` endpoint by default (#17689)                            2025-12-03 20:49:09 +01:00
parameter-sync.spec.ts    server: (webui) add --webui-config (#18028)                                                2025-12-17 21:45:45 +01:00
parameter-sync.ts         server: (webui) add --webui-config (#18028)                                                2025-12-17 21:45:45 +01:00
props.ts                  server: introduce API for serving / loading / unloading multiple models (#17470)           2025-12-01 19:41:04 +01:00