llama.cpp/tools/server/webui/src/lib/services
Latest commit: 4ef9301e4d by JvM (2026-04-09 12:26:27 +02:00)
webui: add "Send message on Enter" setting (#21577)

* webui: make Enter to send chat a setting
* Shorten description
* Use isMobile hook from $lib/hooks
* Rebuild static output
chat.service.ts — webui: Add option to pre-encode conversation for faster next turns (#21034) — 2026-04-09 09:10:18 +02:00
database.service.ts — webui: Conversation forking + branching improvements (#21021) — 2026-03-28 13:38:15 +01:00
index.ts — webui: Agentic Loop + MCP Client with support for Tools, Resources and Prompts (#18655) — 2026-03-06 10:00:39 +01:00
mcp.service.ts — fix: include API key in CORS proxy requests for MCP connections (#21193) — 2026-03-31 10:52:34 +02:00
models.service.ts — webui: Improve model parsing logic + add unit tests (#20749) — 2026-03-19 12:25:50 +01:00
parameter-sync.service.spec.ts — common/parser: add proper reasoning tag prefill reading (#20424) — 2026-03-19 16:58:21 +01:00
parameter-sync.service.ts — webui: add "Send message on Enter" setting (#21577) — 2026-04-09 12:26:27 +02:00
props.service.ts — webui: Architecture and UI improvements (#19596) — 2026-02-14 09:06:41 +01:00