llama.cpp/tools/server/webui/src/lib

Latest commit: f989a6e39e by Aleksander Grygier, 2026-04-10 11:49:47 +02:00
webui: Static build output improvements (#21667)
* refactor: Build improvements
* chore: Formatting + package lock update
| Name | Last commit message | Last commit date |
|---|---|---|
| actions | webui: Improve Chat Messages initial scroll + auto-scroll logic + add lazy loading with transitions to content blocks (#20999) | 2026-03-27 17:01:36 +01:00 |
| components | webui: Static build output improvements (#21667) | 2026-04-10 11:49:47 +02:00 |
| constants | webui: add "Send message on Enter" setting (#21577) | 2026-04-09 12:26:27 +02:00 |
| contexts | webui: Conversation forking + branching improvements (#21021) | 2026-03-28 13:38:15 +01:00 |
| enums | Add SLEEPING status to the WebUI model selector (#20949) | 2026-03-25 11:02:32 +01:00 |
| hooks | webui: Improve Chat Messages initial scroll + auto-scroll logic + add lazy loading with transitions to content blocks (#20999) | 2026-03-27 17:01:36 +01:00 |
| markdown | Fix rtl text rendering (#21382) | 2026-04-07 11:37:20 +02:00 |
| services | webui: add "Send message on Enter" setting (#21577) | 2026-04-09 12:26:27 +02:00 |
| stores | webui: add "Send message on Enter" setting (#21577) | 2026-04-09 12:26:27 +02:00 |
| types | webui: store reasoning_content so it is sent back in subsequent requests (#21249) | 2026-04-07 13:32:44 +02:00 |
| utils | fix: Detect streaming state in reasoning content blocks (#21549) | 2026-04-07 12:04:41 +02:00 |