llama.cpp/tools/server/webui/src/lib
Latest commit: 3802d3c78f by Hendrik Erz, 2026-01-21 18:46:01 +01:00
fix: Use `tabular-nums` for chat message statistics (#18915)
* fix: Use `tabular-nums` for chat message statistics
* fix: Rebuild WebUI
Directory  | Latest commit                                                                 | Date
components | fix: Use `tabular-nums` for chat message statistics (#18915)                  | 2026-01-21 18:46:01 +01:00
constants  | sampling : add support for backend sampling (#17004)                          | 2026-01-04 22:22:16 +02:00
enums      | webui: display prompt processing stats (#18146)                               | 2025-12-18 17:55:03 +01:00
hooks      | webui: fix prompt progress ETA calculation (#18468)                           | 2025-12-29 21:42:11 +01:00
markdown   | webui: Fix selecting generated output issues during active streaming (#18091) | 2025-12-18 11:13:52 +01:00
services   | sampling : add support for backend sampling (#17004)                          | 2026-01-04 22:22:16 +02:00
stores     | sampling : add support for backend sampling (#17004)                          | 2026-01-04 22:22:16 +02:00
types      | sampling : add support for backend sampling (#17004)                          | 2026-01-04 22:22:16 +02:00
utils      | Webui/file upload (#18694)                                                    | 2026-01-09 16:45:32 +01:00