llama.cpp/tools/server/webui/src/lib/enums
Latest commit: f9ec8858ed by Pascal, 2025-12-18 17:55:03 +01:00

webui: display prompt processing stats (#18146)

* webui: display prompt processing stats
* feat: Improve UI of Chat Message Statistics
* chore: update webui build output
* refactor: Post-review improvements
* chore: update webui build output

Co-authored-by: Aleksander Grygier <aleksander.grygier@gmail.com>
Name | Last commit | Last commit date
attachment.ts | server: introduce API for serving / loading / unloading multiple models (#17470) | 2025-12-01 19:41:04 +01:00
chat.ts | webui: display prompt processing stats (#18146) | 2025-12-18 17:55:03 +01:00
files.ts | Add a couple of file types to the text section (#17670) | 2025-12-03 21:45:06 +01:00
index.ts | webui: display prompt processing stats (#18146) | 2025-12-18 17:55:03 +01:00
model.ts | server: introduce API for serving / loading / unloading multiple models (#17470) | 2025-12-01 19:41:04 +01:00
server.ts | server: introduce API for serving / loading / unloading multiple models (#17470) | 2025-12-01 19:41:04 +01:00
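Judging by the file names, this directory groups the webui's TypeScript enums by domain (attachment, chat, files, model, server), with index.ts likely serving as a barrel that re-exports them. The sketch below is a hypothetical illustration of that layout only; the enum and member names are invented and are not taken from the actual files in this listing.

```ts
// Hypothetical sketch: real enum names in chat.ts are not shown in this
// listing. This only illustrates the per-domain module + barrel layout.

// --- chat.ts (sketch): a per-domain module exporting chat-related enums ---
export enum ChatMessageRole {
  System = 'system',
  User = 'user',
  Assistant = 'assistant',
}

// --- index.ts (sketch): barrel re-exporting each domain module so callers ---
// --- can import from the enums directory instead of individual files      ---
export * from './attachment';
export * from './chat';
export * from './files';
export * from './model';
export * from './server';
```

A barrel like this keeps import paths stable for the rest of the webui: adding or renaming a domain module only changes the barrel, not every consumer.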