llama.cpp/tools/server/webui/src/lib
Leszek Hanusz ad3b8df38f Remove currentConfig.model 2026-02-04 02:03:59 +01:00
components   Remove inputContent var and use tokenize only when needed                       2026-02-04 01:23:24 +01:00
constants    Running npm run format                                                          2026-02-03 02:27:10 +01:00
enums        webui: display prompt processing stats (#18146)                                 2025-12-18 17:55:03 +01:00
hooks        webui: fix prompt progress ETA calculation (#18468)                             2025-12-29 21:42:11 +01:00
markdown     webui: Fix selecting generated output issues during active streaming (#18091)   2025-12-18 11:13:52 +01:00
services     Fix tokenize with router on                                                     2026-02-04 00:21:56 +01:00
stores       Remove currentConfig.model                                                      2026-02-04 02:03:59 +01:00
types        Fix tokenize with router on                                                     2026-02-04 00:21:56 +01:00
utils        Webui/file upload (#18694)                                                      2026-01-09 16:45:32 +01:00