llama.cpp/tools/server/webui/src

Latest commit 1faa13a118 by Pascal (2025-10-09 22:54:57 +02:00):
* webui: updated the chat service to only include max_tokens in the request payload when the setting is explicitly provided, while still mapping explicit zero or null values to the infinite-token sentinel (#16489)
* chore: update webui build output
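The commit message describes conditional payload construction: `max_tokens` is omitted entirely when the user never set it, while an explicit zero or null is translated to an "infinite" sentinel. A minimal sketch of that logic, assuming hypothetical names (`buildPayload`, `ChatSettings`, `INFINITE_TOKENS`) and a sentinel value of -1, which are illustrative and not the actual webui code:

```typescript
// Assumed sentinel for "no token limit"; the real webui may use a different value.
const INFINITE_TOKENS = -1;

interface ChatSettings {
  // undefined = the user never touched the setting;
  // 0 or null = the user explicitly asked for "no limit".
  max_tokens?: number | null;
}

// Hypothetical helper: include max_tokens only when explicitly provided,
// mapping explicit zero/null to the infinite-token sentinel.
function buildPayload(settings: ChatSettings): { max_tokens?: number } {
  const payload: { max_tokens?: number } = {};
  if (settings.max_tokens !== undefined) {
    payload.max_tokens =
      settings.max_tokens === null || settings.max_tokens === 0
        ? INFINITE_TOKENS
        : settings.max_tokens;
  }
  return payload;
}
```

The key design point from the commit: an absent setting produces no `max_tokens` field at all (letting the server apply its own default), which is distinct from an explicit request for unlimited generation.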
lib            webui: updated the chat service to only include max_tokens in the req… (#16489)         2025-10-09 22:54:57 +02:00
routes         fix: track viewportHeight via window.innerHeight to avoid unwanted scrolling (#16356)  2025-10-03 08:01:31 +02:00
stories        refactor: centralize CoT parsing in backend for streaming mode (#16394)                 2025-10-08 23:18:41 +03:00
app.css        Improve code block color theming (#16325)                                               2025-10-01 15:54:42 +02:00
app.d.ts       SvelteKit-based WebUI (#14839)                                                          2025-09-17 19:29:13 +02:00
app.html       SvelteKit-based WebUI (#14839)                                                          2025-09-17 19:29:13 +02:00
demo.spec.ts   SvelteKit-based WebUI (#14839)                                                          2025-09-17 19:29:13 +02:00