llama.cpp/tools/server/webui/tests/stories

Latest commit: 4c61875bf8 "webui: Add switcher to Chat Message UI to show raw LLM output (#19571)" by Aleksander Grygier, 2026-02-12 19:55:51 +01:00
Name                            Last commit message                                                               Last commit date
fixtures/                       server: introduce API for serving / loading / unloading multiple models (#17470)  2025-12-01 19:41:04 +01:00
ChatForm.stories.svelte         Webui/file upload (#18694)                                                        2026-01-09 16:45:32 +01:00
ChatMessage.stories.svelte      webui: Add switcher to Chat Message UI to show raw LLM output (#19571)            2026-02-12 19:55:51 +01:00
ChatSettings.stories.svelte     server: introduce API for serving / loading / unloading multiple models (#17470)  2025-12-01 19:41:04 +01:00
ChatSidebar.stories.svelte      server: introduce API for serving / loading / unloading multiple models (#17470)  2025-12-01 19:41:04 +01:00
Introduction.mdx                server: introduce API for serving / loading / unloading multiple models (#17470)  2025-12-01 19:41:04 +01:00
MarkdownContent.stories.svelte  server: introduce API for serving / loading / unloading multiple models (#17470)  2025-12-01 19:41:04 +01:00
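The `*.stories.svelte` files above are Storybook stories for the server webui components (chat form, chat message, settings, sidebar, markdown rendering). As a rough illustration of the file shape only, a minimal Svelte CSF story might look like the sketch below; the import path, story name, and args are hypothetical and the actual stories in this directory may use a different addon version and different component props.

```svelte
<!-- Hypothetical sketch of a Svelte CSF story, assuming @storybook/addon-svelte-csf. -->
<!-- The component path and args are illustrative, not the real ChatMessage props.    -->
<script module>
  import { defineMeta } from '@storybook/addon-svelte-csf';
  import ChatMessage from '$lib/components/ChatMessage.svelte'; // hypothetical path

  // Register the component with Storybook and obtain the <Story> helper.
  const { Story } = defineMeta({
    title: 'Chat/ChatMessage',
    component: ChatMessage,
  });
</script>

<!-- One named story per rendered state; args below are example props only. -->
<Story
  name="Assistant reply"
  args={{ message: { role: 'assistant', content: 'Hello from the model!' } }}
/>
```

The sketch only shows the general structure of a story file; how these stories are built and run is defined by the Storybook configuration that lives elsewhere under tools/server/webui.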