happyz/llama.cpp (mirror of https://github.com/ggerganov/llama.cpp.git)
llama.cpp/tools/server/webui/src/lib at commit 710a2835e0
Latest commit: 482192f12d by Aldehir Rojas, "webui : store reasoning_content so it is sent back in subsequent requests" (#21249), 2026-04-07 13:32:44 +02:00
Directory | Last commit | Date
actions | webui: Improve Chat Messages initial scroll + auto-scroll logic + add lazy loading with transitions to content blocks (#20999) | 2026-03-27 17:01:36 +01:00
components | fix: Detect streaming state in reasoning content blocks (#21549) | 2026-04-07 12:04:41 +02:00
constants | server/webui: cleanup dual representation approach, simplify to openai-compat (#21090) | 2026-03-31 10:42:06 +02:00
contexts | webui: Conversation forking + branching improvements (#21021) | 2026-03-28 13:38:15 +01:00
enums | Add SLEEPING status to the WebUI model selector (#20949) | 2026-03-25 11:02:32 +01:00
hooks | webui: Improve Chat Messages initial scroll + auto-scroll logic + add lazy loading with transitions to content blocks (#20999) | 2026-03-27 17:01:36 +01:00
markdown | Fix rtl text rendering (#21382) | 2026-04-07 11:37:20 +02:00
services | fix: include API key in CORS proxy requests for MCP connections (#21193) | 2026-03-31 10:52:34 +02:00
stores | webui : store reasoning_content so it is sent back in subsequent requests (#21249) | 2026-04-07 13:32:44 +02:00
types | webui : store reasoning_content so it is sent back in subsequent requests (#21249) | 2026-04-07 13:32:44 +02:00
utils | fix: Detect streaming state in reasoning content blocks (#21549) | 2026-04-07 12:04:41 +02:00