llama.cpp/models
Olivier Chafik 669912d9a5
`tool-call`: fix Qwen 2.5 Coder support, add micro benchmarks, support trigger patterns for lazy grammars (#12034)
* sampler: turn lazy grammar trigger words into regexes
* add scripts/tool_bench.sh & .py
* constrain llama json output regardless of function name if it matches at the beginning
* update relaxed newline space rule in grammar tests
* support add_generation_prompt query parameter (useful for /apply_template)
* Update src/llama-grammar.cpp

Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>
2025-03-05 13:05:13 +00:00
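
For context, a minimal sketch of how the add_generation_prompt query parameter mentioned in the commit above might be exercised against a locally running llama-server. The endpoint spelling, port, payload shape, and response handling are assumptions drawn from the commit message, not verified against the server code.

```python
# Hypothetical usage sketch: render a chat template via the server's
# /apply_template endpoint, passing add_generation_prompt as a query
# parameter (parameter name taken from the commit message above).
import requests

BASE_URL = "http://localhost:8080"  # assumed default llama-server address

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
]

# add_generation_prompt=true asks the server to append the assistant-turn
# prefix when rendering the chat template.
resp = requests.post(
    f"{BASE_URL}/apply_template",          # endpoint path is an assumption
    params={"add_generation_prompt": "true"},
    json={"messages": messages},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```
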
templates `tool-call`: fix Qwen 2.5 Coder support, add micro benchmarks, support trigger patterns for lazy grammars (#12034) 2025-03-05 13:05:13 +00:00
.editorconfig
ggml-vocab-aquila.gguf
ggml-vocab-baichuan.gguf
ggml-vocab-bert-bge.gguf
ggml-vocab-bert-bge.gguf.inp
ggml-vocab-bert-bge.gguf.out
ggml-vocab-chameleon.gguf.inp
ggml-vocab-chameleon.gguf.out
ggml-vocab-command-r.gguf
ggml-vocab-command-r.gguf.inp
ggml-vocab-command-r.gguf.out
ggml-vocab-deepseek-coder.gguf
ggml-vocab-deepseek-coder.gguf.inp
ggml-vocab-deepseek-coder.gguf.out
ggml-vocab-deepseek-llm.gguf
ggml-vocab-deepseek-llm.gguf.inp
ggml-vocab-deepseek-llm.gguf.out
ggml-vocab-deepseek-r1-qwen.gguf.inp llama : add support for Deepseek-R1-Qwen distill model (#11310) 2025-01-20 14:35:07 +01:00
ggml-vocab-deepseek-r1-qwen.gguf.out llama : add support for Deepseek-R1-Qwen distill model (#11310) 2025-01-20 14:35:07 +01:00
ggml-vocab-falcon.gguf
ggml-vocab-falcon.gguf.inp
ggml-vocab-falcon.gguf.out
ggml-vocab-gpt-2.gguf
ggml-vocab-gpt-2.gguf.inp
ggml-vocab-gpt-2.gguf.out
ggml-vocab-gpt-4o.gguf.inp llama : add Phi-4-mini support (supersede #12099) (#12108) 2025-02-28 12:44:11 +01:00
ggml-vocab-gpt-4o.gguf.out llama : add Phi-4-mini support (supersede #12099) (#12108) 2025-02-28 12:44:11 +01:00
ggml-vocab-gpt-neox.gguf
ggml-vocab-llama-bpe.gguf
ggml-vocab-llama-bpe.gguf.inp
ggml-vocab-llama-bpe.gguf.out
ggml-vocab-llama-spm.gguf
ggml-vocab-llama-spm.gguf.inp
ggml-vocab-llama-spm.gguf.out
ggml-vocab-mpt.gguf
ggml-vocab-mpt.gguf.inp
ggml-vocab-mpt.gguf.out
ggml-vocab-phi-3.gguf
ggml-vocab-phi-3.gguf.inp
ggml-vocab-phi-3.gguf.out
ggml-vocab-qwen2.gguf
ggml-vocab-qwen2.gguf.inp
ggml-vocab-qwen2.gguf.out
ggml-vocab-refact.gguf
ggml-vocab-refact.gguf.inp
ggml-vocab-refact.gguf.out
ggml-vocab-roberta-bpe.gguf.inp
ggml-vocab-roberta-bpe.gguf.out
ggml-vocab-starcoder.gguf
ggml-vocab-starcoder.gguf.inp
ggml-vocab-starcoder.gguf.out