llama.cpp/requirements
Latest commit 3228e77287 by Alex Trotta: gguf-py : bump sentencepiece version (#19319)
* gguf-py: Bump sentencepiece version

A newer version has been available for a while that addresses the issues mentioned in https://github.com/ggml-org/llama.cpp/pull/14200. There is a long chain of reasons I would like this change, but the short version is that it lets people who use both `sentencepiece` and `gguf` take advantage of these fixes. On conda-forge, the version is currently locked, since conda has no notion of optional dependencies.
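To illustrate the "optional dependencies" point: with pip packaging, a dependency like `sentencepiece` can be declared as an extra that users opt into, whereas a conda-forge recipe has no equivalent and must pin it unconditionally. The sketch below is a hypothetical `pyproject.toml` fragment, not the actual `gguf` packaging; the package metadata and version bound are assumptions for illustration only.

```toml
# Hypothetical pyproject.toml fragment (not the real gguf package metadata).
[project]
name = "gguf-example"
version = "0.1.0"

[project.optional-dependencies]
# Installed only via `pip install "gguf-example[sentencepiece]"`;
# the version bound here is illustrative, not the one from the PR.
sentencepiece = ["sentencepiece>=0.2.0"]
```

Because conda recipes lack extras, the conda-forge package has to list `sentencepiece` as a hard run dependency, which is why its version pin there affects every user of `gguf`.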

Regardless, I don't think this should be too controversial.

* review feedback
2026-02-06 21:05:19 +01:00
| File | Last commit | Date |
|---|---|---|
| requirements-all.txt | model-conversion : add support for SentenceTransformers (#16387) | 2025-10-09 14:35:22 +02:00 |
| requirements-compare-llama-bench.txt | | |
| requirements-convert_hf_to_gguf.txt | convert : Make mistral-common dependency optional (#16738) | 2025-10-23 15:54:46 +02:00 |
| requirements-convert_hf_to_gguf_update.txt | | |
| requirements-convert_legacy_llama.txt | gguf-py : bump sentencepiece version (#19319) | 2026-02-06 21:05:19 +01:00 |
| requirements-convert_llama_ggml_to_gguf.txt | | |
| requirements-convert_lora_to_gguf.txt | | |
| requirements-gguf_editor_gui.txt | | |
| requirements-pydantic.txt | | |
| requirements-server-bench.txt | | |
| requirements-test-tokenizer-random.txt | | |
| requirements-tool_bench.txt | server: /v1/responses (partial) (#18486) | 2026-01-21 17:47:23 +01:00 |