* gguf-py: Bump sentencepiece version

  There's a new version that has been out for a while and addresses the issues mentioned in https://github.com/ggml-org/llama.cpp/pull/14200. There's a long chain of reasons I would like this change, but the short version is that it allows people who use both `sentencepiece` and `gguf` to take advantage of those fixes. On conda-forge, the current pin locks the `sentencepiece` version (since conda-forge has no notion of optional dependencies). Regardless, I don't think this should be too controversial.

* review feedback
| requirements/ |
|---|
| requirements-all.txt |
| requirements-compare-llama-bench.txt |
| requirements-convert_hf_to_gguf.txt |
| requirements-convert_hf_to_gguf_update.txt |
| requirements-convert_legacy_llama.txt |
| requirements-convert_llama_ggml_to_gguf.txt |
| requirements-convert_lora_to_gguf.txt |
| requirements-gguf_editor_gui.txt |
| requirements-pydantic.txt |
| requirements-server-bench.txt |
| requirements-test-tokenizer-random.txt |
| requirements-tool_bench.txt |
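As a rough illustration of what loosening the pin means in files like these, a widened requirement could look like the sketch below. The bounds shown are hypothetical and are not taken from this change:

```
# Hypothetical example only: allow newer sentencepiece releases that include
# the fixes referenced above, instead of locking to a single older version.
sentencepiece>=0.2.0,<0.3.0
```

A range like this is what lets downstream packagers such as conda-forge, which resolve `gguf` and `sentencepiece` together with no optional-dependency mechanism, pick up a newer `sentencepiece` rather than being held back by a tight pin.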