llama.cpp/requirements
Latest commit: 8fc85db9d2 by Sigbjørn Skjæret, 2026-03-25 10:55:37 +02:00
ci : limit requirements versions (#20980)
* set requests version
* limit versions outside requirements
| File | Last commit | Date |
| --- | --- | --- |
| requirements-all.txt | model-conversion : add support for SentenceTransformers (#16387) | 2025-10-09 14:35:22 +02:00 |
| requirements-compare-llama-bench.txt | compare-llama-bench: add option to plot (#14169) | 2025-06-14 10:34:20 +02:00 |
| requirements-convert_hf_to_gguf.txt | convert : Make mistral-common dependency optional (#16738) | 2025-10-23 15:54:46 +02:00 |
| requirements-convert_hf_to_gguf_update.txt | ci : check that pre-tokenizer hashes are up-to-date (#15032) | 2025-08-02 14:39:01 +02:00 |
| requirements-convert_legacy_llama.txt | gguf-py : bump sentencepiece version (#19319) | 2026-02-06 21:05:19 +01:00 |
| requirements-convert_llama_ggml_to_gguf.txt | py : switch to snake_case (#8305) | 2024-07-05 07:53:33 +03:00 |
| requirements-convert_lora_to_gguf.txt | common: Include torch package for s390x (#13699) | 2025-05-22 21:31:29 +03:00 |
| requirements-gguf_editor_gui.txt | gguf-py : add support for sub_type (in arrays) in GGUFWriter add_key_value method (#13561) | 2025-05-29 15:36:05 +02:00 |
| requirements-pydantic.txt | ci : limit requirements versions (#20980) | 2026-03-25 10:55:37 +02:00 |
| requirements-server-bench.txt | scripts: benchmark for HTTP server throughput (#14668) | 2025-07-14 13:14:30 +02:00 |
| requirements-test-tokenizer-random.txt | py : type-check all Python scripts with Pyright (#8341) | 2024-07-07 15:04:39 -04:00 |
| requirements-tool_bench.txt | server: /v1/responses (partial) (#18486) | 2026-01-21 17:47:23 +01:00 |