llama.cpp/awq-py/requirements.txt

torch>=2.0.0
transformers>=4.32.0
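These pins are minimum-version specifiers (`>=`). As an illustration of what they mean, here is a minimal stdlib-only sketch (not part of llama.cpp) that parses such lines and compares an installed version tuple against the declared floor; the `parse_requirement` helper and the sample version numbers are hypothetical:

```python
import re

def parse_requirement(line):
    # Split "name>=X.Y.Z" into the package name and a comparable version tuple.
    m = re.fullmatch(r"([A-Za-z0-9_.-]+)>=([0-9.]+)", line.strip())
    if m is None:
        raise ValueError(f"unsupported requirement line: {line!r}")
    name, version = m.groups()
    return name, tuple(int(part) for part in version.split("."))

# The two lines from this requirements file.
reqs = dict(parse_requirement(r) for r in ["torch>=2.0.0", "transformers>=4.32.0"])

# Tuple comparison is lexicographic, so (2, 1, 0) >= (2, 0, 0) holds,
# meaning a hypothetical installed torch 2.1.0 would satisfy the pin.
installed_torch = (2, 1, 0)
print(installed_torch >= reqs["torch"])
```

In practice one would simply run `pip install -r requirements.txt` rather than check versions by hand; the sketch only shows how the specifiers are read.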