llama.cpp/tests
Last commit: b6b9a8e606 by slaren, 2024-06-23 13:14:45 +02:00
fix CI failures (#8066)
* test-backend-ops : increase cpy max nmse
* server ci : disable thread sanitizer
.gitignore
CMakeLists.txt
get-model.cpp
get-model.h
run-json-schema-to-grammar.mjs
test-autorelease.cpp
test-backend-ops.cpp
test-c.c
test-chat-template.cpp
test-double-float.cpp
test-grad0.cpp
test-grammar-integration.cpp
test-grammar-parser.cpp
test-json-schema-to-grammar.cpp
test-llama-grammar.cpp
test-model-load-cancel.cpp
test-opt.cpp
test-quantize-fns.cpp
test-quantize-perf.cpp
test-rope.cpp
test-sampling.cpp
test-tokenizer-0.cpp
test-tokenizer-0.py
test-tokenizer-0.sh
test-tokenizer-1-bpe.cpp
test-tokenizer-1-spm.cpp
test-tokenizer-random.py