llama.cpp/tools/server/tests/unit
Fredrik Hultin ddf9f94389
server : add Anthropic Messages API support (#17570)
* server : add Anthropic Messages API support

* remove @pytest.mark.slow from tool calling/jinja tests

* server : remove unused code and slow/skip on test_anthropic_vision_base64_with_multimodal_model in test_anthropic_api.py

* server : removed redundant n field logic in anthropic_params_from_json

* server : use a single error object instead of error_array in the streaming response handler for /v1/chat/completions, and use unordered_set instead of set in to_json_anthropic_stream()

* server : refactor Anthropic API to use OAI conversion

* make sure basic tests always go first

* clean up

* clean up api key check, add test

---------

Co-authored-by: Xuan Son Nguyen <son@huggingface.co>
2025-11-28 12:57:04 +01:00
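One of the squashed commits above refactors the Anthropic endpoint to reuse the server's existing OpenAI (OAI) conversion path. The actual conversion lives in the server's C++ code; the Python sketch below is only an illustration of the shape of that mapping, under the documented Anthropic Messages request format. The helper name `anthropic_to_oai` is hypothetical, not a function from the repository.

```python
def anthropic_to_oai(body: dict) -> dict:
    """Illustrative sketch: map an Anthropic Messages request body to an
    OpenAI chat-completions-style body. Not the server's actual code."""
    messages = []
    # Anthropic carries the system prompt as a top-level "system" field;
    # OAI-style APIs expect it as the first message.
    if "system" in body:
        messages.append({"role": "system", "content": body["system"]})
    for m in body["messages"]:
        content = m["content"]
        # Anthropic content may be a list of typed blocks; flatten text blocks.
        if isinstance(content, list):
            content = "".join(b["text"] for b in content if b.get("type") == "text")
        messages.append({"role": m["role"], "content": content})
    return {
        "model": body.get("model"),
        "max_tokens": body.get("max_tokens"),
        "messages": messages,
    }
```

A real conversion also has to handle tool-use blocks, images, and stop sequences; this sketch covers only the text case exercised by the basic tests.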
test_basic.py             server : add Anthropic Messages API support (#17570)            2025-11-28 12:57:04 +01:00
test_chat_completion.py
test_compat_anthropic.py  server : add Anthropic Messages API support (#17570)            2025-11-28 12:57:04 +01:00
test_completion.py        server : handle failures to restore host cache (#17078)         2025-11-09 14:27:05 +02:00
test_ctx_shift.py
test_embedding.py
test_infill.py
test_lora.py
test_rerank.py
test_security.py          server : add Anthropic Messages API support (#17570)            2025-11-28 12:57:04 +01:00
test_slot_save.py
test_speculative.py       kv-cache : pad the cache size to 256 for performance (#17046)   2025-11-07 20:03:25 +02:00
test_template.py
test_tokenize.py
test_tool_call.py
test_vision_api.py
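Among the files touched by #17570, test_compat_anthropic.py covers the streaming path. Anthropic's documented Messages streaming format delivers server-sent events such as content_block_delta carrying text_delta fragments. The sketch below (hypothetical helper `collect_stream_text`, not code from the test suite) shows how a test might reassemble the final text from such a stream.

```python
import json

def collect_stream_text(sse_lines: list[str]) -> str:
    """Illustrative sketch: accumulate text from Anthropic-style
    content_block_delta SSE events. Not the repository's test code."""
    parts = []
    for line in sse_lines:
        # SSE payload lines are prefixed with "data: "; skip event-name lines.
        if not line.startswith("data: "):
            continue
        event = json.loads(line[len("data: "):])
        if event.get("type") == "content_block_delta":
            delta = event.get("delta", {})
            if delta.get("type") == "text_delta":
                parts.append(delta.get("text", ""))
    return "".join(parts)
```

A full client would also track message_start/message_stop and per-block indices; for a single text block, concatenating the deltas in order is enough.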