llama.cpp/ggml
Latest commit a7417f5594: "ggml-cpu: sycl: Re-enable exp f16 (#14462)" by Romain Biessy, 2025-06-30 14:52:02 +02:00
cmake           ggml-cpu : rework weak alias on apple targets (#14146)   2025-06-16 13:54:15 +08:00
include         ggml : implement REGLU/GEGLU/SWIGLU ops (#14158)         2025-06-29 11:04:10 +02:00
src             ggml-cpu: sycl: Re-enable exp f16 (#14462)               2025-06-30 14:52:02 +02:00
.gitignore      vulkan : cmake integration (#8119)                       2024-07-13 18:12:39 +02:00
CMakeLists.txt  ggml-cpu: enable IBM NNPA Vector Intrinsics (#14317)     2025-06-25 23:49:04 +02:00