llama.cpp/ggml
Latest commit ed0aab1ec3 by Akarshan: "SYCL: disable faulty fp16 CPU exponent for now" (2025-06-29 19:10:16 +05:30)
Name            Last commit message                                      Last commit date
cmake           ggml-cpu : rework weak alias on apple targets (#14146)   2025-06-16 13:54:15 +08:00
include         ggml : implement REGLU/GEGLU/SWIGLU ops (#14158)         2025-06-29 11:04:10 +02:00
src             SYCL: disable faulty fp16 CPU exponent for now           2025-06-29 19:10:16 +05:30
.gitignore      vulkan : cmake integration (#8119)                       2024-07-13 18:12:39 +02:00
CMakeLists.txt  ggml-cpu: enable IBM NNPA Vector Intrinsics (#14317)     2025-06-25 23:49:04 +02:00