llama.cpp/ggml
Latest commit: be9a8c73a0 by hongruichen, "fix: suppress warning" (2024-07-26 23:07:25 +08:00)
Name            Last commit message                                     Last commit date
cmake           llama : reorganize source code + improve CMake (#8006)  2024-06-26 18:33:02 +03:00
include         register qnn backend                                    2024-07-17 21:25:55 +08:00
src             fix: suppress warning                                   2024-07-26 23:07:25 +08:00
.gitignore      vulkan : cmake integration (#8119)                      2024-07-13 18:12:39 +02:00
CMakeLists.txt  add build step of QNN backend at ggml                   2024-07-17 19:43:01 +08:00
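The CMakeLists.txt entry ("add build step of QNN backend at ggml") indicates the QNN backend is wired in at configure time like the other optional ggml backends. Below is a minimal sketch of what such an optional build step can look like; the GGML_QNN option name, the QNN_SDK_PATH variable, the GGML_SOURCES list, and the src/ggml-qnn/ layout are assumptions for illustration, not the fork's actual code.

    # Hypothetical sketch of an optional QNN backend build step in ggml's CMakeLists.txt.
    # Option name, SDK variable, and source paths are assumptions, not taken from the fork.
    option(GGML_QNN "ggml: enable the Qualcomm QNN backend" OFF)

    if (GGML_QNN)
        if (NOT DEFINED QNN_SDK_PATH)
            message(FATAL_ERROR "GGML_QNN is ON but QNN_SDK_PATH is not set")
        endif()

        # Collect the backend sources and expose the QNN SDK headers (assumed layout).
        file(GLOB GGML_SOURCES_QNN "src/ggml-qnn/*.cpp")
        list(APPEND GGML_SOURCES ${GGML_SOURCES_QNN})

        add_compile_definitions(GGML_USE_QNN)
        include_directories(${QNN_SDK_PATH}/include/QNN)
    endif()

With an option like this, the backend would only be compiled when configuring with something along the lines of `cmake -DGGML_QNN=ON -DQNN_SDK_PATH=/path/to/qnn-sdk ..`; a default build would leave it out entirely.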