happyz / llama.cpp (mirror of https://github.com/ggerganov/llama.cpp.git)
llama.cpp / ggml / src / ggml-cann (tree at 21e933806f)

Latest commit: 93c7e775b8 by leejet, "add ggml_pad_ext for cpu & cuda backend" (2025-08-30 02:56:56 +08:00)
File           | Last commit message                                             | Date
CMakeLists.txt | CANN: add support for ACL Graph (#15065)                        | 2025-08-06 14:12:42 +08:00
Doxyfile       | CANN: Add the basic supports of Flash Attention kernel (#13627) | 2025-05-26 10:20:18 +08:00
acl_tensor.cpp | CANN: Implement GLU ops (#14884)                                | 2025-07-26 17:56:18 +08:00
acl_tensor.h   | CANN: Add the basic supports of Flash Attention kernel (#13627) | 2025-05-26 10:20:18 +08:00
aclnn_ops.cpp  | add ggml_pad_ext for cpu & cuda backend                         | 2025-08-30 02:56:56 +08:00
aclnn_ops.h    | CANN: Add ggml_set_rows (#14943)                                | 2025-07-29 22:36:43 +08:00
common.h       | kv-cache : remove LLAMA_SET_ROWS checks (#15505)                | 2025-08-28 12:27:02 +03:00
ggml-cann.cpp  | kv-cache : remove LLAMA_SET_ROWS checks (#15505)                | 2025-08-28 12:27:02 +03:00