llama.cpp/ggml/src/ggml-opencl
Junil Kim f423981ac8
opencl : fix memory allocation size (#12649)
issue:
https://github.com/CodeLinaro/llama.cpp/pull/17#issuecomment-2760611283

This patch ensures that the memory allocation size
does not exceed the maximum allocation size of the OpenCL device.
2025-04-01 09:54:34 -07:00
kernels opencl: add multi and vision rope, `gelu_quick` and `im2col` (#12600) 2025-03-27 08:08:08 -07:00
CMakeLists.txt opencl: add multi and vision rope, `gelu_quick` and `im2col` (#12600) 2025-03-27 08:08:08 -07:00
ggml-opencl.cpp opencl : fix memory allocation size (#12649) 2025-04-01 09:54:34 -07:00