llama.cpp/.devops
Latest commit: 4849661d98 by Sigbjørn Skjæret, 2025-12-30 22:28:53 +01:00
docker : add CUDA 13.1 image build (#18441)
* add updated cuda-new.Dockerfile for Ubuntu 24.04 compatibility
* add cuda13 build
Name | Last commit message | Last commit date
nix | Install rpc-server when GGML_RPC is ON. (#17149) | 2025-11-11 10:53:59 +00:00
cann.Dockerfile | CLI: fixed adding cli and completion into docker containers, improved docs (#18003) | 2025-12-16 11:52:23 +01:00
cpu.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
cuda-new.Dockerfile | docker : add CUDA 13.1 image build (#18441) | 2025-12-30 22:28:53 +01:00
cuda.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
intel.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
llama-cli-cann.Dockerfile | CLI: fixed adding cli and completion into docker containers, improved docs (#18003) | 2025-12-16 11:52:23 +01:00
llama-cpp-cuda.srpm.spec | CLI: fixed adding cli and completion into docker containers, improved docs (#18003) | 2025-12-16 11:52:23 +01:00
llama-cpp.srpm.spec | CLI: fixed adding cli and completion into docker containers, improved docs (#18003) | 2025-12-16 11:52:23 +01:00
musa.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
rocm.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
s390x.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
tools.sh | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
vulkan.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00