llama.cpp/.devops
Latest commit: fb644247de by Andrew Aladjev, 2025-12-16 11:52:23 +01:00
CLI: fixed adding cli and completion into docker containers, improved docs (#18003)
Co-authored-by: Andrew Aladjev <andrew.aladjev@gmail.com>
Name | Last commit message | Last commit date
nix | Install rpc-server when GGML_RPC is ON. (#17149) | 2025-11-11 10:53:59 +00:00
cann.Dockerfile | CLI: fixed adding cli and completion into docker containers, improved docs (#18003) | 2025-12-16 11:52:23 +01:00
cpu.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
cuda.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
intel.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
llama-cli-cann.Dockerfile | CLI: fixed adding cli and completion into docker containers, improved docs (#18003) | 2025-12-16 11:52:23 +01:00
llama-cpp-cuda.srpm.spec | CLI: fixed adding cli and completion into docker containers, improved docs (#18003) | 2025-12-16 11:52:23 +01:00
llama-cpp.srpm.spec | CLI: fixed adding cli and completion into docker containers, improved docs (#18003) | 2025-12-16 11:52:23 +01:00
musa.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
rocm.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
s390x.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
tools.sh | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
vulkan.Dockerfile | docker : include legacy llama-completion binary (#17964) | 2025-12-12 19:39:23 +01:00
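The directory contains one Dockerfile per hardware backend (CPU, CUDA, Intel, MUSA, ROCm, s390x, Vulkan, CANN). As a non-authoritative sketch, an image can be built locally by pointing docker at the Dockerfile for the desired backend from the repository root; the image tags below are arbitrary placeholders, and any build stage names inside these Dockerfiles are not assumed here.

```sh
# Sketch: build a CPU-only image from the repository root.
# The tag "local/llama.cpp:cpu" is an arbitrary example, not an official name.
docker build -f .devops/cpu.Dockerfile -t local/llama.cpp:cpu .

# Swap the Dockerfile to build for another backend, e.g. CUDA or Vulkan:
docker build -f .devops/cuda.Dockerfile -t local/llama.cpp:cuda .
docker build -f .devops/vulkan.Dockerfile -t local/llama.cpp:vulkan .
```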