From 1f0d90c3ddd3ab0d9d8912e5a2d090c9a110ca37 Mon Sep 17 00:00:00 2001
From: Francisco Herrera
Date: Tue, 6 Jan 2026 00:04:07 -0500
Subject: [PATCH] Clarify

---
 docs/build.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/build.md b/docs/build.md
index f9c9e4d2b4..4e4f4a9964 100644
--- a/docs/build.md
+++ b/docs/build.md
@@ -349,7 +349,7 @@ You can download it from your Linux distro's package manager or from here: [ROCm
       && cmake --build build -- -j 16
   ```
 
-  If it fails to compile, with a "target not supported" or similar error, it means your GPU does not support ROCm due to missing compiler support, even through it is an RDNA GPU. This can happen if you are trying to use an integrated GPU. In this case, build for Vulkan instead to use the GPU.
+  If llama.cpp fails to compile with a "target not supported" or similar error, it means your GPU does not support ROCm due to missing compiler support, even though it is an RDNA GPU. This can happen if you are trying to use an integrated GPU. In this case, build for Vulkan instead to use the integrated GPU.
 
 - Using `CMake` for Windows (using x64 Native Tools Command Prompt for VS, and assuming a gfx1100-compatible AMD GPU):
   ```bash
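
For context, the Vulkan fallback that the revised sentence points to looks roughly like this, assuming the `GGML_VULKAN` CMake option covered in the Vulkan section of docs/build.md (a sketch for illustration; verify the flag against your checkout):

```bash
# Sketch: build llama.cpp with the Vulkan backend instead of ROCm,
# assuming the GGML_VULKAN option documented in docs/build.md.
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -- -j 16
```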