It seems like the latest 0.12.6 supports Vulkan, but it still runs on the CPU when I run ollama.
ollama-rocm works, but people say ROCm is much slower than Vulkan.
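For what it's worth, here is roughly how I checked which backend is actually in use (this assumes ollama runs as the systemd service from the Arch package; ollama ps and the service log should show whether the model landed on the GPU or fell back to the CPU):

ollama ps                                        # shows "100% GPU" or "100% CPU" for loaded models
journalctl -u ollama -b | grep -iE 'vulkan|gpu'  # backend/GPU detection lines from the server log

I also tried setting OLLAMA_VULKAN=1 in the service environment, since the Vulkan backend is still described as experimental, but that is just my assumption about how it is gated.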
Last edited by laichiaheng (2025-10-20 13:12:45)
Not sure about ollama, but as an alternative you can use llama.cpp, which supports both ROCm and Vulkan. You can make it even easier by using lemonade-server, which also has ROCm 7.9.0 builds (tech preview).
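If you go the llama.cpp route, a minimal Vulkan build looks roughly like this (sketch only: it assumes the vulkan-headers, vulkan-icd-loader and shaderc packages are installed, that GGML_VULKAN is still the name of the CMake switch, and the model path is a placeholder):

git clone https://github.com/ggml-org/llama.cpp
cmake -S llama.cpp -B llama.cpp/build -DGGML_VULKAN=ON
cmake --build llama.cpp/build --config Release -j
# -ngl offloads layers to the Vulkan device; watch the startup log to confirm it is used
./llama.cpp/build/bin/llama-cli -m /path/to/model.gguf -ngl 99 -p "hello"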
archlinux ollama PKGBUILD
depends=(gcc-libs glibc)
makedepends=(cmake ninja git go rocm-toolchain hipblas cuda clblast)
https://github.com/ollama/ollama/blob/m … s.txt#L143 indicates ollama will only enable Vulkan support if it detects Vulkan at build time.
The repo ollama package isn't built with Vulkan present.
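Until the packaging changes, one way to test it is to rebuild the repo package yourself with the Vulkan bits present at build time. A rough sketch with devtools (the extra makedepends are the usual Arch Vulkan packages, not something the current PKGBUILD lists):

pkgctl repo clone --protocol=https ollama
cd ollama
# add vulkan-headers and vulkan-icd-loader (and possibly shaderc) to makedepends in the PKGBUILD
pkgctl build    # clean chroot build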
Disliking systemd intensely, but not satisfied with alternatives so focusing on taming systemd.
Clean chroot building not flexible enough?
Try clean chroot manager by graysky
Why doesn't ArchLinux build ollama with Vulkan?
That sounds fixable; please raise a feature request or merge request (https://wiki.archlinux.org/title/Genera … e_requests) on the package repo: https://gitlab.archlinux.org/archlinux/ … ges/ollama
Why doesn't ArchLinux build ollama with Vulkan?
Likely because nobody has requested it.
There's now an ollama-vulkan package for anyone finding this thread.
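For completeness, picking that up should just be the usual (assuming it is in the official repos and ships the same ollama.service as the other variants; otherwise build it from the AUR):

pacman -S ollama-vulkan
systemctl enable --now ollama
ollama ps    # loaded models should now report GPU instead of CPU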