I have installed `ollama` from the repo via `pacman`, as well as the ROCm packages `rocm-hip-sdk` and `rocm-opencl-sdk`. I am running the `mistral` model, but it only uses the CPU even though the ollama logs show ROCm detected. I verified that ollama is using the CPU via `htop` and `nvtop`. CPU is an AMD 7900X, GPU is an AMD 7900 XTX. My system is on `Linux arch 6.7.2-arch1-1`. Thanks in advance for your help!
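For what it's worth, one way to check whether ROCm itself can see the discrete GPU (assuming `rocminfo` and `rocm-smi` are installed; they come with the ROCm packages) is:
# list every ROCm agent by name; the 7900 XTX should show up alongside the CPU
rocminfo | grep -i 'Marketing Name'
# summary of detected GPUs
rocm-smi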
Offline
Is clblast installed?
See https://github.com/ollama/ollama/blob/m … x-rocm-amd
Neither CUDA nor ROCm is present in the clean chroot when the repo package is built, so you may have to build your own version.
Last edited by Lone_Wolf (2024-01-30 12:55:04)
Disliking systemd intensely, but not satisfied with alternatives so focusing on taming systemd.
clean chroot building not flexible enough ?
Try clean chroot manager by graysky
Offline
Yes, I have clblast 1.6.1-1 installed. I was hoping existing packages would work out of the box so that I don't need to build my own.
ollama log shows "INFO ROCm integrated GPU detected - ROCR_VISIBLE_DEVICES=1"
I think the 1 indicates it is using the CPU's integrated GPU instead of the discrete GPU. I've tried `export ROCR_VISIBLE_DEVICES=0` and restarted the ollama service, but the log still shows 1.
Any thoughts on how to set this ROCR variable to 0 properly?
Offline
For an export in your local shell to be picked up, you need to start ollama from that shell. If you want to set this as part of the service, you can set the Environment= key on the service via a drop-in file, e.g. https://wiki.archlinux.org/title/Systemd#Drop-in_files and add
[Service]
Environment=ROCR_VISIBLE_DEVICES=0
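For reference, a minimal sketch of putting that drop-in in place (the file name env.conf is arbitrary; any *.conf name under the directory works):
# create the drop-in directory for the ollama service
sudo mkdir -p /etc/systemd/system/ollama.service.d
# write the override
printf '[Service]\nEnvironment=ROCR_VISIBLE_DEVICES=0\n' | sudo tee /etc/systemd/system/ollama.service.d/env.conf
# reload units and restart the service so the variable takes effect
sudo systemctl daemon-reload
sudo systemctl restart ollama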
Offline
Thank you! I didn't know about drop-in files. So, I created the unit.d folder and added ollama.conf file with the [Service] content you provided. Restarted the daemon but systemctl status ollama still shows ROCR_VISIBLE_DEVICES=1. So I must be doing something wrong...
Full path for the drop-in file is: /etc/systemd/system/unit.d/ollama.conf
Last edited by noahsark (2024-01-30 16:23:43)
Offline
I'm in the same boat, trying to get ollama to use my Radeon 7900XTX.
I found this ollama issue about ROCR_VISIBLE_DEVICES, which led me to this ollama PR that is meant to ignore integrated AMD GPUs. The change was included in ollama 0.1.22.
In my case, running ollama 0.1.22 correctly sets ROCR_VISIBLE_DEVICES=0, but it then goes and uses the CPU instead of the discrete GPU. I will open a new issue in ollama about it; it seems like a new bug.
> Full path for the drop-in file is: /etc/systemd/system/unit.d/ollama.conf
Most likely you won't need it once you update ollama to 0.1.22, but you could try /etc/systemd/system/ollama.service.d/env.conf.
Offline
Have you had much luck with this? I still haven't managed to get it running using the GPU.
Offline
ROCm requires elevated privileges to access the GPU at runtime. On most distros you can add your user account to the render group, or run as root.
Have you added your user to the render group or do you run it as root?
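A sketch of what that could look like (assuming the package runs the service under a dedicated ollama user; adjust the name if yours differs):
# for running ollama from your own shell: add yourself to the render and video groups, then log out and back in
sudo usermod -aG render,video $USER
# if ollama runs as a systemd service under its own user, that user needs the groups instead
sudo usermod -aG render,video ollama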
Offline
If you have an iGPU, you may need to disable it in the BIOS in order for ROCm to work properly; this solved my issue.
Check this warning on the ROCm website: https://rocm.docs.amd.com/projects/inst … stall.html
Offline
I have good news and bad news.
The good news is that I got ollama to work with my AMD GPU as of version 0.1.29; you just have to build it from source. You can read my bug report at https://github.com/ollama/ollama/issues/2411.
The bad news is that Arch Linux's ollama package doesn't detect the ROCm library, so it falls back to CPU inference, but at least it's no longer an upstream bug. I opened an issue at https://gitlab.archlinux.org/archlinux/ … -/issues/1 but so far no reply.
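In case anyone wants to try the same source build, the upstream steps at the time were roughly the following (a sketch based on ollama's development docs; the ROCm toolchain and Go need to be installed for the GPU path to be generated):
# fetch the sources
git clone https://github.com/ollama/ollama.git
cd ollama
# generate the GPU runners (picks up ROCm if present), then build the binary
go generate ./...
go build .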
Offline
> If you have an iGPU, you may need to disable it in the BIOS in order for ROCm to work properly; this solved my issue.
I have an iGPU and didn't have to disable it for ollama to work. Usually you can set `HIP_VISIBLE_DEVICES=0` (or 1, depending on the order in which the devices are enumerated) to force the use of a particular GPU, but this is unnecessary with newer versions of ollama, as they now ignore iGPUs.
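If you do need to pin a device, a quick way to test which index maps to the discrete GPU is to run ollama in the foreground with the variable set (index 0 here is an assumption; swap in 1 if the iGPU still gets picked):
# stop the service first so the foreground instance can bind the port
sudo systemctl stop ollama
HIP_VISIBLE_DEVICES=0 ollama serve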
Offline
I have an RX 5500 XT here and couldn't make it work with ollama, please help...
Offline
The Arch ollama maintainer needs help testing the ROCm build, please check this issue and help if you can: https://gitlab.archlinux.org/archlinux/ … -/issues/1
Offline
Good news: the new ollama-rocm package works out of the box; use it if you want to use ollama with an AMD GPU.
Offline
Is ollama-rocm-git correct for an AMD RX6800 or is ollama-vulkan-git the play here?
Offline
Haplo2 wrote: Good news: the new ollama-rocm package works out of the box; use it if you want to use ollama with an AMD GPU.
What do you need for this to work? It works neither on my desktop (RX 6700) nor on my laptop.
I just have mesa installed and ollama-rocm...
time=2024-05-24T11:25:54.949+02:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu rocm]"
time=2024-05-24T11:25:54.964+02:00 level=WARN source=amd_linux.go:48 msg="ollama recommends running the https://www.amd.com/en/support/linux-drivers" error="amdgpu version file missing: /sys/module/amdgpu/version stat /sys/module/amdgpu/version: no such file or directory"
time=2024-05-24T11:25:54.965+02:00 level=WARN source=amd_linux.go:296 msg="amdgpu is not supported" gpu=0 gpu_type=gfx1035 library=/opt/rocm/lib supported_types="[gfx1030 gfx1100 gfx1101 gfx1102 gfx900 gfx906 gfx908 gfx90a gfx940 gfx941 gfx942]"
I see, the RX 6700 is not in the supported list.
Trying with `HSA_OVERRIDE_GFX_VERSION="10.3.0" ollama serve` on the laptop.
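If the override works interactively, it could be made permanent with the same drop-in mechanism mentioned earlier in the thread, e.g. in /etc/systemd/system/ollama.service.d/env.conf:
[Service]
Environment=HSA_OVERRIDE_GFX_VERSION=10.3.0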
Last edited by Humar (2024-05-24 09:30:09)
Offline
> Is ollama-rocm-git correct for an AMD RX6800 or is ollama-vulkan-git the play here?
No idea, but you could try both and check the logs (journalctl -u ollama) or run ollama interactively (sudo -u ollama ollama serve).
Offline
> Haplo2 wrote: Good news: the new ollama-rocm package works out of the box; use it if you want to use ollama with an AMD GPU.
> What do you need for this to work? It works neither on my desktop (RX 6700) nor on my laptop.
> I just have mesa installed and ollama-rocm...
No idea; you should post more details instead of a generic "does not work". Any messages or errors? See my previous message.
Offline
OK, this is not 100% Arch Linux (a bit of Manjaro, sorry to the moderators), but it could help.
I posted a comment here which should help with getting an AMD GPU working.
Offline
This is how I got Ollama working with my RX 6750 XT in docker.
docker run -d --restart always \
  --device /dev/kfd --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  -m 16g --memory-swap=-1 --cpus="4" \
  -e HSA_OVERRIDE_GFX_VERSION=10.3.0 \
  -e HCC_AMDGPU_TARGET=gfx1031 \
  ollama/ollama:rocm
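Once the container is up, a model can be pulled and run inside it to confirm GPU inference (mistral here is just an example model):
# open an interactive session with a model inside the running container
docker exec -it ollama ollama run mistral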
Offline
OP hasn't visited the forum since March; closing this thread.
Offline