passepartout@feddit.org to Linux@lemmy.ml • Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine • 3 days ago
I have the same setup. For that specific GPU you have to add the line
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
to the ollama.service file.
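A sketch of how that could be applied, assuming Ollama was installed with its standard systemd unit named ollama.service (the drop-in filename here is my own choice). Using a drop-in instead of editing the unit file directly means the override survives package updates:

```shell
# Create a systemd drop-in directory for the ollama service
sudo mkdir -p /etc/systemd/system/ollama.service.d

# Write the ROCm GPU override into a drop-in file
# (filename "rocm-override.conf" is arbitrary)
sudo tee /etc/systemd/system/ollama.service.d/rocm-override.conf <<'EOF'
[Service]
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
EOF

# Reload systemd and restart the service so the variable takes effect
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
```

The same result can be had interactively with `sudo systemctl edit ollama.service`, which opens an editor on a drop-in file for you.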