@[email protected] to [email protected] • 4 months agoRunning Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazinefedoramagazine.orgexternal-linkmessage-square19fedilinkarrow-up187arrow-down116
arrow-up171arrow-down1external-linkRunning Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazinefedoramagazine.org@[email protected] to [email protected] • 4 months agomessage-square19fedilink
minus-square@[email protected]linkfedilinkDeutsch3•4 months agoI did try to use it on Fedora but i have a Radeon 6700 XT and it only worked in the CPU. I wait until ROCM official support reaches my older Model.
minus-square@[email protected]linkfedilink3•4 months agoollam runs on the 6700 XT, but you need to add an environment variable for it to work… I just don’t remember what it was and am away from my computer right now
minus-square@[email protected]linkfedilink2•4 months agoI have the same setup, you have to add the line Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0" for that specific GPU to the ollama.service file
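For anyone else with this card, here is a minimal sketch of how that variable could be applied as a systemd drop-in rather than by editing the packaged unit file directly (assuming Ollama is installed as the system-wide `ollama` service, as in the comment above; the drop-in filename is an arbitrary choice):

```sh
# Create a drop-in override directory for the ollama service.
# A drop-in survives package upgrades, unlike edits to the unit itself.
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/rocm-override.conf <<'EOF' >/dev/null
[Service]
# 10.3.0 makes ROCm treat the GPU as gfx1030, which reportedly also
# covers the RX 6700 XT (gfx1031) that lacks official support.
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
EOF

# Reload systemd and restart Ollama so the variable takes effect.
sudo systemctl daemon-reload
sudo systemctl restart ollama.service

# Check that the variable is now part of the service environment.
systemctl show ollama.service --property=Environment
```

`sudo systemctl edit ollama.service` would create an equivalent override interactively; either way the effect is the same as adding the Environment= line to ollama.service by hand.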