Missing Ollama download
Even though this is a GGUF repo, there isn't a download link for Ollama (the Ollama server, not llama.cpp) in the "Use this model" dropdown.
Is there an easy way to download it even though the dropdown doesn't list Ollama support?
I had the same issue initially, but it seems to be working for me now with the same model name I had tried before. I used Open WebUI with the Ollama API. The specific model I used was hf.co/unsloth/Qwen3-Coder-30B-A3B-Instruct-GGUF:IQ4_NL
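For anyone hitting the same wall: Ollama can pull GGUF quantizations directly from Hugging Face by name, without any "Use this model" link, using the `hf.co/{user}/{repo}:{quant}` form. A rough sketch (assumes a local Ollama server is installed and running, and uses the repo/quant names from this thread):

```shell
# Pull the GGUF quantization straight from Hugging Face by name
# (no HF "Use this model" button needed)
ollama pull hf.co/unsloth/Qwen3-Coder-30B-A3B-Instruct-GGUF:IQ4_NL

# Then run it interactively, or point Open WebUI at the same model name
ollama run hf.co/unsloth/Qwen3-Coder-30B-A3B-Instruct-GGUF:IQ4_NL
```

If the tag (`:IQ4_NL` here) is omitted, Ollama picks a default quantization from the repo.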
Nope, I refreshed several times and tried different computers. The "Use this model" button doesn't list Ollama. The non-coder model page does list it, which likely means this is just a settings error on the publisher's side.
Thankfully the foundation model's publisher put up a GGUF version on the ollama.com website, so I used theirs instead - but it still has the same tool-use issues, unfortunately.
Thanks for fixing the Ollama download!!!