Ollama incompatibility
Hey, it seems the current version of Ollama and the one before it are incompatible with this model.
I'm running Windows 11 with a 7900 XTX. The error I'm seeing is this one:
Error: 500 Internal Server Error: llama runner process has terminated: exit status 0xc0000409
It occurs with both the official Ollama distribution and with a custom model created from the GGUF files (ollama create).
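For reference, the custom model path was created roughly like this; the file name `model.gguf` and the model name `minicpm-v-custom` are placeholders, not the exact names I used.

```shell
# Hypothetical Modelfile pointing at the downloaded GGUF weights
cat > Modelfile <<'EOF'
FROM ./model.gguf
EOF

# Register the GGUF weights as a custom Ollama model, then run it
# (guarded so this sketch is a no-op where ollama isn't installed)
if command -v ollama >/dev/null; then
  ollama create minicpm-v-custom -f Modelfile
  ollama run minicpm-v-custom
fi
```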
https://github.com/tc-mb/ollama/tree/MIniCPM-V
https://github.com/OpenSQZ/MiniCPM-V-CookBook/blob/main/deployment/ollama/minicpm-v4_5_ollama.md
Our PR has not been merged yet, you can use the branch we provided and deploy it using this document.
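A rough sequence for deploying from that branch, assuming a standard Go toolchain; the exact build steps may differ, so please follow the linked document.

```shell
# Clone the fork and check out the MiniCPM-V branch from the PR
git clone https://github.com/tc-mb/ollama.git
cd ollama && git checkout MIniCPM-V

# Build ollama from source (requires Go; guarded so this sketch
# is a no-op where the toolchain or sources aren't available)
if command -v go >/dev/null && [ -f go.mod ]; then
  go generate ./...
  go build .
fi
```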
It cannot run on the Jackson series.
@redyuan43
What is the Jackson series device?
I'm not familiar with it; I may need more information or log files so I can help you.
It's not working on my Mac mini M4.
Error: 500 Internal Server Error: llama runner process has terminated: error:attach failed: attach failed (Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.)
It looks like you didn't start ./ollama serve
Same problem here. It's probably an issue with the ollama<->model integration, because all the other models work just fine on my Mac, and I've also tried it on an NVIDIA machine. So the problem is probably not Mac-specific.
@Yingzir
@imagick
You can refer to my answer.
Below is a more detailed explanation:
Your error indicates that the client is making a network request but can't find the server.
Ollama requires its server to be running.
You need to run ./ollama serve in one terminal.
Then, in another terminal, you can run the model with ./ollama run xxx.
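The two-terminal workflow looks like this ("xxx" stands for whatever model name you pulled or created):

```shell
# Terminal 1: start the Ollama server and leave it running
./ollama serve

# Terminal 2: once the server is up, run a model against it
./ollama run xxx
```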
@imagick
I think you're still using it incorrectly. Please refer to this document for guidance.
https://github.com/OpenSQZ/MiniCPM-V-CookBook/blob/main/deployment/ollama/minicpm-v4_5_ollama.md
https://ollama.com/openbmb/minicpm-v4.5
The ollama model repository is here, and usage is simple:
./ollama run openbmb/minicpm-v4.5
Did you follow the documentation and clone my branch and compile ollama yourself (recommended), or did you use the official ollama?
Ok, glad you've solved the problem.