Ollama incompatibility

#1
by razorleaf - opened

Hey, it seems the current version of Ollama and the one before it are incompatible with this model.

I am running a Win11 + 7900 XTX combo here. The error I am seeing is this one:
Error: 500 Internal Server Error: llama runner process has terminated: exit status 0xc0000409

It occurs both with the official Ollama distribution and with a custom Ollama model created from the GGUF files (custom model creation).
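
For reference, I created the custom model roughly like this (a sketch; file names and the model name are placeholders, and MiniCPM-V needs both the language-model GGUF and the vision-projector GGUF):

# Modelfile (paths are placeholders)
FROM ./minicpm-v4_5-Q4_K_M.gguf
FROM ./mmproj-model-f16.gguf

# create and run the custom model
ollama create minicpm-v4.5-local -f Modelfile
ollama run minicpm-v4.5-local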

OpenBMB org

https://github.com/tc-mb/ollama/tree/MIniCPM-V
https://github.com/OpenSQZ/MiniCPM-V-CookBook/blob/main/deployment/ollama/minicpm-v4_5_ollama.md

Our PR has not been merged yet; you can use the branch we provided and deploy it by following this document.
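
Roughly, deployment from our branch looks like this (a sketch; the document above has the authoritative, platform-specific build steps):

# clone the branch with MiniCPM-V support and build ollama from source
git clone -b MIniCPM-V https://github.com/tc-mb/ollama.git
cd ollama
go generate ./...   # build steps may differ by version; follow the document
go build .

# start the server
./ollama serve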

It cannot run on the Jackson series.

OpenBMB org

@redyuan43 What is the Jackson series device?
I'm not familiar with it; I may need more information or log files to help you.

Not working on my Mac mini M4.

Error: 500 Internal Server Error: llama runner process has terminated: error:attach failed: attach failed (Not allowed to attach to process.  Look in the console messages (Console.app), near the debugserver entries, when the attach failed.  The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.)

OpenBMB org

It looks like you didn't start ./ollama serve

Same problem here. It's probably a problem with the ollama<->model integration, because all the other models work just fine on my Mac, and I've also tried it on an Nvidia machine. So the problem is probably not Mac-specific.

OpenBMB org

@Yingzir @imagick
You can refer to my answer above.

Below is a more detailed explanation:
Your error indicates that the client is making a network request but can't reach the service. Ollama requires its server to be running.

You need to run ./ollama serve in one terminal. Then, in another terminal, you can run the model using ./ollama run xxx.
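
Concretely, for this model:

# terminal 1: start the server
./ollama serve

# terminal 2: run the model (pulls it on first use)
./ollama run openbmb/minicpm-v4.5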

Hey tc-mb, the server is up, and as I mentioned, it was tested on two different machines that succeed with other models in the same exact environment.

(screenshot attached)

OpenBMB org

@imagick
I think you're still using it incorrectly. Please refer to this document for guidance.
https://github.com/OpenSQZ/MiniCPM-V-CookBook/blob/main/deployment/ollama/minicpm-v4_5_ollama.md

https://ollama.com/openbmb/minicpm-v4.5
The ollama model repository is here, and usage is simple.
./ollama run openbmb/minicpm-v4.5
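
Since this is a vision model, you can pass an image by including its file path in the prompt, as with other multimodal models in ollama (the path below is a placeholder):

./ollama run openbmb/minicpm-v4.5 "Describe this image: /path/to/image.jpg"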

@tc-mb , thanks for the help, but I just did a reinstall and ran a simple ollama run openbmb/minicpm-v4.5, and it ended up the same way, unlike other models, which work just fine run the same way.

OpenBMB org

Did you follow the documentation and clone my branch and compile ollama yourself (recommended), or did you use the official ollama?


@tc-mb thanks, I didn't know I needed to use that branch; it finally works now. That was the real reason, not the fact that I didn't use ollama serve.

OpenBMB org

Ok, glad you've solved the problem.
