Multiple claims that this model is "... not compatible ..."

#14
by dakerholdings - opened

I was initially hoping to convert this model to MLX, but it took days to download (the download would slow to a crawl after a few gigabytes, and I'd have to Ctrl-C and restart it the next night), only for the conversion to fail!?

ds@Ds-MacBook-Air ~ % mlx_lm.convert --hf-path openbmb/MiniCPM-V-4_5 -q --quant-predicate [... omitted ...]
[INFO] Loading ... Fetching 21 files: ... 100%| ERROR:root:Model type minicpmv not supported.
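[ For reference, mlx_lm only handles text-only architectures, which is presumably why the `minicpmv` model type is rejected. The separate mlx-vlm package targets vision-language models; whether it actually supports this particular model type is an assumption to verify, but the attempt would look roughly like: ]

```shell
# Sketch only -- assumes mlx-vlm supports (or gains support for) the
# minicpmv architecture; test.jpg is a placeholder input image.
pip install mlx-vlm
python -m mlx_vlm.generate \
  --model openbmb/MiniCPM-V-4_5 \
  --image test.jpg \
  --prompt "Describe this image."
```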

So, after reading your model card, which claims that llama.cpp supports this model, I tried that:

ds@Ds-MacBook-Air bin % ./llama-cli -hf openbmb/MiniCPM-V-4_5
error from HF API, response code: 400, data: {"error":"Repository is not GGUF or is not compatible with llama.cpp"}
ds@Ds-MacBook-Air bin % ./llama-cli --version
version: 6318 (81017865)
built with Apple clang version 15.0.0 (clang-1500.3.9.4) for arm64-apple-darwin23.6.0

[ I also tried ollama: ]
ds@Ds-MacBook-Air ~ % ollama run hf.co/openbmb/MiniCPM-V-4_5
pulling manifest
Error: pull model manifest: 400: {"error":"Repository is not GGUF or is not compatible with llama.cpp"}

I'm wondering if I'm now required to convert the model to GGUF myself? I hadn't downloaded the GGUF repo because mlx_lm didn't appear to support it.

OpenBMB org

https://github.com/OpenSQZ/MiniCPM-V-CookBook/blob/main/deployment/llama.cpp/minicpm-v4_5_llamacpp.md
https://github.com/OpenSQZ/MiniCPM-V-CookBook/blob/main/deployment/ollama/minicpm-v4_5_ollama.md

I received your issue, but looking at the commands you ran, that usage is not correct.
I've linked the llama.cpp and ollama usage documentation above; please refer to them. I hope this helps.

Both of those usage documents refer to GGUF-format files, which you apparently publish as a separate model download (even though the model card for this repo also claims ollama support).

OpenBMB org

Refer to the llama.cpp documentation.
You can convert the model to GGUF yourself, or download it directly from our pre-converted repository:
https://huggingface.co/openbmb/MiniCPM-V-4_5-gguf
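End to end, that looks roughly like the following (a sketch based on the cookbook; the exact file names in the gguf repo are assumptions, so check its file listing first):

```shell
# Download the pre-converted GGUF weights plus the vision projector.
# File names below are assumptions -- check the repo's file list.
huggingface-cli download openbmb/MiniCPM-V-4_5-gguf \
  ggml-model-Q4_K_M.gguf mmproj-model-f16.gguf --local-dir .

# Run with llama.cpp's multimodal CLI (not plain llama-cli).
./llama-mtmd-cli \
  -m ggml-model-Q4_K_M.gguf \
  --mmproj mmproj-model-f16.gguf \
  --image test.jpg \
  -p "Describe this image."

# Or register the local GGUF with ollama via a Modelfile.
echo 'FROM ./ggml-model-Q4_K_M.gguf' > Modelfile
ollama create minicpm-v4.5 -f Modelfile
ollama run minicpm-v4.5
```

Note that `ollama run hf.co/...` and `llama-cli -hf ...` both pull from the repo you name directly, which is why pointing them at the safetensors repo returned the "not GGUF" error above.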