Trouble running the model via LM Studio (macOS)

#6 opened by yfrtn

Hi, thank you for the great work on MiniCPM!

I'm having trouble running the model openbmb/MiniCPM-V-4_5-gguf on my Mac.
Environment details:

Device: MacBook Pro with Apple Silicon (M4 Pro chip, 24 GB RAM)
OS: macOS 15.6.1
LM Studio: 0.3.25 (Build 2)

When I try to load the model in LM Studio, I get the following error:
"😢 Failed to load the model
Error loading model.
(Exit code: 6). Please check settings and try loading the model again."

Other models (for example openai-gpt-oss-20b) work fine on the same setup. It seems the issue is specific to MiniCPM.

Could you please advise whether this is a known incompatibility with macOS / Apple Silicon, or whether there is a workaround (a different quantization, build, or config)?

Thanks in advance!

OpenBMB org

We haven't adapted the model for LM Studio yet, so you may need to wait for the community to add support for it.
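
In the meantime, one possible stopgap is to try loading the GGUF outside LM Studio with the llama-cpp-python bindings. This is a rough, untested sketch: the quant file name and local path below are placeholders for whichever file you downloaded from openbmb/MiniCPM-V-4_5-gguf, and it only exercises the text side of the model (image input needs llama.cpp's multimodal tooling together with the separate mmproj GGUF).

```python
# Rough sketch: try loading the GGUF with llama-cpp-python instead of LM Studio.
# Placeholder assumptions: the quantized file is named ggml-model-Q4_K_M.gguf and
# sits in a local MiniCPM-V-4_5-gguf folder; adjust to whatever you actually downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./MiniCPM-V-4_5-gguf/ggml-model-Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,       # modest context to stay within 24 GB unified memory
    n_gpu_layers=-1,  # offload all layers to Metal on Apple Silicon
    verbose=True,     # keep llama.cpp's load logs; they show why a load fails
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```

If this also fails at load time, the verbose llama.cpp log usually names the unsupported architecture or quantization, which would be useful information to report back here.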

Thanks for the clarification! That makes sense.
I’ll keep an eye out for updates from the community or OpenBMB.
