Please release a GGUF for the latest version (2025-06-21)

#74 opened by KeilahElla

Without a gguf we can't use the model with llama.cpp, which means hardly anyone will use it.

I agree.
I tried converting it myself, but to no avail.
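For reference, the standard llama.cpp conversion workflow looks like the sketch below. The paths and output filename are placeholders; whether it succeeds depends on llama.cpp recognizing the model's architecture, and Moondream's vision components may not be supported, which would explain why conversion attempts fail:

```shell
# Standard HF-to-GGUF conversion via llama.cpp (paths are placeholders).
# This only works for architectures llama.cpp already supports; if the
# model's architecture is unrecognized, the script aborts with an error.
git clone https://github.com/ggml-org/llama.cpp
pip install -r llama.cpp/requirements.txt
python llama.cpp/convert_hf_to_gguf.py /path/to/moondream \
    --outfile moondream.gguf --outtype f16
```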
Using it directly through Transformers is not an option, as I really dislike Python.
Moondream Server and Moondream Station are only convenient if this is the sole model you use.
