Missing params.json

#2
by bingw5 - opened

The params.json file is missing from this repo, and vLLM needs it to load the model.

That's too bad. Are you aware of any quantizations of this model that will work on vLLM yet?

Unsloth AI org

> The params.json file is missing from this repo, and vLLM needs it to load the model.
>
> That's too bad. Are you aware of any quantizations of this model that will work on vLLM yet?

You can use the standard 4bit BnB one: https://huggingface.co/unsloth/Mistral-Small-3.1-24B-Instruct-2503-bnb-4bit
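A minimal sketch of loading that bnb-4bit checkpoint with vLLM's offline Python API. The `quantization` and `load_format` arguments shown are assumptions based on vLLM's bitsandbytes support and may differ between vLLM versions; the prompt is just illustrative.

```python
# Sketch: serving the BnB 4-bit checkpoint with vLLM's offline API.
# Assumes a vLLM build with bitsandbytes support; argument names may vary by version.
from vllm import LLM, SamplingParams

llm = LLM(
    model="unsloth/Mistral-Small-3.1-24B-Instruct-2503-bnb-4bit",
    quantization="bitsandbytes",   # weights are BnB 4-bit quantized
    load_format="bitsandbytes",    # needed on older vLLM versions
)

outputs = llm.generate(
    ["Summarize what vLLM does in one sentence."],  # example prompt
    SamplingParams(max_tokens=64, temperature=0.7),
)
print(outputs[0].outputs[0].text)
```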
