Missing params.json
#2
opened by bingw5
The repository is missing params.json, which vLLM requires to load the model.
That's too bad. Are you aware of any quantizations of this model that will work on vLLM yet?
You can use the standard 4-bit BnB one: https://huggingface.co/unsloth/Mistral-Small-3.1-24B-Instruct-2503-bnb-4bit
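For anyone landing here later: a minimal sketch of serving that bitsandbytes checkpoint with vLLM's CLI. This assumes a vLLM build with bitsandbytes support installed (`pip install vllm bitsandbytes`) and enough GPU memory for the 4-bit weights; flag names follow vLLM's documented `--quantization` / `--load-format` options, and on recent vLLM versions the load format may be inferred automatically.

```shell
# Sketch: serve the pre-quantized 4-bit BnB checkpoint with vLLM.
# Requires: pip install vllm bitsandbytes  (and a CUDA GPU).
vllm serve unsloth/Mistral-Small-3.1-24B-Instruct-2503-bnb-4bit \
    --quantization bitsandbytes \
    --load-format bitsandbytes \
    --max-model-len 8192
```

Because the checkpoint is already quantized, vLLM loads the 4-bit weights directly rather than quantizing at startup.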