error with most recent vLLM
#6
by elbiot - opened
I ran vllm serve models/Mistral-Small-3.1-24B-Instruct-2503-MAX-NEO-D_AU-Q4_K_S-imat.gguf --tokenizer DavidAU/Mistral-Small-3.1-24B-Instruct-2503-MAX-NEO-Imatrix-GGUF
as per https://docs.vllm.ai/en/latest/features/quantization/gguf.html
The error I get is: ValueError: Unrecognized model in DavidAU/Mistral-Small-3.1-24B-Instruct-2503-MAX-NEO-Imatrix-GGUF. Should have a model_type key in its config.json
I can get past this point by using "mistralai/Mistral-Small-3.1-24B-Instruct-2503" as both the tokenizer and the hf_config_path. But then I get:
RuntimeError: Unknown gguf model_type: pixtral
My bad; my GGUFs are text-only, while that config is multimodal.
Use:
mrfakename/mistral-small-3.1-24b-instruct-2503-hf
(the source repo for the GGUFs I made)
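Putting the suggestions above together, the serve command would presumably look like this (an untested sketch; it assumes the flag spellings from the vLLM GGUF docs linked earlier, and that the GGUF file path from the first post is correct):

```shell
# Sketch: point vLLM at the text-only HF source repo for both the
# tokenizer and the config, so it does not pick up the multimodal
# (pixtral) config from the mistralai repo.
vllm serve models/Mistral-Small-3.1-24B-Instruct-2503-MAX-NEO-D_AU-Q4_K_S-imat.gguf \
  --tokenizer mrfakename/mistral-small-3.1-24b-instruct-2503-hf \
  --hf-config-path mrfakename/mistral-small-3.1-24b-instruct-2503-hf
```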