issue loading with transformers

#2
by jonabur - opened

Hi, how is this meant to be loaded with transformers? I'm running into an issue. Is a file missing?

>>> import transformers
>>> transformers.__version__
'4.51.3'
>>> from transformers import AutoModelForCausalLM
>>> m = AutoModelForCausalLM.from_pretrained("mrfakename/mistral-small-3.1-24b-instruct-2503-hf", device_map="auto")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/users/jonabur/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
  File "/users/jonabur/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 279, in _wrapper
    return func(*args, **kwargs)
  File "/users/jonabur/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4260, in from_pretrained
    checkpoint_files, sharded_metadata = _get_resolved_checkpoint_files(
  File "/users/jonabur/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1100, in _get_resolved_checkpoint_files
    raise EnvironmentError(
OSError: mrfakename/mistral-small-3.1-24b-instruct-2503-hf does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.
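For reference, one way to double-check which files the repo actually contains is to list them with `huggingface_hub` (a sketch, assuming `huggingface_hub` is installed and the repo is public):

```python
from huggingface_hub import list_repo_files

# List every file in the model repo to see whether any of the
# expected weight files (model.safetensors, pytorch_model.bin, ...)
# are actually present.
files = list_repo_files("mrfakename/mistral-small-3.1-24b-instruct-2503-hf")
print(sorted(files))
```

If none of the filenames transformers looks for appear in that listing, the error above is expected regardless of the transformers version.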