Error when loading model by code in Model card
#6
by XibinBayesZhou - opened
Hi there,
I'm following your instructions in the Model card:
model = AutoModelForCausalLM.from_pretrained("/my/local/path/stablelm-3b-4e1t", trust_remote_code=True, torch_dtype="auto")
but I get the following error:
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory /my/local/path/stablelm-3b-4e1t.
What is happening, and what should I do?
Thank you guys!
Hi
@XibinBayesZhou
! What are the contents of ls "/my/local/path/stablelm-3b-4e1t"? You should be able to load the model by pointing to the model name instead of the local path (unless you're locally modifying the modeling code, etc.):
model = AutoModelForCausalLM.from_pretrained(
    "stabilityai/stablelm-3b-4e1t",  # Using model name only
    trust_remote_code=True,
    torch_dtype="auto",
)
Can you also try running pip install -U transformers? The latest version supports loading safetensors from local paths. Let me know if it's still an issue!
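For quick debugging, here is a minimal sketch (not from the thread) that reports which known weight files are actually present in a local checkout. The file names come from the error message above, plus model.safetensors, which recent transformers versions can load; find_weight_files is a hypothetical helper name:

```python
import os

# Weight file names transformers looks for, per the OSError above,
# plus the safetensors format used by newer checkpoints.
EXPECTED_WEIGHT_FILES = (
    "model.safetensors",
    "pytorch_model.bin",
    "tf_model.h5",
    "model.ckpt.index",
    "flax_model.msgpack",
)


def find_weight_files(model_dir):
    """Return the known weight files present in model_dir (empty list if none)."""
    try:
        names = set(os.listdir(model_dir))
    except FileNotFoundError:
        return []
    return [f for f in EXPECTED_WEIGHT_FILES if f in names]
```

If this returns an empty list for your local path, the download is likely incomplete and re-cloning the repo (or loading by model name so transformers fetches it for you) should fix it.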