Loading `naver/splade-v3` model from the `transformers` Python library results in `ModuleNotFoundError`
#7
by sirfumi - opened
Hello team,
I want to do some experiments with the `splade-v3` model but am unable to load it.

How to reproduce the error:

- In a `python==3.11.11` environment, install the HF `transformers` library (`transformers==4.52.4` installed correctly).
- Provide an HF access token to the `transformers` library.
- Run the following code (the `tokenizer` is loaded correctly):
```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Load a pretrained SPLADE model and tokenizer from Hugging Face
model_id = "naver/splade-v3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
```
- Now run

```python
model = AutoModelForMaskedLM.from_pretrained(model_id)
```

which raises the following error:

```
ModuleNotFoundError: Could not import module 'BertForMaskedLM'. Are this object's requirements defined correctly?
```
I have already tried re-installing the `transformers` library. Does anybody have a solution?
Thank you in advance.
Are you able to load a basic BERT model?
Hello @Herve, thank you for getting back to me.
Not only am I able to load the basic BERT model, but loading it before `splade-v3` seems to fix the issue.
I deleted the cache, and it turns out that adding `from transformers import BertForMaskedLM` makes the script work (even without creating an instance of it).
New code that works:

```python
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    BertForMaskedLM,
)

# %% Load a pretrained SPLADE model and tokenizer from Hugging Face
model_id = "naver/splade-v3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
```
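As an aside, for the experiments mentioned above: SPLADE-style sparse vectors are typically obtained from the MLM logits by log-saturated ReLU max-pooling over token positions. A minimal sketch of that pooling step, with random logits standing in for `model(**inputs).logits` so it runs offline (the variable names here are illustrative, not from this thread):

```python
import torch

# SPLADE pooling: w_j = max_i log(1 + relu(logit_ij)) over token positions i,
# restricted to non-padding positions via the attention mask.
# Random logits stand in for `model(**inputs).logits` to keep the sketch offline.
torch.manual_seed(0)
seq_len, vocab_size = 8, 30522           # 30522 = BERT's vocabulary size
logits = torch.randn(seq_len, vocab_size)
attention_mask = torch.ones(seq_len, 1)  # 1 = real token, 0 = padding

# Max-pool the log-saturated activations into one |vocab|-sized vector per text
sparse_vec = torch.max(
    torch.log1p(torch.relu(logits)) * attention_mask, dim=0
).values

print(tuple(sparse_vec.shape))  # one weight per vocabulary term: (30522,)
```

The resulting vector has mostly zero entries, which is what makes it usable with an inverted index.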
Do you have any explanation for this?
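Not a definitive answer, but one plausible mechanism (an assumption, not confirmed in this thread): `transformers` exposes its top-level names lazily, so the actual submodule import happens on first attribute access rather than at `import transformers` time. An explicit `from transformers import BertForMaskedLM` forces that resolution up front, which could mask a failure in the lazy path. A stdlib-only sketch of the lazy-module pattern itself (the `LazyModule` class below is illustrative, not the library's real implementation):

```python
import importlib
import types


class LazyModule(types.ModuleType):
    """Minimal sketch of a lazy module: attribute names map to submodules
    that are imported only on first access (similar in spirit to the
    lazy top-level namespace used by transformers)."""

    def __init__(self, name, attr_to_module):
        super().__init__(name)
        self._attr_to_module = attr_to_module

    def __getattr__(self, attr):
        # Called only when normal lookup fails, i.e. before first access.
        if attr in self._attr_to_module:
            module = importlib.import_module(self._attr_to_module[attr])
            value = getattr(module, attr)
            setattr(self, attr, value)  # cache so later lookups skip this path
            return value
        raise AttributeError(f"module {self.__name__!r} has no attribute {attr!r}")


# "sqrt" is only resolved (math imported, attribute fetched) on first access.
lazy = LazyModule("demo", {"sqrt": "math"})
print(lazy.sqrt(9.0))  # 3.0
```

If anything in the lazy resolution step fails (a stale cache, a partially broken install), the eager import would pre-populate the name and sidestep it, which matches the behavior you observed.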