Loading `naver/splade-v3` model from the `transformers` Python library results in `ModuleNotFoundError`

#7
by sirfumi - opened

Hello team,

I want to run some experiments with the splade-v3 model but am unable to load it.

How to reproduce the error:

  1. In a python==3.11.11 environment, install the HF transformers library
    • transformers==4.52.4 installed correctly
  2. Provide an HF access token to the transformers library
  3. Run the following code (the tokenizer loads correctly):
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Load a pretrained SPLADE model and tokenizer from Hugging Face
model_id = "naver/splade-v3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
  4. Now run model = AutoModelForMaskedLM.from_pretrained(model_id), which raises the following error:
ModuleNotFoundError: Could not import module 'BertForMaskedLM'. Are this object's requirements defined correctly?

I have already tried reinstalling the transformers library. Does anybody have a solution?

Thank you in advance

NAVER LABS Europe org

Are you able to load a basic BERT model?

Hello @Herve, thank you for getting back to me.

Not only am I able to load a basic BERT model, but loading it before splade-v3 seems to fix the issue.

I deleted the cache, and it seems that adding from transformers import BertForMaskedLM makes the script work (even without creating any instance of it).

New code that works:

from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    BertForMaskedLM
)

# Load a pretrained SPLADE model and tokenizer from Hugging Face
model_id = "naver/splade-v3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
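For context: transformers resolves its top-level names (like BertForMaskedLM) lazily through a PEP 562-style module `__getattr__`, so the real import only happens the first time a name is accessed, which is why importing the class explicitly can change what gets loaded first. Here is a toy stdlib sketch of that lazy-attribute pattern (illustrative only, not transformers' actual implementation):

```python
import importlib
import sys
import types

# A toy "lazy" module: attributes are imported on first access,
# mimicking how transformers defers heavy model-class imports.
lazy = types.ModuleType("lazy_demo")
_registry = {"sqrt": "math"}  # exported name -> module that really provides it

def _getattr(name):
    if name in _registry:
        real = importlib.import_module(_registry[name])
        attr = getattr(real, name)
        setattr(lazy, name, attr)  # cache so later lookups skip __getattr__
        return attr
    raise AttributeError(name)

# PEP 562: a __getattr__ in the module's namespace handles missing attributes.
lazy.__getattr__ = _getattr
sys.modules["lazy_demo"] = lazy

print(lazy.sqrt(9.0))  # first access triggers the deferred import -> 3.0
```

If the deferred import fails for any reason, the error only surfaces at attribute-access time, which matches the symptom here: the failure appears when AutoModelForMaskedLM resolves 'BertForMaskedLM', not at `import transformers`.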

Do you have any explanation for this?
