
Warn about megablocks more clearly and less often

#20
Nomic AI org
edited Apr 28

Purpose of this change:

  • This warning will not trigger when calling AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v1.5'), because that is not an MoE model.
  • This warning is straightforward to suppress from user code if the user does not want to install Nomic's fork of megablocks:
import warnings
from transformers import AutoModel

with warnings.catch_warnings():
    # Ignore only the megablocks warning; other warnings still surface.
    warnings.filterwarnings('ignore', '.*megablocks.*')
    model = AutoModel.from_pretrained('nomic-ai/nomic-embed-text-v2-moe', trust_remote_code=True)
  • The warning mentions Nomic's fork of megablocks, so it is clear that a plain pip install megablocks will not help.
  • The sad path with the warning will be used when upstream megablocks is detected, instead of passing an unrecognized argument and raising an error (see the sketch below).
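
To make that last point concrete, here is a minimal sketch of the intended control flow. The config attribute moe_every_n_layers and the __nomic_fork__ marker are placeholders for illustration only; the real check lives in the remote modeling code and may differ.

import importlib.util
import warnings

def _use_megablocks(config):
    # Dense checkpoints (e.g. nomic-embed-text-v1.5) never reach the warning.
    # 'moe_every_n_layers' stands in for whatever marks the config as MoE.
    if not getattr(config, 'moe_every_n_layers', 0):
        return False
    # megablocks not installed at all: warn once and fall back.
    if importlib.util.find_spec('megablocks') is None:
        warnings.warn("This model runs faster with Nomic's fork of megablocks installed.")
        return False
    import megablocks
    # '__nomic_fork__' is a hypothetical marker distinguishing the fork from
    # upstream; upstream lacks the arguments this model passes, so warn and
    # fall back instead of raising on an unrecognized argument.
    if not getattr(megablocks, '__nomic_fork__', False):
        warnings.warn("Upstream megablocks detected; this model requires Nomic's fork of megablocks. Falling back to the slower MoE path.")
        return False
    return True

In every sad path the model still loads; the warning only points users at the fork.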
Cebtenzzre changed pull request status to open
Nomic AI org
edited Apr 28

Actually, there's one other change I'd like to make before this is merged.
edit: nvm

zpn changed pull request status to merged
