Model Card for GERM

Model Details

Model Description

This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.

  • Developed by: Haozheng Luo, ChengHao Qiu
  • License: MIT

Model Sources

  • Paper: https://arxiv.org/abs/2505.00598

Uses

Direct Use

# Load the GERM tokenizer and masked-LM checkpoint from the Hub.
# trust_remote_code=True is required because the repository ships custom model code.
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("magicslabnu/GERM", trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained("magicslabnu/GERM", trust_remote_code=True)
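
As a minimal usage sketch (not part of the official documentation): assuming the custom GERM tokenizer accepts raw DNA sequences as plain strings, the loaded model can be queried for per-position logits, e.g. for masked-token prediction. The example sequence below is illustrative only.

import torch

# Illustrative DNA sequence; replace with your own input data.
sequence = "ACGTAGCATCGGATCTATCTATCGACACTTGGTTATCGATCTACGAGCATC"
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Logits over the tokenizer vocabulary for each token position.
print(outputs.logits.shape)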

Training Details

Training Data

GUE (Genome Understanding Evaluation)

Citation

BibTeX:

@misc{luo2025fastlowcostgenomicfoundation,
      title={Fast and Low-Cost Genomic Foundation Models via Outlier Removal}, 
      author={Haozheng Luo and Chenghao Qiu and Maojiang Su and Zhihan Zhou and Zoe Mehta and Guo Ye and Jerry Yao-Chieh Hu and Han Liu},
      year={2025},
      eprint={2505.00598},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2505.00598}, 
}
Model size: 0.1B params · Tensor type: F32 (Safetensors)