ModernBERT-base trained on GooAQ

This is a Cross Encoder model finetuned from answerdotai/ModernBERT-base using the sentence-transformers library. It computes scores for pairs of texts, which can be used for text reranking and semantic search.

Model Details

Model Description

  • Model Type: Cross Encoder
  • Base model: answerdotai/ModernBERT-base
  • Maximum Sequence Length: 8192 tokens
  • Number of Output Labels: 1 label
  • Model Size: ~150M parameters (F32 safetensors)
  • Language: en
  • License: apache-2.0

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import CrossEncoder

# Download from the 🤗 Hub
model = CrossEncoder("ayushexel/ce-modernbert-trained-1epoch")
# Get scores for pairs of texts
pairs = [
    ['can you still get pregnant if you are infertile?', 'Many infertile couples will go on to conceive a child without treatment. After trying to get pregnant for two years, about 95 percent of couples successfully conceive.'],
    ['can you still get pregnant if you are infertile?', 'Secondary infertility is the inability to become pregnant or to carry a baby to term after previously giving birth to a baby. Secondary infertility shares many of the same causes of primary infertility. Secondary infertility might be caused by: Impaired sperm production, function or delivery in men.'],
    ['can you still get pregnant if you are infertile?', "Problems with cervical mucus can interfere with getting pregnant. Mild cases may increase the time it takes to get pregnant, but won't necessarily cause infertility."],
    ['can you still get pregnant if you are infertile?', 'No treatment can stop the process of diminished ovarian reserve, but women who are infertile due to low egg count or quality can sometimes use assisted reproductive technologies to achieve a pregnancy.'],
    ['can you still get pregnant if you are infertile?', "Human conception requires an egg and sperm. If you're not ovulating, you won't be able to get pregnant. Anovulation is a common cause of female infertility and it can be triggered by many conditions. Most women who are experiencing ovulation problems have irregular periods."],
]
scores = model.predict(pairs)
print(scores.shape)
# (5,)

# Or rank different texts based on similarity to a single text
ranks = model.rank(
    'can you still get pregnant if you are infertile?',
    [
        'Many infertile couples will go on to conceive a child without treatment. After trying to get pregnant for two years, about 95 percent of couples successfully conceive.',
        'Secondary infertility is the inability to become pregnant or to carry a baby to term after previously giving birth to a baby. Secondary infertility shares many of the same causes of primary infertility. Secondary infertility might be caused by: Impaired sperm production, function or delivery in men.',
        "Problems with cervical mucus can interfere with getting pregnant. Mild cases may increase the time it takes to get pregnant, but won't necessarily cause infertility.",
        'No treatment can stop the process of diminished ovarian reserve, but women who are infertile due to low egg count or quality can sometimes use assisted reproductive technologies to achieve a pregnancy.',
        "Human conception requires an egg and sperm. If you're not ovulating, you won't be able to get pregnant. Anovulation is a common cause of female infertility and it can be triggered by many conditions. Most women who are experiencing ovulation problems have irregular periods.",
    ]
)
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
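Since the model was trained with an Identity activation on its single output (see the loss configuration below), the raw outputs are unbounded logits; depending on the library version and configured activation, predict may or may not already map them to (0, 1). A minimal sketch of the usual sigmoid mapping, using hypothetical score values rather than real model outputs:

```python
import math

def sigmoid(x: float) -> float:
    """Map a raw cross-encoder logit to a pseudo-probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical raw logits, e.g. as returned with the activation disabled
logits = [2.1, -0.7, 0.3]
probs = [sigmoid(x) for x in logits]

# Sigmoid is monotonic, so the ranking of pairs is unchanged
assert probs[0] > probs[2] > probs[1]
```

This matters only if you need calibrated-looking scores; for pure reranking, sorting by the raw logits gives the same order.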

Evaluation

Metrics

Cross Encoder Reranking

  • Dataset: gooaq-dev (matching the gooaq-dev_ndcg@10 column in the training logs)

Metric Value
map 0.5439 (+0.1636)
mrr@10 0.5411 (+0.1708)
ndcg@10 0.5936 (+0.1609)

Cross Encoder Reranking

  • Datasets: NanoMSMARCO_R100, NanoNFCorpus_R100 and NanoNQ_R100
  • Evaluated with CrossEncoderRerankingEvaluator with these parameters:
    {
        "at_k": 10,
        "always_rerank_positives": true
    }
    
Metric NanoMSMARCO_R100 NanoNFCorpus_R100 NanoNQ_R100
map 0.3929 (-0.0967) 0.3119 (+0.0509) 0.3869 (-0.0327)
mrr@10 0.3751 (-0.1024) 0.4287 (-0.0711) 0.3837 (-0.0429)
ndcg@10 0.4428 (-0.0976) 0.3140 (-0.0110) 0.4316 (-0.0691)
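For reference, the ndcg@10 values above can be understood from the metric's definition: discounted cumulative gain over the top-k results, normalized by the ideal ordering. A minimal plain-Python sketch over toy binary relevance labels (this is not the evaluator's implementation):

```python
import math

def ndcg_at_k(relevances, k=10):
    """NDCG@k for a ranked list of graded relevance scores (toy inputs)."""
    def dcg(rels):
        return sum(r / math.log2(i + 2) for i, r in enumerate(rels[:k]))
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

# A toy reranked list: 1 = relevant, 0 = not relevant
print(round(ndcg_at_k([1, 0, 1, 0, 0]), 4))  # → 0.9197
```

A perfect ranking (all relevant documents first) scores 1.0, so the 0.31–0.44 values above leave headroom on these hard Nano BEIR subsets.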

Cross Encoder Nano BEIR

  • Dataset: NanoBEIR_R100_mean
  • Evaluated with CrossEncoderNanoBEIREvaluator with these parameters:
    {
        "dataset_names": [
            "msmarco",
            "nfcorpus",
            "nq"
        ],
        "rerank_k": 100,
        "at_k": 10,
        "always_rerank_positives": true
    }
    
Metric Value
map 0.3639 (-0.0261)
mrr@10 0.3959 (-0.0722)
ndcg@10 0.3961 (-0.0592)
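The NanoBEIR_R100_mean values are the unweighted means of the three per-dataset results, which can be checked directly for ndcg@10 using the numbers from the table above:

```python
# Per-dataset ndcg@10 from the Cross Encoder Reranking table above
ndcg_per_dataset = {
    "NanoMSMARCO_R100": 0.4428,
    "NanoNFCorpus_R100": 0.3140,
    "NanoNQ_R100": 0.4316,
}
mean_ndcg = sum(ndcg_per_dataset.values()) / len(ndcg_per_dataset)
print(round(mean_ndcg, 4))  # → 0.3961, matching the reported NanoBEIR_R100_mean
```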

Training Details

Training Dataset

Unnamed Dataset

  • Size: 2,749,365 training samples
  • Columns: question, answer, and label
  • Approximate statistics based on the first 1000 samples:
    • question (string): min 18, mean 43.62, max 83 characters
    • answer (string): min 57, mean 250.11, max 396 characters
    • label (int): 0: ~82.10%, 1: ~17.90%
  • Samples:
    • question: can you still get pregnant if you are infertile?
      answer: Many infertile couples will go on to conceive a child without treatment. After trying to get pregnant for two years, about 95 percent of couples successfully conceive.
      label: 1
    • question: can you still get pregnant if you are infertile?
      answer: Secondary infertility is the inability to become pregnant or to carry a baby to term after previously giving birth to a baby. Secondary infertility shares many of the same causes of primary infertility. Secondary infertility might be caused by: Impaired sperm production, function or delivery in men.
      label: 0
    • question: can you still get pregnant if you are infertile?
      answer: Problems with cervical mucus can interfere with getting pregnant. Mild cases may increase the time it takes to get pregnant, but won't necessarily cause infertility.
      label: 0
  • Loss: BinaryCrossEntropyLoss with these parameters:
    {
        "activation_fn": "torch.nn.modules.linear.Identity",
        "pos_weight": 5
    }
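The pos_weight of 5 up-weights positive pairs, roughly matching the ~4.6:1 negative-to-positive ratio in the training data (82.10% vs. 17.90%). A plain-Python sketch of the per-example weighted binary cross-entropy on a raw logit (PyTorch's BCEWithLogitsLoss computes the same quantity in a numerically stabler form):

```python
import math

def weighted_bce_with_logits(logit: float, label: int, pos_weight: float = 5.0) -> float:
    """Per-example BCE on a raw logit, with the positive class up-weighted."""
    p = 1.0 / (1.0 + math.exp(-logit))
    return -(pos_weight * label * math.log(p) + (1 - label) * math.log(1 - p))

# At logit 0 (p = 0.5), a positive example costs pos_weight times a negative one
assert math.isclose(
    weighted_bce_with_logits(0.0, 1),
    5.0 * weighted_bce_with_logits(0.0, 0),
)
```

Without this weighting, a model trained on ~82% negatives could lower its loss substantially by under-scoring everything.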
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 256
  • per_device_eval_batch_size: 256
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • seed: 12
  • bf16: True
  • dataloader_num_workers: 12
  • load_best_model_at_end: True
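With warmup_ratio 0.1 (and warmup_steps left at 0), the learning-rate warmup length is derived from the total number of optimizer steps. A back-of-the-envelope check, assuming a single device and no gradient accumulation:

```python
import math

num_samples = 2_749_365   # training set size from this card
batch_size = 256          # per_device_train_batch_size (single-device assumption)
epochs = 1
warmup_ratio = 0.1

steps_per_epoch = math.ceil(num_samples / batch_size)
total_steps = steps_per_epoch * epochs
warmup_steps = int(warmup_ratio * total_steps)
print(total_steps, warmup_steps)  # → 10740 1074
```

This lines up with the training logs above, where the final logged step (1000) sits at about 9% of one epoch.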

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 256
  • per_device_eval_batch_size: 256
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 12
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 12
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss gooaq-dev_ndcg@10 NanoMSMARCO_R100_ndcg@10 NanoNFCorpus_R100_ndcg@10 NanoNQ_R100_ndcg@10 NanoBEIR_R100_mean_ndcg@10
-1 -1 - 0.1022 (-0.3304) 0.0716 (-0.4688) 0.2417 (-0.0833) 0.0286 (-0.4720) 0.1140 (-0.3414)
0.0001 1 1.3449 - - - - -
0.0186 200 1.2174 - - - - -
0.0372 400 1.156 - - - - -
0.0559 600 0.8504 - - - - -
0.0745 800 0.7192 - - - - -
0.0931 1000 0.6675 0.5936 (+0.1609) 0.4428 (-0.0976) 0.3140 (-0.0110) 0.4316 (-0.0691) 0.3961 (-0.0592)

Framework Versions

  • Python: 3.11.0
  • Sentence Transformers: 4.0.1
  • Transformers: 4.50.3
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.5.2
  • Datasets: 3.5.0
  • Tokenizers: 0.21.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}