CrossEncoder based on answerdotai/ModernBERT-base

This is a Cross Encoder model finetuned from answerdotai/ModernBERT-base using the sentence-transformers library. It computes scores for pairs of texts, which can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Cross Encoder
  • Base model: answerdotai/ModernBERT-base
  • Maximum Sequence Length: 8192 tokens
  • Number of Output Labels: 1 label
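
Loading the model makes these properties easy to verify; a minimal sketch (the max_length attribute and the wrapped model's config assume a recent sentence-transformers CrossEncoder):

from sentence_transformers import CrossEncoder

model = CrossEncoder("tomaarsen/reranker-ModernBERT-base-gooaq-bce-static-retriever-hardest")
print(model.max_length)               # 8192 tokens, as listed above
print(model.model.config.num_labels)  # 1, i.e. a single relevance score per text pair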

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Cross Encoders on Hugging Face (https://huggingface.co/models?library=sentence-transformers&other=cross-encoder)

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import CrossEncoder

# Download from the 🤗 Hub
model = CrossEncoder("tomaarsen/reranker-ModernBERT-base-gooaq-bce-static-retriever-hardest")
# Get scores for pairs of texts
pairs = [
    ["how to obtain a teacher's certificate in texas?", '["Step 1: Obtain a Bachelor\'s Degree. One of the most important Texas teacher qualifications is a bachelor\'s degree. ... ", \'Step 2: Complete an Educator Preparation Program (EPP) ... \', \'Step 3: Pass Texas Teacher Certification Exams. ... \', \'Step 4: Complete a Final Application and Background Check.\']'],
    ["how to obtain a teacher's certificate in texas?", 'Teacher education programs may take 4 years to complete after which certification plans are prepared for a three year period. During this plan period, the teacher must obtain a Standard Certification within 1-2 years. Learn how to get certified to teach in Texas.'],
    ["how to obtain a teacher's certificate in texas?", "Washington Teachers Licensing Application Process Official transcripts showing proof of bachelor's degree. Proof of teacher program completion at an approved teacher preparation school. Passing scores on the required examinations. Completed application for teacher certification in Washington."],
    ["how to obtain a teacher's certificate in texas?", 'Some aspiring educators may be confused about the difference between teaching certification and teaching certificates. Teacher certification is another term for the licensure required to teach in public schools, while a teaching certificate is awarded upon completion of an academic program.'],
    ["how to obtain a teacher's certificate in texas?", 'In Texas, the minimum age to work is 14. Unlike some states, Texas does not require juvenile workers to obtain a child employment certificate or an age certificate to work. A prospective employer that wants one can request a certificate of age for any minors it employs, obtainable from the Texas Workforce Commission.'],
]
scores = model.predict(pairs)
print(scores.shape)
# (5,)

# Or rank different texts based on similarity to a single text
ranks = model.rank(
    "how to obtain a teacher's certificate in texas?",
    [
        '["Step 1: Obtain a Bachelor\'s Degree. One of the most important Texas teacher qualifications is a bachelor\'s degree. ... ", \'Step 2: Complete an Educator Preparation Program (EPP) ... \', \'Step 3: Pass Texas Teacher Certification Exams. ... \', \'Step 4: Complete a Final Application and Background Check.\']',
        'Teacher education programs may take 4 years to complete after which certification plans are prepared for a three year period. During this plan period, the teacher must obtain a Standard Certification within 1-2 years. Learn how to get certified to teach in Texas.',
        "Washington Teachers Licensing Application Process Official transcripts showing proof of bachelor's degree. Proof of teacher program completion at an approved teacher preparation school. Passing scores on the required examinations. Completed application for teacher certification in Washington.",
        'Some aspiring educators may be confused about the difference between teaching certification and teaching certificates. Teacher certification is another term for the licensure required to teach in public schools, while a teaching certificate is awarded upon completion of an academic program.',
        'In Texas, the minimum age to work is 14. Unlike some states, Texas does not require juvenile workers to obtain a child employment certificate or an age certificate to work. A prospective employer that wants one can request a certificate of age for any minors it employs, obtainable from the Texas Workforce Commission.',
    ]
)
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]

Evaluation

Metrics

Cross Encoder Reranking

| Metric  | gooaq-dev        | NanoMSMARCO      | NanoNFCorpus     | NanoNQ           |
|:--------|:-----------------|:-----------------|:-----------------|:-----------------|
| map     | 0.7821 (+0.2485) | 0.4373 (-0.0523) | 0.3354 (+0.0650) | 0.5305 (+0.1098) |
| mrr@10  | 0.7800 (+0.2560) | 0.4288 (-0.0487) | 0.4934 (-0.0064) | 0.5326 (+0.1059) |
| ndcg@10 | 0.8269 (+0.2356) | 0.5287 (-0.0117) | 0.3612 (+0.0361) | 0.5823 (+0.0817) |
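
These reranking metrics can be reproduced with the library's reranking evaluator; below is a minimal sketch, assuming the cross-encoder evaluation API of this sentence-transformers line and hypothetical toy samples in place of the real gooaq-dev / NanoBEIR data:

from sentence_transformers.cross_encoder.evaluation import CrossEncoderRerankingEvaluator

# Each sample pairs a query with its known positives and the candidate
# documents to rerank (hypothetical toy data shown here).
samples = [
    {
        "query": "how to obtain a teacher's certificate in texas?",
        "positive": ["Step 1: Obtain a Bachelor's Degree. ..."],
        "documents": [
            "Step 1: Obtain a Bachelor's Degree. ...",
            "Washington Teachers Licensing Application Process ...",
        ],
    },
]
evaluator = CrossEncoderRerankingEvaluator(samples=samples, at_k=10, name="gooaq-dev")
results = evaluator(model)  # dict containing map, mrr@10, and ndcg@10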

Cross Encoder Nano BEIR

| Metric  | Value            |
|:--------|:-----------------|
| map     | 0.4344 (+0.0408) |
| mrr@10  | 0.4849 (+0.0169) |
| ndcg@10 | 0.4907 (+0.0354) |
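
The Nano BEIR values above are means over the three Nano datasets (e.g. map: (0.4373 + 0.3354 + 0.5305) / 3 ≈ 0.4344). A sketch of running that aggregate evaluation, assuming the CrossEncoderNanoBEIREvaluator available in this sentence-transformers line:

from sentence_transformers.cross_encoder.evaluation import CrossEncoderNanoBEIREvaluator

# Reranks each listed NanoBEIR subset and reports per-dataset plus mean metrics.
evaluator = CrossEncoderNanoBEIREvaluator(dataset_names=["msmarco", "nfcorpus", "nq"])
results = evaluator(model)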

Training Details

Training Dataset

Unnamed Dataset

  • Size: 578,402 training samples
  • Columns: question, answer, and label
  • Approximate statistics based on the first 1000 samples:

    |         | question                                                       | answer                                                           | label                  |
    |:--------|:---------------------------------------------------------------|:------------------------------------------------------------------|:------------------------|
    | type    | string                                                         | string                                                           | int                    |
    | details | min: 19 characters, mean: 43.6 characters, max: 100 characters | min: 56 characters, mean: 251.22 characters, max: 387 characters | 0: ~82.90%, 1: ~17.10% |
  • Samples:

    | question | answer | label |
    |:---------|:-------|:------|
    | how to obtain a teacher's certificate in texas? | ["Step 1: Obtain a Bachelor's Degree. One of the most important Texas teacher qualifications is a bachelor's degree. ... ", 'Step 2: Complete an Educator Preparation Program (EPP) ... ', 'Step 3: Pass Texas Teacher Certification Exams. ... ', 'Step 4: Complete a Final Application and Background Check.'] | 1 |
    | how to obtain a teacher's certificate in texas? | Teacher education programs may take 4 years to complete after which certification plans are prepared for a three year period. During this plan period, the teacher must obtain a Standard Certification within 1-2 years. Learn how to get certified to teach in Texas. | 0 |
    | how to obtain a teacher's certificate in texas? | Washington Teachers Licensing Application Process Official transcripts showing proof of bachelor's degree. Proof of teacher program completion at an approved teacher preparation school. Passing scores on the required examinations. Completed application for teacher certification in Washington. | 0 |
  • Loss: BinaryCrossEntropyLoss with these parameters (a usage sketch follows this list):
    {
        "activation_fct": "torch.nn.modules.linear.Identity",
        "pos_weight": 5
    }
    

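As referenced above, a minimal sketch of constructing this loss, assuming the cross-encoder losses module of this sentence-transformers line:

import torch
from sentence_transformers.cross_encoder.losses import BinaryCrossEntropyLoss

# pos_weight=5 upweights the positive class to counteract the roughly
# 83% negative / 17% positive label split shown in the statistics above.
loss = BinaryCrossEntropyLoss(model, pos_weight=torch.tensor(5.0))
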
Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 64
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • seed: 12
  • bf16: True
  • dataloader_num_workers: 4
  • load_best_model_at_end: True
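
Taken together, these non-default values correspond to a trainer setup along these lines; a sketch, assuming the CrossEncoderTrainer / CrossEncoderTrainingArguments API, a hypothetical output_dir, and with model, loss, and evaluator as in the earlier snippets (train_dataset stands in for the 578,402-sample dataset):

from sentence_transformers.cross_encoder import (
    CrossEncoderTrainer,
    CrossEncoderTrainingArguments,
)

args = CrossEncoderTrainingArguments(
    output_dir="models/reranker-ModernBERT-base-gooaq-bce",  # hypothetical path
    num_train_epochs=1,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    seed=12,
    bf16=True,
    dataloader_num_workers=4,
    eval_strategy="steps",
    load_best_model_at_end=True,
)
trainer = CrossEncoderTrainer(
    model=model,                  # the CrossEncoder being finetuned
    args=args,
    train_dataset=train_dataset,  # (question, answer, label) triples
    loss=loss,                    # BinaryCrossEntropyLoss from above
    evaluator=evaluator,          # e.g. the NanoBEIR evaluator from above
)
trainer.train()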

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 64
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 12
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 4
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

| Epoch | Step | Training Loss | gooaq-dev_ndcg@10 | NanoMSMARCO_ndcg@10 | NanoNFCorpus_ndcg@10 | NanoNQ_ndcg@10 | NanoBEIR_mean_ndcg@10 |
|:------|:-----|:--------------|:------------------|:--------------------|:---------------------|:---------------|:----------------------|
| -1 | -1 | - | 0.1541 (-0.4371) | 0.0273 (-0.5131) | 0.3068 (-0.0182) | 0.0340 (-0.4666) | 0.1227 (-0.3326) |
| 0.0001 | 1 | 1.3693 | - | - | - | - | - |
| 0.0221 | 200 | 1.1942 | - | - | - | - | - |
| 0.0443 | 400 | 1.1542 | - | - | - | - | - |
| 0.0664 | 600 | 0.9421 | - | - | - | - | - |
| 0.0885 | 800 | 0.7253 | - | - | - | - | - |
| 0.1106 | 1000 | 0.6955 | 0.7578 (+0.1666) | 0.4930 (-0.0474) | 0.3038 (-0.0212) | 0.6047 (+0.1040) | 0.4672 (+0.0118) |
| 0.1328 | 1200 | 0.6236 | - | - | - | - | - |
| 0.1549 | 1400 | 0.6155 | - | - | - | - | - |
| 0.1770 | 1600 | 0.6102 | - | - | - | - | - |
| 0.1992 | 1800 | 0.5621 | - | - | - | - | - |
| 0.2213 | 2000 | 0.571 | 0.7910 (+0.1998) | 0.5230 (-0.0174) | 0.3468 (+0.0217) | 0.5689 (+0.0683) | 0.4796 (+0.0242) |
| 0.2434 | 2200 | 0.5575 | - | - | - | - | - |
| 0.2655 | 2400 | 0.5539 | - | - | - | - | - |
| 0.2877 | 2600 | 0.5507 | - | - | - | - | - |
| 0.3098 | 2800 | 0.5483 | - | - | - | - | - |
| 0.3319 | 3000 | 0.5204 | 0.8089 (+0.2177) | 0.5283 (-0.0121) | 0.3413 (+0.0162) | 0.5783 (+0.0776) | 0.4826 (+0.0272) |
| 0.3541 | 3200 | 0.5267 | - | - | - | - | - |
| 0.3762 | 3400 | 0.5075 | - | - | - | - | - |
| 0.3983 | 3600 | 0.5312 | - | - | - | - | - |
| 0.4204 | 3800 | 0.4992 | - | - | - | - | - |
| 0.4426 | 4000 | 0.5019 | 0.8119 (+0.2207) | 0.5021 (-0.0383) | 0.3405 (+0.0155) | 0.5255 (+0.0249) | 0.4561 (+0.0007) |
| 0.4647 | 4200 | 0.4957 | - | - | - | - | - |
| 0.4868 | 4400 | 0.5112 | - | - | - | - | - |
| 0.5090 | 4600 | 0.4992 | - | - | - | - | - |
| 0.5311 | 4800 | 0.4767 | - | - | - | - | - |
| 0.5532 | 5000 | 0.4854 | 0.8197 (+0.2284) | 0.5562 (+0.0158) | 0.3506 (+0.0256) | 0.5767 (+0.0761) | 0.4945 (+0.0392) |
| 0.5753 | 5200 | 0.4834 | - | - | - | - | - |
| 0.5975 | 5400 | 0.4732 | - | - | - | - | - |
| 0.6196 | 5600 | 0.4757 | - | - | - | - | - |
| 0.6417 | 5800 | 0.4704 | - | - | - | - | - |
| 0.6639 | 6000 | 0.4632 | 0.8187 (+0.2275) | 0.5322 (-0.0082) | 0.3650 (+0.0399) | 0.5871 (+0.0865) | 0.4948 (+0.0394) |
| 0.6860 | 6200 | 0.4492 | - | - | - | - | - |
| 0.7081 | 6400 | 0.4717 | - | - | - | - | - |
| 0.7303 | 6600 | 0.4639 | - | - | - | - | - |
| 0.7524 | 6800 | 0.465 | - | - | - | - | - |
| 0.7745 | 7000 | 0.4502 | 0.8261 (+0.2349) | 0.5455 (+0.0050) | 0.3540 (+0.0290) | 0.6095 (+0.1089) | 0.5030 (+0.0476) |
| 0.7966 | 7200 | 0.4582 | - | - | - | - | - |
| 0.8188 | 7400 | 0.4628 | - | - | - | - | - |
| 0.8409 | 7600 | 0.4496 | - | - | - | - | - |
| 0.8630 | 7800 | 0.4571 | - | - | - | - | - |
| 0.8852 | 8000 | 0.4459 | 0.8239 (+0.2326) | 0.5236 (-0.0168) | 0.3571 (+0.0320) | 0.5826 (+0.0819) | 0.4878 (+0.0324) |
| 0.9073 | 8200 | 0.457 | - | - | - | - | - |
| 0.9294 | 8400 | 0.4481 | - | - | - | - | - |
| 0.9515 | 8600 | 0.4515 | - | - | - | - | - |
| 0.9737 | 8800 | 0.4453 | - | - | - | - | - |
| **0.9958** | **9000** | **0.4566** | **0.8269 (+0.2356)** | **0.5287 (-0.0117)** | **0.3612 (+0.0361)** | **0.5823 (+0.0817)** | **0.4907 (+0.0354)** |
| -1 | -1 | - | 0.8269 (+0.2356) | 0.5287 (-0.0117) | 0.3612 (+0.0361) | 0.5823 (+0.0817) | 0.4907 (+0.0354) |

  • The bold row denotes the saved checkpoint.

Framework Versions

  • Python: 3.11.10
  • Sentence Transformers: 3.5.0.dev0
  • Transformers: 4.49.0.dev0
  • PyTorch: 2.6.0.dev20241112+cu121
  • Accelerate: 1.2.0
  • Datasets: 3.2.0
  • Tokenizers: 0.21.0

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}