SentenceTransformer based on google/embeddinggemma-300m

This is a sentence-transformers model finetuned from google/embeddinggemma-300m. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: google/embeddinggemma-300m
  • Maximum Sequence Length: 2048 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: Sentence Transformers Documentation (https://www.sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 2048, 'do_lower_case': False, 'architecture': 'Gemma3TextModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Dense({'in_features': 768, 'out_features': 3072, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
  (3): Dense({'in_features': 3072, 'out_features': 768, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
  (4): Normalize()
)
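
Since a SentenceTransformer is a torch.nn.Sequential of these modules, the pipeline above can be inspected directly, and because the final Normalize module L2-normalizes the output, cosine similarity coincides with a plain dot product. A quick check (a minimal sketch; the sentence inputs are illustrative, the model ID is from the usage section below):

import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Neelkumar/my-embedding-gemma-5424")

# Walk the pipeline: Transformer -> Pooling -> Dense -> Dense -> Normalize
for idx, module in enumerate(model):
    print(idx, type(module).__name__)

emb = model.encode(["first sentence", "second sentence"])
print(np.linalg.norm(emb, axis=1))  # ~[1. 1.], thanks to the Normalize module
print(float(emb[0] @ emb[1]))       # dot product equals cosine similarity here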

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Neelkumar/my-embedding-gemma-5424")
# Run inference
queries = [
    "How can I find information about past Access to Information requests?",
]
documents = [
    'Search the summaries of completed Access to Information (ATI) requests to find information about ATI requests made to the Government of Canada after January 2020.',
    'What are the eligibility requirements for the Canada Pension Plan?',
    'This house style was a popular design from 1890-1900.',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 768] [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[ 0.9569,  0.1398, -0.0558]])
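
The similarity matrix has one row per query and one column per document. To turn these scores into a ranking, e.g. for retrieval, sort each row by score; a small follow-up sketch continuing the snippet above:

# Rank the documents for each query, best match first
for query, scores in zip(queries, similarities):
    print(f"Query: {query}")
    for rank, idx in enumerate(scores.argsort(descending=True).tolist(), start=1):
        print(f"  {rank}. ({scores[idx].item():.4f}) {documents[idx]}")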

Training Details

Training Dataset

Unnamed Dataset

  • Size: 5,424 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
      anchor (string):   min: 6 tokens,  mean: 15.8 tokens,  max: 35 tokens
      positive (string): min: 8 tokens,  mean: 32.04 tokens, max: 130 tokens
      negative (string): min: 11 tokens, mean: 15.01 tokens, max: 42 tokens
  • Samples:
      1. anchor:   Quelles mesures les propriétaires peuvent-ils prendre pour éliminer les punaises de lit? (What measures can property owners take to eliminate bed bugs?)
         positive: Les propriétaires peuvent instaurer différentes mesures pour prévenir et éliminer les punaises des lits. (Property owners can put various measures in place to prevent and eliminate bed bugs.)
         negative: Quelles sont les conditions pour obtenir une assurance automobile? (What are the conditions for obtaining car insurance?)
      2. anchor:   Comment les pages web du gouvernement de la Saskatchewan sont-elles traduites en français? (How are Government of Saskatchewan web pages translated into French?)
         positive: Un certain nombre de pages sur le site web du gouvernement de la Saskatchewan ont été traduites professionnellement en français. (A number of pages on the Government of Saskatchewan website have been professionally translated into French.)
         negative: Quelles sont les exigences pour obtenir un permis de conduire? (What are the requirements for obtaining a driver's licence?)
      3. anchor:   How long do plant breeders' rights last in Canada?
         positive: Plant breeders receive legal protection for up to 25 years for trees and vines, and 20 years for other plant varieties.
         negative: What are the requirements for importing a pet into Canada?
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "gather_across_devices": false
    }
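
For intuition: MultipleNegativesRankingLoss scores each anchor against every positive in the batch (plus any explicit negatives), scales the cosine similarities by 20, and applies cross-entropy so that anchor i must rank its own positive highest. A minimal sketch of the in-batch pair case in plain PyTorch (the real loss also appends the explicit negative columns):

import torch
import torch.nn.functional as F

def mnrl_sketch(anchors: torch.Tensor, positives: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    # Every other positive in the batch acts as an in-batch negative
    a = F.normalize(anchors, dim=-1)
    p = F.normalize(positives, dim=-1)
    scores = scale * a @ p.T               # [batch, batch] scaled cosine similarities
    labels = torch.arange(scores.size(0))  # anchor i pairs with positive i
    return F.cross_entropy(scores, labels)

print(mnrl_sketch(torch.randn(4, 768), torch.randn(4, 768)))  # toy check with random vectors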
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 4
  • learning_rate: 2e-05
  • num_train_epochs: 10
  • warmup_ratio: 0.1
  • prompts: task: sentence similarity | query:
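
These map directly onto SentenceTransformerTrainingArguments. A minimal reproduction sketch, assuming the triplets are available as a Hugging Face Dataset with anchor/positive/negative columns (the rows and output path below are illustrative):

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("google/embeddinggemma-300m")

# Illustrative rows; the real dataset has 5,424 such triplets
train_dataset = Dataset.from_dict({
    "anchor": ["How long do plant breeders' rights last in Canada?"],
    "positive": ["Plant breeders receive legal protection for up to 25 years for trees and vines, and 20 years for other plant varieties."],
    "negative": ["What are the requirements for importing a pet into Canada?"],
})

args = SentenceTransformerTrainingArguments(
    output_dir="my-embedding-gemma-5424",  # illustrative output path
    per_device_train_batch_size=4,
    learning_rate=2e-5,
    num_train_epochs=10,
    warmup_ratio=0.1,
    prompts="task: sentence similarity | query: ",  # single prompt string, as listed above
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MultipleNegativesRankingLoss(model, scale=20.0),
)
trainer.train()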

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: task: sentence similarity | query:
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.0147 20 0.1138
0.0295 40 0.0682
0.0442 60 0.0099
0.0590 80 0.0212
0.0737 100 0.0447
0.0885 120 0.0047
0.1032 140 0.0057
0.1180 160 0.0025
0.1327 180 0.0036
0.1475 200 0.0062
0.1622 220 0.0285
0.1770 240 0.0069
0.1917 260 0.0008
0.2065 280 0.0104
0.2212 300 0.0019
0.2360 320 0.0576
0.2507 340 0.0088
0.2655 360 0.0046
0.2802 380 0.0014
0.2950 400 0.001
0.3097 420 0.0184
0.3245 440 0.0016
0.3392 460 0.0019
0.3540 480 0.0192
0.3687 500 0.0392
0.3835 520 0.0051
0.3982 540 0.0023
0.4130 560 0.0119
0.4277 580 0.0022
0.4425 600 0.0046
0.4572 620 0.0041
0.4720 640 0.0066
0.4867 660 0.0115
0.5015 680 0.0112
0.5162 700 0.0327
0.5310 720 0.0009
0.5457 740 0.0031
0.5605 760 0.0007
0.5752 780 0.0367
0.5900 800 0.0344
0.6047 820 0.0027
0.6195 840 0.0105
0.6342 860 0.0597
0.6490 880 0.0594
0.6637 900 0.0022
0.6785 920 0.0177
0.6932 940 0.0041
0.7080 960 0.0123
0.7227 980 0.0988
0.7375 1000 0.0248
0.7522 1020 0.0021
0.7670 1040 0.0376
0.7817 1060 0.0216
0.7965 1080 0.0779
0.8112 1100 0.0317
0.8260 1120 0.0233
0.8407 1140 0.0201
0.8555 1160 0.1391
0.8702 1180 0.0846
0.8850 1200 0.0064
0.8997 1220 0.1509
0.9145 1240 0.0196
0.9292 1260 0.0198
0.9440 1280 0.0174
0.9587 1300 0.117
0.9735 1320 0.0741
0.9882 1340 0.3282
1.0029 1360 0.0314
1.0177 1380 0.1522
1.0324 1400 0.0378
1.0472 1420 0.025
1.0619 1440 0.0442
1.0767 1460 0.0314
1.0914 1480 0.0745
1.1062 1500 0.0272
1.1209 1520 0.1248
1.1357 1540 0.299
1.1504 1560 0.0123
1.1652 1580 0.0245
1.1799 1600 0.0153
1.1947 1620 0.0171
1.2094 1640 0.0146
1.2242 1660 0.0313
1.2389 1680 0.0317
1.2537 1700 0.084
1.2684 1720 0.0569
1.2832 1740 0.1958
1.2979 1760 0.09
1.3127 1780 0.0526
1.3274 1800 0.0956
1.3422 1820 0.1601
1.3569 1840 0.156
1.3717 1860 0.0296
1.3864 1880 0.0391
1.4012 1900 0.0816
1.4159 1920 0.1262
1.4307 1940 0.1375
1.4454 1960 0.3373
1.4602 1980 0.094
1.4749 2000 0.0875
1.4897 2020 0.1161
1.5044 2040 0.1739
1.5192 2060 0.0526
1.5339 2080 0.1364
1.5487 2100 0.0508
1.5634 2120 0.0614
1.5782 2140 0.0589
1.5929 2160 0.0593
1.6077 2180 0.0078
1.6224 2200 0.2009
1.6372 2220 0.1356
1.6519 2240 0.1268
1.6667 2260 0.0257
1.6814 2280 0.0679
1.6962 2300 0.0229
1.7109 2320 0.1467
1.7257 2340 0.1239
1.7404 2360 0.0138
1.7552 2380 0.0997
1.7699 2400 0.0197
1.7847 2420 0.0358
1.7994 2440 0.0368
1.8142 2460 0.0755
1.8289 2480 0.1305
1.8437 2500 0.0164
1.8584 2520 0.1273
1.8732 2540 0.0255
1.8879 2560 0.0547
1.9027 2580 0.0494
1.9174 2600 0.1257
1.9322 2620 0.0434
1.9469 2640 0.0358
1.9617 2660 0.1272
1.9764 2680 0.022
1.9912 2700 0.054
2.0059 2720 0.0281
2.0206 2740 0.0229
2.0354 2760 0.0117
2.0501 2780 0.0242
2.0649 2800 0.0819
2.0796 2820 0.0625
2.0944 2840 0.0622
2.1091 2860 0.0316
2.1239 2880 0.2254
2.1386 2900 0.0857
2.1534 2920 0.026
2.1681 2940 0.0023
2.1829 2960 0.0053
2.1976 2980 0.004
2.2124 3000 0.0087
2.2271 3020 0.0068
2.2419 3040 0.0207
2.2566 3060 0.0522
2.2714 3080 0.005
2.2861 3100 0.038
2.3009 3120 0.0059
2.3156 3140 0.035
2.3304 3160 0.0603
2.3451 3180 0.0209
2.3599 3200 0.0103
2.3746 3220 0.0109
2.3894 3240 0.0755
2.4041 3260 0.0021
2.4189 3280 0.1019
2.4336 3300 0.1014
2.4484 3320 0.0198
2.4631 3340 0.0205
2.4779 3360 0.0431
2.4926 3380 0.1268
2.5074 3400 0.0097
2.5221 3420 0.0035
2.5369 3440 0.0292
2.5516 3460 0.0175
2.5664 3480 0.0687
2.5811 3500 0.021
2.5959 3520 0.0438
2.6106 3540 0.0024
2.6254 3560 0.0029
2.6401 3580 0.0267
2.6549 3600 0.0288
2.6696 3620 0.0058
2.6844 3640 0.0634
2.6991 3660 0.0404
2.7139 3680 0.0253
2.7286 3700 0.0127
2.7434 3720 0.0786
2.7581 3740 0.0739
2.7729 3760 0.0348
2.7876 3780 0.0697
2.8024 3800 0.0143
2.8171 3820 0.015
2.8319 3840 0.0139
2.8466 3860 0.023
2.8614 3880 0.0625
2.8761 3900 0.01
2.8909 3920 0.0656
2.9056 3940 0.0435
2.9204 3960 0.0367
2.9351 3980 0.0482
2.9499 4000 0.0557
2.9646 4020 0.1046
2.9794 4040 0.0578
2.9941 4060 0.0793
3.0088 4080 0.0053
3.0236 4100 0.0035
3.0383 4120 0.0095
3.0531 4140 0.001
3.0678 4160 0.0368
3.0826 4180 0.0251
3.0973 4200 0.0084
3.1121 4220 0.0224
3.1268 4240 0.0407
3.1416 4260 0.0611
3.1563 4280 0.0226
3.1711 4300 0.0092
3.1858 4320 0.0052
3.2006 4340 0.0578
3.2153 4360 0.0259
3.2301 4380 0.0002
3.2448 4400 0.0787
3.2596 4420 0.0194
3.2743 4440 0.0002
3.2891 4460 0.0006
3.3038 4480 0.0188
3.3186 4500 0.0722
3.3333 4520 0.0621
3.3481 4540 0.0017
3.3628 4560 0.1242
3.3776 4580 0.0057
3.3923 4600 0.0064
3.4071 4620 0.0016
3.4218 4640 0.0007
3.4366 4660 0.1187
3.4513 4680 0.0529
3.4661 4700 0.0294
3.4808 4720 0.1213
3.4956 4740 0.0221
3.5103 4760 0.0234
3.5251 4780 0.0034
3.5398 4800 0.0107
3.5546 4820 0.012
3.5693 4840 0.0351
3.5841 4860 0.0099
3.5988 4880 0.002
3.6136 4900 0.0024
3.6283 4920 0.0321
3.6431 4940 0.0008
3.6578 4960 0.038
3.6726 4980 0.0944
3.6873 5000 0.0227
3.7021 5020 0.0088
3.7168 5040 0.0573
3.7316 5060 0.2029
3.7463 5080 0.0522
3.7611 5100 0.012
3.7758 5120 0.0044
3.7906 5140 0.0178
3.8053 5160 0.0032
3.8201 5180 0.0375
3.8348 5200 0.0322
3.8496 5220 0.0066
3.8643 5240 0.0108
3.8791 5260 0.0143
3.8938 5280 0.0271
3.9086 5300 0.003
3.9233 5320 0.0183
3.9381 5340 0.0307
3.9528 5360 0.0026
3.9676 5380 0.0031
3.9823 5400 0.0011
3.9971 5420 0.0749
4.0118 5440 0.0192
4.0265 5460 0.037
4.0413 5480 0.0017
4.0560 5500 0.0013
4.0708 5520 0.0246
4.0855 5540 0.0007
4.1003 5560 0.045
4.1150 5580 0.038
4.1298 5600 0.0179
4.1445 5620 0.021
4.1593 5640 0.0012
4.1740 5660 0.0001
4.1888 5680 0.0004
4.2035 5700 0.0001
4.2183 5720 0.0021
4.2330 5740 0.0279
4.2478 5760 0.0044
4.2625 5780 0.0063
4.2773 5800 0.0046
4.2920 5820 0.0692
4.3068 5840 0.0007
4.3215 5860 0.0053
4.3363 5880 0.0288
4.3510 5900 0.0197
4.3658 5920 0.0007
4.3805 5940 0.002
4.3953 5960 0.0059
4.4100 5980 0.0258
4.4248 6000 0.001
4.4395 6020 0.0017
4.4543 6040 0.0024
4.4690 6060 0.0748
4.4838 6080 0.002
4.4985 6100 0.0498
4.5133 6120 0.0016
4.5280 6140 0.0018
4.5428 6160 0.0022
4.5575 6180 0.0012
4.5723 6200 0.009
4.5870 6220 0.0659
4.6018 6240 0.0121
4.6165 6260 0.0294
4.6313 6280 0.0002
4.6460 6300 0.0184
4.6608 6320 0.0158
4.6755 6340 0.0104
4.6903 6360 0.0498
4.7050 6380 0.0061
4.7198 6400 0.0305
4.7345 6420 0.0427
4.7493 6440 0.0004
4.7640 6460 0.0009
4.7788 6480 0.0001
4.7935 6500 0.0261
4.8083 6520 0.0019
4.8230 6540 0.0024
4.8378 6560 0.0228
4.8525 6580 0.0002
4.8673 6600 0.002
4.8820 6620 0.0005
4.8968 6640 0.0082
4.9115 6660 0.0119
4.9263 6680 0.0175
4.9410 6700 0.0011
4.9558 6720 0.0021
4.9705 6740 0.0106
4.9853 6760 0.018
5.0 6780 0.019
5.0147 6800 0.0629
5.0295 6820 0.0076
5.0442 6840 0.0004
5.0590 6860 0.0014
5.0737 6880 0.0012
5.0885 6900 0.0021
5.1032 6920 0.0032
5.1180 6940 0.0275
5.1327 6960 0.019
5.1475 6980 0.0006
5.1622 7000 0.0006
5.1770 7020 0.0049
5.1917 7040 0.0359
5.2065 7060 0.0028
5.2212 7080 0.0012
5.2360 7100 0.0138
5.2507 7120 0.0042
5.2655 7140 0.0003
5.2802 7160 0.0056
5.2950 7180 0.0329
5.3097 7200 0.0016
5.3245 7220 0.0092
5.3392 7240 0.0002
5.3540 7260 0.0211
5.3687 7280 0.019
5.3835 7300 0.0012
5.3982 7320 0.0002
5.4130 7340 0.0002
5.4277 7360 0.0143
5.4425 7380 0.0004
5.4572 7400 0.0004
5.4720 7420 0.0068
5.4867 7440 0.0201
5.5015 7460 0.0003
5.5162 7480 0.0042
5.5310 7500 0.0007
5.5457 7520 0.0664
5.5605 7540 0.0014
5.5752 7560 0.0175
5.5900 7580 0.0362
5.6047 7600 0.0225
5.6195 7620 0.0003
5.6342 7640 0.0025
5.6490 7660 0.0128
5.6637 7680 0.0013
5.6785 7700 0.0042
5.6932 7720 0.0012
5.7080 7740 0.0017
5.7227 7760 0.0039
5.7375 7780 0.0013
5.7522 7800 0.0008
5.7670 7820 0.006
5.7817 7840 0.0177
5.7965 7860 0.0189
5.8112 7880 0.0015
5.8260 7900 0.0003
5.8407 7920 0.001
5.8555 7940 0.0269
5.8702 7960 0.0006
5.8850 7980 0.0176
5.8997 8000 0.0048
5.9145 8020 0.0031
5.9292 8040 0.0056
5.9440 8060 0.0015
5.9587 8080 0.0102
5.9735 8100 0.0047
5.9882 8120 0.0339
6.0029 8140 0.0027
6.0177 8160 0.0008
6.0324 8180 0.0014
6.0472 8200 0.0001
6.0619 8220 0.0183
6.0767 8240 0.0142
6.0914 8260 0.0004
6.1062 8280 0.0392
6.1209 8300 0.0016
6.1357 8320 0.0025
6.1504 8340 0.0017
6.1652 8360 0.018
6.1799 8380 0.0031
6.1947 8400 0.0021
6.2094 8420 0.0244
6.2242 8440 0.0263
6.2389 8460 0.0183
6.2537 8480 0.0367
6.2684 8500 0.0009
6.2832 8520 0.0
6.2979 8540 0.0001
6.3127 8560 0.0011
6.3274 8580 0.0007
6.3422 8600 0.0004
6.3569 8620 0.0044
6.3717 8640 0.0174
6.3864 8660 0.0002
6.4012 8680 0.0176
6.4159 8700 0.0341
6.4307 8720 0.0015
6.4454 8740 0.0002
6.4602 8760 0.0043
6.4749 8780 0.0036
6.4897 8800 0.0001
6.5044 8820 0.0004
6.5192 8840 0.0474
6.5339 8860 0.0001
6.5487 8880 0.0003
6.5634 8900 0.0021
6.5782 8920 0.0014
6.5929 8940 0.0004
6.6077 8960 0.0176
6.6224 8980 0.0001
6.6372 9000 0.0009
6.6519 9020 0.0015
6.6667 9040 0.0003
6.6814 9060 0.0001
6.6962 9080 0.0016
6.7109 9100 0.0182
6.7257 9120 0.0002
6.7404 9140 0.0009
6.7552 9160 0.0018
6.7699 9180 0.0182
6.7847 9200 0.0
6.7994 9220 0.0206
6.8142 9240 0.0001
6.8289 9260 0.0002
6.8437 9280 0.0037
6.8584 9300 0.0238
6.8732 9320 0.0002
6.8879 9340 0.0
6.9027 9360 0.0002
6.9174 9380 0.019
6.9322 9400 0.0059
6.9469 9420 0.0002
6.9617 9440 0.0001
6.9764 9460 0.0004
6.9912 9480 0.0023
7.0059 9500 0.0006
7.0206 9520 0.0019
7.0354 9540 0.0176
7.0501 9560 0.0026
7.0649 9580 0.0014
7.0796 9600 0.0003
7.0944 9620 0.0001
7.1091 9640 0.0002
7.1239 9660 0.0362
7.1386 9680 0.001
7.1534 9700 0.0001
7.1681 9720 0.0002
7.1829 9740 0.0029
7.1976 9760 0.0002
7.2124 9780 0.0003
7.2271 9800 0.0027
7.2419 9820 0.0001
7.2566 9840 0.0001
7.2714 9860 0.0002
7.2861 9880 0.0124
7.3009 9900 0.0361
7.3156 9920 0.0039
7.3304 9940 0.0
7.3451 9960 0.0
7.3599 9980 0.0008
7.3746 10000 0.0002
7.3894 10020 0.0003
7.4041 10040 0.0001
7.4189 10060 0.0174
7.4336 10080 0.0015
7.4484 10100 0.0152
7.4631 10120 0.0351
7.4779 10140 0.0007
7.4926 10160 0.0005
7.5074 10180 0.0005
7.5221 10200 0.0001
7.5369 10220 0.0002
7.5516 10240 0.0001
7.5664 10260 0.001
7.5811 10280 0.0057
7.5959 10300 0.0012
7.6106 10320 0.0001
7.6254 10340 0.0005
7.6401 10360 0.0016
7.6549 10380 0.0072
7.6696 10400 0.0007
7.6844 10420 0.0001
7.6991 10440 0.0002
7.7139 10460 0.0036
7.7286 10480 0.0001
7.7434 10500 0.0002
7.7581 10520 0.0001
7.7729 10540 0.0001
7.7876 10560 0.0007
7.8024 10580 0.0002
7.8171 10600 0.0001
7.8319 10620 0.018
7.8466 10640 0.0882
7.8614 10660 0.0006
7.8761 10680 0.0001
7.8909 10700 0.0001
7.9056 10720 0.0001
7.9204 10740 0.0176
7.9351 10760 0.0002
7.9499 10780 0.0231
7.9646 10800 0.0002
7.9794 10820 0.0002
7.9941 10840 0.0
8.0088 10860 0.0001
8.0236 10880 0.0001
8.0383 10900 0.0003
8.0531 10920 0.0172
8.0678 10940 0.0002
8.0826 10960 0.018
8.0973 10980 0.0174
8.1121 11000 0.0001
8.1268 11020 0.0174
8.1416 11040 0.0
8.1563 11060 0.0039
8.1711 11080 0.0001
8.1858 11100 0.0
8.2006 11120 0.002
8.2153 11140 0.0176
8.2301 11160 0.0022
8.2448 11180 0.0001
8.2596 11200 0.0
8.2743 11220 0.0027
8.2891 11240 0.0198
8.3038 11260 0.0
8.3186 11280 0.0003
8.3333 11300 0.0223
8.3481 11320 0.0092
8.3628 11340 0.0001
8.3776 11360 0.0009
8.3923 11380 0.0014
8.4071 11400 0.0006
8.4218 11420 0.0006
8.4366 11440 0.0006
8.4513 11460 0.0005
8.4661 11480 0.0192
8.4808 11500 0.0347
8.4956 11520 0.0009
8.5103 11540 0.0002
8.5251 11560 0.0
8.5398 11580 0.0
8.5546 11600 0.0002
8.5693 11620 0.0174
8.5841 11640 0.0001
8.5988 11660 0.0171
8.6136 11680 0.0001
8.6283 11700 0.0001
8.6431 11720 0.0428
8.6578 11740 0.0003
8.6726 11760 0.0
8.6873 11780 0.0001
8.7021 11800 0.0176
8.7168 11820 0.0358
8.7316 11840 0.0002
8.7463 11860 0.0002
8.7611 11880 0.0001
8.7758 11900 0.0002
8.7906 11920 0.0015
8.8053 11940 0.0001
8.8201 11960 0.0001
8.8348 11980 0.0112
8.8496 12000 0.0033
8.8643 12020 0.0001
8.8791 12040 0.001
8.8938 12060 0.0174
8.9086 12080 0.0001
8.9233 12100 0.0002
8.9381 12120 0.0001
8.9528 12140 0.0001
8.9676 12160 0.0222
8.9823 12180 0.0003
8.9971 12200 0.0001
9.0118 12220 0.0
9.0265 12240 0.0001
9.0413 12260 0.0182
9.0560 12280 0.0174
9.0708 12300 0.0
9.0855 12320 0.0
9.1003 12340 0.0023
9.1150 12360 0.0001
9.1298 12380 0.0248
9.1445 12400 0.0
9.1593 12420 0.0
9.1740 12440 0.0
9.1888 12460 0.0001
9.2035 12480 0.0087
9.2183 12500 0.0
9.2330 12520 0.0003
9.2478 12540 0.0174
9.2625 12560 0.0
9.2773 12580 0.0006
9.2920 12600 0.0001
9.3068 12620 0.0053
9.3215 12640 0.0
9.3363 12660 0.0174
9.3510 12680 0.0001
9.3658 12700 0.0002
9.3805 12720 0.0001
9.3953 12740 0.0001
9.4100 12760 0.0001
9.4248 12780 0.0002
9.4395 12800 0.0002
9.4543 12820 0.0023
9.4690 12840 0.0
9.4838 12860 0.0018
9.4985 12880 0.0028
9.5133 12900 0.0174
9.5280 12920 0.0001
9.5428 12940 0.0001
9.5575 12960 0.0174
9.5723 12980 0.0003
9.5870 13000 0.0
9.6018 13020 0.0174
9.6165 13040 0.0001
9.6313 13060 0.0
9.6460 13080 0.0001
9.6608 13100 0.0174
9.6755 13120 0.0173
9.6903 13140 0.0
9.7050 13160 0.0005
9.7198 13180 0.0001
9.7345 13200 0.0002
9.7493 13220 0.0
9.7640 13240 0.0001
9.7788 13260 0.0
9.7935 13280 0.0026
9.8083 13300 0.0003
9.8230 13320 0.0001
9.8378 13340 0.0174
9.8525 13360 0.0099
9.8673 13380 0.0002
9.8820 13400 0.0
9.8968 13420 0.0032
9.9115 13440 0.0177
9.9263 13460 0.0175
9.9410 13480 0.0176
9.9558 13500 0.0001
9.9705 13520 0.0
9.9853 13540 0.0011
10.0 13560 0.0174

Framework Versions

  • Python: 3.11.13
  • Sentence Transformers: 5.1.1
  • Transformers: 4.57.0.dev0
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.8.1
  • Datasets: 3.6.0
  • Tokenizers: 0.22.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}