
SentenceTransformer based on Alibaba-NLP/gte-multilingual-base

This is a sentence-transformers model finetuned from Alibaba-NLP/gte-multilingual-base. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: Alibaba-NLP/gte-multilingual-base
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: NewModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
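
The modules above amount to CLS-token pooling followed by L2 normalization, so the same embeddings can be reproduced from the raw transformer outputs. Below is a minimal sketch using plain transformers, assuming the exported config still points at the gte (NewModel) remote code, hence trust_remote_code=True:

import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

model_id = "seongil-dn/gte-noneg-bs512-lr5e-5-1000"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)

batch = tokenizer(
    ["first example sentence", "second example sentence"],
    padding=True, truncation=True, max_length=512, return_tensors="pt",
)
with torch.no_grad():
    last_hidden = model(**batch).last_hidden_state

cls = last_hidden[:, 0]                    # (1) Pooling: take the CLS token
embeddings = F.normalize(cls, p=2, dim=1)  # (2) Normalize: unit-length vectors
print(embeddings.shape)                    # torch.Size([2, 768])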

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("seongil-dn/gte-noneg-bs512-lr5e-5-1000")
# Run inference
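# A Korean query, a relevant news passage, and an unrelated policy document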
sentences = [
    'LPGA 투어에서 고진영이 컷 탈락을 기록한 건 얼마나 돼',
    '여자골프 세계랭킹 1위 고진영(26)이 미국여자프로골프(LPGA) 투어 드라이브온 챔피언십(총상금 150만 달러)에서 컷 탈락했다. 고진영은 6일(한국시간) 미국 플로리다주 오칼라의 골든 오칼라 골프 클럽(파72ㆍ6,526야드)에서 열린 대회 2라운드에서 버디와 보기 하나씩을 묶어 이븐파 72타를 쳤다. 1라운드 3오버파 75타로 공동 86위에 그쳤던 고진영은 이틀간 합계 3오버파 147타로 공동 72위에 머물러 컷을 통과하지 못했다. 컷은 2오버파 146타였다. 고진영이 LPGA 투어 대회에서 컷 탈락한 건 세 번째다. 앞서 2017년 3월 ANA 인스피레이션, 2018년 8월 브리티시여자오픈에서 컷을 통과하지 못했다. 그리고 2년 7개월 만에 또 한 번 컷 탈락이 기록됐다. 이날 2라운드는 10번 홀에서 시작, 15번 홀(파3) 버디를 잡아냈으나 17번 홀(파4) 보기를 써내 전반 타수를 줄이지 못했고, 후반엔 9개 홀 모두 파를 기록했다. 그는 이날 페어웨이는 한 번밖에 놓치지 않았으나 그린을 6차례 놓치고 퍼트 수가 30개에 달했다. 리더보드 맨 위엔 10언더파 134타의 제니퍼 컵초, 오스틴 언스트(이상 미국)가 이름을 올린 가운데 데일리 베스트인 7언더파를 몰아친 카를로타 시간다(스페인ㆍ8언더파 136타)가 두 타 차로 추격했다. 한국 선수 중에는 허미정(32)이 3언더파 141타, 공동 11위로 가장 좋은 성적을 냈다. 세계랭킹 2위 김세영(28)은 공동 17위(2언더파 142타), 전인지(27)는 공동 24위(1언더파 143타)에 자리했다. 정은(25)은 5타, 박성현(28)은 한 타를 잃고 공동 58위(2오버파 146타)에 올라 가까스로 컷을 통과했다.',
    '1회용품 함께 줄이기 계획\nⅠ. 추진 배경\n□ (그간 추진 경과) ‘자원의 절약 및 재활용 촉진에 관한 법률’에 따라 1회용품 사용억제 제도 운영(1994~, 18개품목-18개업종)\no (성과) 「재활용 폐기물 관리 종합대책」(2018.5)을 수립하고 1회용컵, 비닐봉투 사용저감을 집중 추진하여 일정 감축성과 창출\n* 커피전문점 매장 내 1회용컵 75% 감소, 제과점 1회용 비닐봉투 84% 감소 등\no (한계) 그러나 국민이 체감할 변화는 아직 미흡하며, 비 규제 품목(빨대 등) 및 유형(배달 등)에 대한 관리 강화 요구 증가\n□ (해외 동향) 세계 각 국은 1회용품 사용을 저감하기 위한 중장기 로드맵을 발표하고, 국가별로 다양한 규제방안 도입\n* EU는 1회용 플라스틱 10대 품목 선정, 품목별 시장출시 금지 등 규제방안 마련\n** 미국 일부 州, 캐나다, 프랑스, 케냐, 칠레 등 1회용 비닐봉투 등 사용금지 도입',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
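
Since the Normalize() module makes every embedding unit-length, cosine similarity reduces to a dot product, so simple retrieval only needs encode and similarity. A minimal semantic-search sketch, reusing shortened versions of the sentences above (the query/corpus split is illustrative):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("seongil-dn/gte-noneg-bs512-lr5e-5-1000")

query = "LPGA 투어에서 고진영이 컷 탈락을 기록한 건 얼마나 돼"
corpus = [
    "여자골프 세계랭킹 1위 고진영이 LPGA 투어 드라이브온 챔피언십에서 컷 탈락했다.",
    "1회용품 함께 줄이기 계획: 1회용품 사용억제 제도 운영 경과와 감축 성과.",
]

query_emb = model.encode([query])
corpus_emb = model.encode(corpus)

# Rank the corpus by cosine similarity to the query
scores = model.similarity(query_emb, corpus_emb)  # shape [1, len(corpus)]
best = scores[0].argmax().item()
print(best, scores[0, best].item())               # the news passage should rank first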

Training Details

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • warmup_steps: 100
  • bf16: True
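
For reference, below is a minimal sketch of how these non-default values would plug into a Sentence Transformers training run. The loss mirrors the MultipleNegativesRankingLoss citation at the bottom of this card; the toy dataset is a placeholder, since the actual training data is not listed here.

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)

# Placeholder (anchor, positive) pairs; the real training data is not named in this card
train_dataset = Dataset.from_dict({
    "anchor": ["example query 1", "example query 2"],
    "positive": ["relevant passage 1", "relevant passage 2"],
})

model = SentenceTransformer("Alibaba-NLP/gte-multilingual-base", trust_remote_code=True)
loss = losses.MultipleNegativesRankingLoss(model)  # in-batch negatives only

args = SentenceTransformerTrainingArguments(
    output_dir="gte-noneg-bs512-lr5e-5",  # local placeholder path
    per_device_train_batch_size=128,
    learning_rate=5e-5,
    warmup_steps=100,
    num_train_epochs=3,
    bf16=True,  # requires bf16-capable hardware
)

trainer = SentenceTransformerTrainer(model=model, args=args, train_dataset=train_dataset, loss=loss)
trainer.train()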

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 100
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: True
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • eval_use_gather_object: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss
0.0011 1 0.4348
0.0021 2 0.4712
0.0032 3 0.4947
0.0042 4 0.4267
0.0053 5 0.4421
0.0064 6 0.4834
0.0074 7 0.4726
0.0085 8 0.4524
0.0096 9 0.4645
0.0106 10 0.4654
0.0117 11 0.4574
0.0127 12 0.5019
0.0138 13 0.4481
0.0149 14 0.423
0.0159 15 0.4694
0.0170 16 0.4316
0.0180 17 0.4372
0.0191 18 0.4623
0.0202 19 0.4222
0.0212 20 0.4494
0.0223 21 0.3932
0.0234 22 0.3924
0.0244 23 0.3869
0.0255 24 0.4064
0.0265 25 0.3945
0.0276 26 0.382
0.0287 27 0.3684
0.0297 28 0.3881
0.0308 29 0.3784
0.0318 30 0.3715
0.0329 31 0.34
0.0340 32 0.3421
0.0350 33 0.3678
0.0361 34 0.3489
0.0372 35 0.3112
0.0382 36 0.3137
0.0393 37 0.2928
0.0403 38 0.3053
0.0414 39 0.2838
0.0425 40 0.2638
0.0435 41 0.2827
0.0446 42 0.2372
0.0456 43 0.2635
0.0467 44 0.2749
0.0478 45 0.2381
0.0488 46 0.2113
0.0499 47 0.1914
0.0510 48 0.1944
0.0520 49 0.1863
0.0531 50 0.191
0.0541 51 0.1547
0.0552 52 0.1854
0.0563 53 0.1587
0.0573 54 0.1555
0.0584 55 0.1563
0.0594 56 0.1711
0.0605 57 0.1432
0.0616 58 0.1263
0.0626 59 0.1247
0.0637 60 0.1369
0.0648 61 0.1305
0.0658 62 0.1022
0.0669 63 0.1191
0.0679 64 0.1083
0.0690 65 0.0936
0.0701 66 0.0988
0.0711 67 0.0942
0.0722 68 0.107
0.0732 69 0.0823
0.0743 70 0.0886
0.0754 71 0.1055
0.0764 72 0.1013
0.0775 73 0.0807
0.0786 74 0.0776
0.0796 75 0.0737
0.0807 76 0.0916
0.0817 77 0.0654
0.0828 78 0.0904
0.0839 79 0.0954
0.0849 80 0.0697
0.0860 81 0.0751
0.0870 82 0.0886
0.0881 83 0.0752
0.0892 84 0.0806
0.0902 85 0.0807
0.0913 86 0.0842
0.0924 87 0.0821
0.0934 88 0.0723
0.0945 89 0.0797
0.0955 90 0.0797
0.0966 91 0.0832
0.0977 92 0.0713
0.0987 93 0.0681
0.0998 94 0.0825
0.1008 95 0.0838
0.1019 96 0.0746
0.1030 97 0.0792
0.1040 98 0.0692
0.1051 99 0.0705
0.1062 100 0.0666
0.1072 101 0.0692
0.1083 102 0.0675
0.1093 103 0.0734
0.1104 104 0.072
0.1115 105 0.0565
0.1125 106 0.0663
0.1136 107 0.0789
0.1146 108 0.0605
0.1157 109 0.0671
0.1168 110 0.083
0.1178 111 0.071
0.1189 112 0.0759
0.1200 113 0.0604
0.1210 114 0.0682
0.1221 115 0.0531
0.1231 116 0.0779
0.1242 117 0.0646
0.1253 118 0.0621
0.1263 119 0.081
0.1274 120 0.0688
0.1285 121 0.055
0.1295 122 0.0513
0.1306 123 0.063
0.1316 124 0.0634
0.1327 125 0.075
0.1338 126 0.062
0.1348 127 0.0821
0.1359 128 0.0565
0.1369 129 0.0492
0.1380 130 0.0762
0.1391 131 0.0735
0.1401 132 0.069
0.1412 133 0.0619
0.1423 134 0.0789
0.1433 135 0.0621
0.1444 136 0.0568
0.1454 137 0.0717
0.1465 138 0.0764
0.1476 139 0.0502
0.1486 140 0.0626
0.1497 141 0.0615
0.1507 142 0.0555
0.1518 143 0.0674
0.1529 144 0.0635
0.1539 145 0.0553
0.1550 146 0.0525
0.1561 147 0.055
0.1571 148 0.0665
0.1582 149 0.0703
0.1592 150 0.0657
0.1603 151 0.0612
0.1614 152 0.0671
0.1624 153 0.059
0.1635 154 0.0636
0.1645 155 0.0753
0.1656 156 0.0931
0.1667 157 0.0531
0.1677 158 0.0558
0.1688 159 0.0599
0.1699 160 0.0501
0.1709 161 0.051
0.1720 162 0.0697
0.1730 163 0.074
0.1741 164 0.0607
0.1752 165 0.0611
0.1762 166 0.059
0.1773 167 0.073
0.1783 168 0.0541
0.1794 169 0.0576
0.1805 170 0.0656
0.1815 171 0.0499
0.1826 172 0.055
0.1837 173 0.0646
0.1847 174 0.0747
0.1858 175 0.0558
0.1868 176 0.0537
0.1879 177 0.0574
0.1890 178 0.061
0.1900 179 0.0743
0.1911 180 0.0553
0.1921 181 0.0603
0.1932 182 0.0613
0.1943 183 0.0557
0.1953 184 0.0629
0.1964 185 0.0524
0.1975 186 0.0533
0.1985 187 0.0624
0.1996 188 0.0566
0.2006 189 0.0446
0.2017 190 0.0578
0.2028 191 0.0487
0.2038 192 0.066
0.2049 193 0.0618
0.2059 194 0.0591
0.2070 195 0.0553
0.2081 196 0.052
0.2091 197 0.0451
0.2102 198 0.0633
0.2113 199 0.0658
0.2123 200 0.0623
0.2134 201 0.0593
0.2144 202 0.0491
0.2155 203 0.0526
0.2166 204 0.057
0.2176 205 0.0631
0.2187 206 0.0809
0.2197 207 0.063
0.2208 208 0.0571
0.2219 209 0.054
0.2229 210 0.0607
0.2240 211 0.056
0.2251 212 0.06
0.2261 213 0.0597
0.2272 214 0.0538
0.2282 215 0.0584
0.2293 216 0.0473
0.2304 217 0.052
0.2314 218 0.06
0.2325 219 0.0566
0.2335 220 0.0559
0.2346 221 0.0536
0.2357 222 0.0634
0.2367 223 0.0637
0.2378 224 0.056
0.2389 225 0.0504
0.2399 226 0.0371
0.2410 227 0.0678
0.2420 228 0.0569
0.2431 229 0.0551
0.2442 230 0.0486
0.2452 231 0.0536
0.2463 232 0.0615
0.2473 233 0.0535
0.2484 234 0.0502
0.2495 235 0.0571
0.2505 236 0.0593
0.2516 237 0.0557
0.2527 238 0.0671
0.2537 239 0.0609
0.2548 240 0.0667
0.2558 241 0.064
0.2569 242 0.0503
0.2580 243 0.0461
0.2590 244 0.059
0.2601 245 0.0594
0.2611 246 0.0577
0.2622 247 0.0664
0.2633 248 0.0736
0.2643 249 0.0506
0.2654 250 0.0611
0.2665 251 0.0657
0.2675 252 0.0543
0.2686 253 0.0595
0.2696 254 0.0531
0.2707 255 0.0552
0.2718 256 0.061
0.2728 257 0.0456
0.2739 258 0.0498
0.2749 259 0.0567
0.2760 260 0.0444
0.2771 261 0.0567
0.2781 262 0.0524
0.2792 263 0.0518
0.2803 264 0.0664
0.2813 265 0.0537
0.2824 266 0.0537
0.2834 267 0.0558
0.2845 268 0.0501
0.2856 269 0.0558
0.2866 270 0.0411
0.2877 271 0.0432
0.2887 272 0.0535
0.2898 273 0.0511
0.2909 274 0.0469
0.2919 275 0.0587
0.2930 276 0.052
0.2941 277 0.0594
0.2951 278 0.0651
0.2962 279 0.0486
0.2972 280 0.0602
0.2983 281 0.0567
0.2994 282 0.0547
0.3004 283 0.0669
0.3015 284 0.0543
0.3025 285 0.0616
0.3036 286 0.0532
0.3047 287 0.0689
0.3057 288 0.0461
0.3068 289 0.0516
0.3079 290 0.0496
0.3089 291 0.0581
0.3100 292 0.0446
0.3110 293 0.048
0.3121 294 0.0442
0.3132 295 0.0504
0.3142 296 0.0531
0.3153 297 0.0681
0.3163 298 0.0458
0.3174 299 0.0584
0.3185 300 0.064
0.3195 301 0.0595
0.3206 302 0.0604
0.3217 303 0.0621
0.3227 304 0.0466
0.3238 305 0.0545
0.3248 306 0.0523
0.3259 307 0.0496
0.3270 308 0.0468
0.3280 309 0.0649
0.3291 310 0.056
0.3301 311 0.0539
0.3312 312 0.0497
0.3323 313 0.0517
0.3333 314 0.0511
0.3344 315 0.0511
0.3355 316 0.0518
0.3365 317 0.0508
0.3376 318 0.0579
0.3386 319 0.0463
0.3397 320 0.046
0.3408 321 0.0461
0.3418 322 0.0469
0.3429 323 0.0399
0.3439 324 0.0516
0.3450 325 0.0551
0.3461 326 0.0497
0.3471 327 0.0455
0.3482 328 0.0534
0.3493 329 0.0437
0.3503 330 0.0542
0.3514 331 0.0462
0.3524 332 0.0429
0.3535 333 0.0542
0.3546 334 0.0452
0.3556 335 0.0569
0.3567 336 0.0495
0.3577 337 0.0443
0.3588 338 0.0543
0.3599 339 0.0671
0.3609 340 0.054
0.3620 341 0.0596
0.3631 342 0.0468
0.3641 343 0.0644
0.3652 344 0.044
0.3662 345 0.0477
0.3673 346 0.0403
0.3684 347 0.0553
0.3694 348 0.0533
0.3705 349 0.0447
0.3715 350 0.0527
0.3726 351 0.0465
0.3737 352 0.0518
0.3747 353 0.0345
0.3758 354 0.0515
0.3769 355 0.0438
0.3779 356 0.0489
0.3790 357 0.046
0.3800 358 0.0621
0.3811 359 0.0667
0.3822 360 0.0489
0.3832 361 0.0555
0.3843 362 0.0445
0.3854 363 0.0492
0.3864 364 0.0562
0.3875 365 0.0484
0.3885 366 0.0582
0.3896 367 0.0551
0.3907 368 0.0512
0.3917 369 0.0486
0.3928 370 0.0537
0.3938 371 0.0499
0.3949 372 0.0651
0.3960 373 0.0531
0.3970 374 0.0743
0.3981 375 0.052
0.3992 376 0.0476
0.4002 377 0.0572
0.4013 378 0.0555
0.4023 379 0.0569
0.4034 380 0.052
0.4045 381 0.0524
0.4055 382 0.0726
0.4066 383 0.0456
0.4076 384 0.0531
0.4087 385 0.0474
0.4098 386 0.0485
0.4108 387 0.0459
0.4119 388 0.0474
0.4130 389 0.0541
0.4140 390 0.0452
0.4151 391 0.0362
0.4161 392 0.0407
0.4172 393 0.0449
0.4183 394 0.0444
0.4193 395 0.0469
0.4204 396 0.0493
0.4214 397 0.0437
0.4225 398 0.0551
0.4236 399 0.0412
0.4246 400 0.0401
0.4257 401 0.0488
0.4268 402 0.0506
0.4278 403 0.0458
0.4289 404 0.0436
0.4299 405 0.0574
0.4310 406 0.0516
0.4321 407 0.0599
0.4331 408 0.0476
0.4342 409 0.0462
0.4352 410 0.0502
0.4363 411 0.0448
0.4374 412 0.0461
0.4384 413 0.035
0.4395 414 0.0451
0.4406 415 0.0456
0.4416 416 0.0399
0.4427 417 0.0602
0.4437 418 0.0588
0.4448 419 0.0675
0.4459 420 0.0628
0.4469 421 0.0498
0.4480 422 0.0413
0.4490 423 0.0437
0.4501 424 0.0514
0.4512 425 0.0586
0.4522 426 0.0596
0.4533 427 0.0368
0.4544 428 0.0448
0.4554 429 0.056
0.4565 430 0.0415
0.4575 431 0.0448
0.4586 432 0.055
0.4597 433 0.0442
0.4607 434 0.0462
0.4618 435 0.0479
0.4628 436 0.0507
0.4639 437 0.049
0.4650 438 0.0626
0.4660 439 0.0375
0.4671 440 0.0541
0.4682 441 0.0579
0.4692 442 0.0642
0.4703 443 0.0471
0.4713 444 0.0559
0.4724 445 0.0508
0.4735 446 0.0696
0.4745 447 0.056
0.4756 448 0.0649
0.4766 449 0.0641
0.4777 450 0.0547
0.4788 451 0.0509
0.4798 452 0.0544
0.4809 453 0.0487
0.4820 454 0.0639
0.4830 455 0.047
0.4841 456 0.0513
0.4851 457 0.0451
0.4862 458 0.0567
0.4873 459 0.0541
0.4883 460 0.0475
0.4894 461 0.0445
0.4904 462 0.0597
0.4915 463 0.0434
0.4926 464 0.0468
0.4936 465 0.0449
0.4947 466 0.0422
0.4958 467 0.0504
0.4968 468 0.0565
0.4979 469 0.0611
0.4989 470 0.044
0.5 471 0.0543
0.5011 472 0.0424
0.5021 473 0.0443
0.5032 474 0.0367
0.5042 475 0.0427
0.5053 476 0.0431
0.5064 477 0.063
0.5074 478 0.0421
0.5085 479 0.0367
0.5096 480 0.0456
0.5106 481 0.0586
0.5117 482 0.0747
0.5127 483 0.05
0.5138 484 0.0509
0.5149 485 0.054
0.5159 486 0.0531
0.5170 487 0.0458
0.5180 488 0.0522
0.5191 489 0.0406
0.5202 490 0.0529
0.5212 491 0.0602
0.5223 492 0.0469
0.5234 493 0.0602
0.5244 494 0.0506
0.5255 495 0.0522
0.5265 496 0.0433
0.5276 497 0.0531
0.5287 498 0.0453
0.5297 499 0.0416
0.5308 500 0.0366
0.5318 501 0.0483
0.5329 502 0.0453
0.5340 503 0.0495
0.5350 504 0.0522
0.5361 505 0.0476
0.5372 506 0.0416
0.5382 507 0.0497
0.5393 508 0.0431
0.5403 509 0.0494
0.5414 510 0.041
0.5425 511 0.0412
0.5435 512 0.0399
0.5446 513 0.0478
0.5456 514 0.061
0.5467 515 0.0353
0.5478 516 0.0469
0.5488 517 0.0517
0.5499 518 0.0523
0.5510 519 0.058
0.5520 520 0.0432
0.5531 521 0.0442
0.5541 522 0.0551
0.5552 523 0.0488
0.5563 524 0.0482
0.5573 525 0.0474
0.5584 526 0.0577
0.5594 527 0.0375
0.5605 528 0.0401
0.5616 529 0.0574
0.5626 530 0.0496
0.5637 531 0.0422
0.5648 532 0.047
0.5658 533 0.0455
0.5669 534 0.0405
0.5679 535 0.0391
0.5690 536 0.0495
0.5701 537 0.0464
0.5711 538 0.0457
0.5722 539 0.0449
0.5732 540 0.0583
0.5743 541 0.0591
0.5754 542 0.0487
0.5764 543 0.0456
0.5775 544 0.0423
0.5786 545 0.0571
0.5796 546 0.0472
0.5807 547 0.0556
0.5817 548 0.0483
0.5828 549 0.0424
0.5839 550 0.0557
0.5849 551 0.038
0.5860 552 0.0394
0.5870 553 0.0481
0.5881 554 0.0617
0.5892 555 0.0455
0.5902 556 0.0411
0.5913 557 0.0433
0.5924 558 0.0456
0.5934 559 0.0488
0.5945 560 0.0517
0.5955 561 0.0549
0.5966 562 0.0406
0.5977 563 0.045
0.5987 564 0.049
0.5998 565 0.0547
0.6008 566 0.0529
0.6019 567 0.0524
0.6030 568 0.0472
0.6040 569 0.039
0.6051 570 0.041
0.6062 571 0.0508
0.6072 572 0.0486
0.6083 573 0.0375
0.6093 574 0.0585
0.6104 575 0.05
0.6115 576 0.0509
0.6125 577 0.0394
0.6136 578 0.0467
0.6146 579 0.0371
0.6157 580 0.0415
0.6168 581 0.046
0.6178 582 0.0385
0.6189 583 0.056
0.6200 584 0.0416
0.6210 585 0.0578
0.6221 586 0.0443
0.6231 587 0.0407
0.6242 588 0.0499
0.6253 589 0.056
0.6263 590 0.0456
0.6274 591 0.0412
0.6285 592 0.0473
0.6295 593 0.0378
0.6306 594 0.0544
0.6316 595 0.0502
0.6327 596 0.042
0.6338 597 0.0414
0.6348 598 0.0506
0.6359 599 0.0372
0.6369 600 0.0411
0.6380 601 0.0387
0.6391 602 0.0588
0.6401 603 0.0404
0.6412 604 0.056
0.6423 605 0.0524
0.6433 606 0.0484
0.6444 607 0.0398
0.6454 608 0.0523
0.6465 609 0.0469
0.6476 610 0.0504
0.6486 611 0.0496
0.6497 612 0.0501
0.6507 613 0.0426
0.6518 614 0.0454
0.6529 615 0.0564
0.6539 616 0.0798
0.6550 617 0.0444
0.6561 618 0.039
0.6571 619 0.0428
0.6582 620 0.0504
0.6592 621 0.0525
0.6603 622 0.0471
0.6614 623 0.0402
0.6624 624 0.0456
0.6635 625 0.0384
0.6645 626 0.0446
0.6656 627 0.0468
0.6667 628 0.047
0.6677 629 0.0442
0.6688 630 0.0466
0.6699 631 0.0457
0.6709 632 0.0538
0.6720 633 0.0434
0.6730 634 0.0443
0.6741 635 0.0481
0.6752 636 0.0483
0.6762 637 0.0434
0.6773 638 0.0389
0.6783 639 0.0541
0.6794 640 0.0453
0.6805 641 0.0508
0.6815 642 0.0469
0.6826 643 0.0431
0.6837 644 0.0446
0.6847 645 0.0427
0.6858 646 0.0543
0.6868 647 0.0458
0.6879 648 0.046
0.6890 649 0.0669
0.6900 650 0.046
0.6911 651 0.0462
0.6921 652 0.0493
0.6932 653 0.0484
0.6943 654 0.0466
0.6953 655 0.048
0.6964 656 0.0406
0.6975 657 0.0512
0.6985 658 0.0469
0.6996 659 0.0461
0.7006 660 0.039
0.7017 661 0.0403
0.7028 662 0.0419
0.7038 663 0.0538
0.7049 664 0.0364
0.7059 665 0.039
0.7070 666 0.0417
0.7081 667 0.0478
0.7091 668 0.0443
0.7102 669 0.0394
0.7113 670 0.0417
0.7123 671 0.0412
0.7134 672 0.0493
0.7144 673 0.0532
0.7155 674 0.0371
0.7166 675 0.0344
0.7176 676 0.0421
0.7187 677 0.0489
0.7197 678 0.0362
0.7208 679 0.0539
0.7219 680 0.0404
0.7229 681 0.0607
0.7240 682 0.0456
0.7251 683 0.0507
0.7261 684 0.0415
0.7272 685 0.0361
0.7282 686 0.053
0.7293 687 0.0431
0.7304 688 0.0463
0.7314 689 0.0401
0.7325 690 0.0549
0.7335 691 0.0335
0.7346 692 0.05
0.7357 693 0.0472
0.7367 694 0.0474
0.7378 695 0.0556
0.7389 696 0.0456
0.7399 697 0.0481
0.7410 698 0.0388
0.7420 699 0.0381
0.7431 700 0.0491
0.7442 701 0.0436
0.7452 702 0.0522
0.7463 703 0.0471
0.7473 704 0.0367
0.7484 705 0.0393
0.7495 706 0.0418
0.7505 707 0.0371
0.7516 708 0.0315
0.7527 709 0.0508
0.7537 710 0.0535
0.7548 711 0.0453
0.7558 712 0.0352
0.7569 713 0.0507
0.7580 714 0.046
0.7590 715 0.0393
0.7601 716 0.0453
0.7611 717 0.0403
0.7622 718 0.0346
0.7633 719 0.0492
0.7643 720 0.0437
0.7654 721 0.042
0.7665 722 0.052
0.7675 723 0.043
0.7686 724 0.0524
0.7696 725 0.0385
0.7707 726 0.0484
0.7718 727 0.0454
0.7728 728 0.0478
0.7739 729 0.0411
0.7749 730 0.0415
0.7760 731 0.0323
0.7771 732 0.0492
0.7781 733 0.0429
0.7792 734 0.0445
0.7803 735 0.0484
0.7813 736 0.042
0.7824 737 0.0486
0.7834 738 0.0349
0.7845 739 0.0472
0.7856 740 0.0413
0.7866 741 0.0476
0.7877 742 0.0519
0.7887 743 0.0405
0.7898 744 0.0439
0.7909 745 0.035
0.7919 746 0.0478
0.7930 747 0.0476
0.7941 748 0.0382
0.7951 749 0.0568
0.7962 750 0.0505
0.7972 751 0.0572
0.7983 752 0.0352
0.7994 753 0.0405
0.8004 754 0.0505
0.8015 755 0.0478
0.8025 756 0.0465
0.8036 757 0.0493
0.8047 758 0.0414
0.8057 759 0.0438
0.8068 760 0.0559
0.8079 761 0.044
0.8089 762 0.0385
0.8100 763 0.0414
0.8110 764 0.0516
0.8121 765 0.0475
0.8132 766 0.0394
0.8142 767 0.0566
0.8153 768 0.0385
0.8163 769 0.0405
0.8174 770 0.0392
0.8185 771 0.0364
0.8195 772 0.0501
0.8206 773 0.0462
0.8217 774 0.0436
0.8227 775 0.0548
0.8238 776 0.0429
0.8248 777 0.0416
0.8259 778 0.043
0.8270 779 0.0481
0.8280 780 0.0382
0.8291 781 0.0439
0.8301 782 0.0369
0.8312 783 0.0377
0.8323 784 0.0463
0.8333 785 0.0372
0.8344 786 0.0563
0.8355 787 0.0447
0.8365 788 0.0366
0.8376 789 0.0466
0.8386 790 0.049
0.8397 791 0.0557
0.8408 792 0.0495
0.8418 793 0.0359
0.8429 794 0.0519
0.8439 795 0.0538
0.8450 796 0.0388
0.8461 797 0.0431
0.8471 798 0.0513
0.8482 799 0.047
0.8493 800 0.0485
0.8503 801 0.052
0.8514 802 0.032
0.8524 803 0.0419
0.8535 804 0.0439
0.8546 805 0.0548
0.8556 806 0.0433
0.8567 807 0.0407
0.8577 808 0.0467
0.8588 809 0.0494
0.8599 810 0.0516
0.8609 811 0.0418
0.8620 812 0.0344
0.8631 813 0.0505
0.8641 814 0.0477
0.8652 815 0.0533
0.8662 816 0.0431
0.8673 817 0.0439
0.8684 818 0.0321
0.8694 819 0.0418
0.8705 820 0.043
0.8715 821 0.035
0.8726 822 0.0473
0.8737 823 0.0294
0.8747 824 0.0573
0.8758 825 0.038
0.8769 826 0.04
0.8779 827 0.0406
0.8790 828 0.0413
0.8800 829 0.0416
0.8811 830 0.0344
0.8822 831 0.0511
0.8832 832 0.0403
0.8843 833 0.0613
0.8854 834 0.0384
0.8864 835 0.0363
0.8875 836 0.0324
0.8885 837 0.0472
0.8896 838 0.049
0.8907 839 0.0465
0.8917 840 0.0419
0.8928 841 0.0455
0.8938 842 0.0481
0.8949 843 0.0463
0.8960 844 0.0352
0.8970 845 0.0527
0.8981 846 0.0561
0.8992 847 0.0381
0.9002 848 0.0434
0.9013 849 0.0436
0.9023 850 0.0462
0.9034 851 0.0503
0.9045 852 0.0479
0.9055 853 0.0451
0.9066 854 0.0459
0.9076 855 0.0508
0.9087 856 0.0453
0.9098 857 0.0444
0.9108 858 0.0461
0.9119 859 0.056
0.9130 860 0.0449
0.9140 861 0.0477
0.9151 862 0.0422
0.9161 863 0.0481
0.9172 864 0.0508
0.9183 865 0.037
0.9193 866 0.0491
0.9204 867 0.0627
0.9214 868 0.0432
0.9225 869 0.0377
0.9236 870 0.0448
0.9246 871 0.0366
0.9257 872 0.0406
0.9268 873 0.0445
0.9278 874 0.0424
0.9289 875 0.0322
0.9299 876 0.0441
0.9310 877 0.0498
0.9321 878 0.0418
0.9331 879 0.0524
0.9342 880 0.06
0.9352 881 0.0428
0.9363 882 0.0428
0.9374 883 0.0509
0.9384 884 0.0428
0.9395 885 0.0295
0.9406 886 0.0535
0.9416 887 0.04
0.9427 888 0.0425
0.9437 889 0.0583
0.9448 890 0.0374
0.9459 891 0.0489
0.9469 892 0.0472
0.9480 893 0.0449
0.9490 894 0.0342
0.9501 895 0.0604
0.9512 896 0.047
0.9522 897 0.0433
0.9533 898 0.0355
0.9544 899 0.0419
0.9554 900 0.044
0.9565 901 0.0457
0.9575 902 0.0377
0.9586 903 0.0416
0.9597 904 0.0505
0.9607 905 0.0487
0.9618 906 0.0473
0.9628 907 0.0521
0.9639 908 0.0336
0.9650 909 0.0446
0.9660 910 0.0423
0.9671 911 0.0442
0.9682 912 0.0505
0.9692 913 0.0488
0.9703 914 0.0367
0.9713 915 0.0382
0.9724 916 0.0487
0.9735 917 0.061
0.9745 918 0.0461
0.9756 919 0.0377
0.9766 920 0.0398
0.9777 921 0.0363
0.9788 922 0.0375
0.9798 923 0.0503
0.9809 924 0.0493
0.9820 925 0.04
0.9830 926 0.0379
0.9841 927 0.0422
0.9851 928 0.0517
0.9862 929 0.0488
0.9873 930 0.057
0.9883 931 0.0388
0.9894 932 0.0374
0.9904 933 0.0374
0.9915 934 0.0504
0.9926 935 0.056
0.9936 936 0.0478
0.9947 937 0.0286
0.9958 938 0.0415
0.9968 939 0.037
0.9979 940 0.0445
0.9989 941 0.0451
1.0 942 0.036
1.0011 943 0.0346
1.0021 944 0.044
1.0032 945 0.044
1.0042 946 0.0487
1.0053 947 0.0411
1.0064 948 0.0385
1.0074 949 0.0414
1.0085 950 0.0369
1.0096 951 0.0381
1.0106 952 0.0358
1.0117 953 0.0455
1.0127 954 0.0414
1.0138 955 0.0327
1.0149 956 0.0492
1.0159 957 0.0552
1.0170 958 0.0399
1.0180 959 0.0442
1.0191 960 0.0398
1.0202 961 0.0418
1.0212 962 0.037
1.0223 963 0.0433
1.0234 964 0.0405
1.0244 965 0.0429
1.0255 966 0.0364
1.0265 967 0.0424
1.0276 968 0.0419
1.0287 969 0.044
1.0297 970 0.0326
1.0308 971 0.0391
1.0318 972 0.0436
1.0329 973 0.0466
1.0340 974 0.0357
1.0350 975 0.0562
1.0361 976 0.0328
1.0372 977 0.0423
1.0382 978 0.0316
1.0393 979 0.0488
1.0403 980 0.0352
1.0414 981 0.0383
1.0425 982 0.0544
1.0435 983 0.0336
1.0446 984 0.0426
1.0456 985 0.0301
1.0467 986 0.048
1.0478 987 0.0398
1.0488 988 0.048
1.0499 989 0.0451
1.0510 990 0.0477
1.0520 991 0.0437
1.0531 992 0.0367
1.0541 993 0.0438
1.0552 994 0.0482
1.0563 995 0.0445
1.0573 996 0.0499
1.0584 997 0.0409
1.0594 998 0.0426
1.0605 999 0.0417
1.0616 1000 0.0498

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.2.1
  • Transformers: 4.44.2
  • PyTorch: 2.3.1+cu121
  • Accelerate: 1.1.1
  • Datasets: 2.21.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}