SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2 on the parquet dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset:
    • parquet

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
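
The stack above amounts to mean pooling over token embeddings followed by L2 normalization. As a minimal sketch, the same computation in plain transformers/PyTorch (assuming the transformer weights load directly with AutoModel, as they do for the base model):

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("yyzheng00/snomed_triplet_1M")
transformer = AutoModel.from_pretrained("yyzheng00/snomed_triplet_1M")

batch = tokenizer(
    ["Silver teal (organism)"],
    padding=True, truncation=True, max_length=256, return_tensors="pt",
)
with torch.no_grad():
    token_embeddings = transformer(**batch).last_hidden_state  # (1, seq_len, 384)

# Pooling module: mean over non-padding tokens (pooling_mode_mean_tokens=True)
mask = batch["attention_mask"].unsqueeze(-1).float()
embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# Normalize module: unit-length vectors, so dot product equals cosine similarity
embedding = F.normalize(embedding, p=2, dim=1)
print(embedding.shape)  # torch.Size([1, 384])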

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("yyzheng00/snomed_triplet_1M")
# Run inference
sentences = [
    '|Estradiol and/or estradiol derivative (substance)| + |Steroid hormone (substance)| + |Substance with estrogen receptor agonist mechanism of action (substance)| : |Has disposition (attribute)| = |Estrogen receptor agonist (disposition)|, ',
    '17-Beta oestradiol (substance)',
    "Rupture of Descemet's membrane of right eye (disorder)",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
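
Because embeddings are unit-normalized and compared with cosine similarity, the model drops straight into semantic search. A small sketch reusing the model loaded above; the corpus strings are illustrative, not drawn from the dataset:

# Rank a small corpus of SNOMED-style labels against a query
query = "Rupture of Descemet's membrane of right eye (disorder)"
corpus = [
    "17-Beta oestradiol (substance)",
    "Silver teal (organism)",
    "Rupture of corneal membrane (disorder)",
]
query_embedding = model.encode([query])
corpus_embeddings = model.encode(corpus)

# model.similarity applies the model's similarity function (cosine)
scores = model.similarity(query_embedding, corpus_embeddings)[0]
best = int(scores.argmax())
print(corpus[best], float(scores[best]))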

Evaluation

Metrics

Triplet

Metric           Value
cosine_accuracy  0.9793

Triplet

Metric           Value
cosine_accuracy  0.9780
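
These numbers are triplet accuracies: the fraction of triplets whose anchor embeds closer to its positive than to its negative. A sketch of how such a score is produced with the library's TripletEvaluator, using one triplet from the training samples below and the model from the usage section (the evaluator name here is made up):

from sentence_transformers.evaluation import TripletEvaluator

evaluator = TripletEvaluator(
    anchors=["Anas versicolor (organism)"],
    positives=["Silver teal (organism)"],
    negatives=["Cryotherapy of gastric lesion (procedure)"],
    name="snomed-dev",  # hypothetical name
)
results = evaluator(model)
print(results)  # e.g. {'snomed-dev_cosine_accuracy': 1.0}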

Training Details

Training Dataset

parquet

  • Dataset: parquet
  • Size: 1,000,000 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
            anchor         positive       negative
    type    string         string         string
    min     7 tokens       6 tokens       6 tokens
    mean    50.47 tokens   14.36 tokens   22.41 tokens
    max     256 tokens     42 tokens      256 tokens
  • Samples:
    anchor                                                 positive                 negative
    Anas versicolor (organism)                             Silver teal (organism)   Cryotherapy of gastric lesion (procedure)
    Vitamin B2 and/or vitamin B2 derivative (substance) :
    Aplasia of distal phalanx of fifth toe (disorder) +
  • Loss: TripletLoss with these parameters:
    {
        "distance_metric": "TripletDistanceMetric.COSINE",
        "triplet_margin": 0.2
    }
    

Evaluation Dataset

parquet

  • Dataset: parquet
  • Size: 1,000,000 evaluation samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
            anchor         positive       negative
    type    string         string         string
    min     6 tokens       6 tokens       6 tokens
    mean    48.58 tokens   14.51 tokens   20.96 tokens
    max     256 tokens     45 tokens      256 tokens
  • Samples:
    anchor                                                               positive   negative
    Genus Roseateles (organism)
    Partial urinary cystectomy (procedure) +
    Product containing integrase strand transfer inhibitor (product) +
  • Loss: TripletLoss with these parameters (see the construction sketch below):
    {
        "distance_metric": "TripletDistanceMetric.COSINE",
        "triplet_margin": 0.2
    }
    
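A minimal sketch of constructing this loss with the parameters listed above, using the library's TripletLoss:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import TripletLoss, TripletDistanceMetric

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
loss = TripletLoss(
    model=model,
    distance_metric=TripletDistanceMetric.COSINE,
    triplet_margin=0.2,
)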

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True
  • batch_sampler: no_duplicates
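
A sketch of how these non-default values map onto the Sentence Transformers 3.x training arguments (output_dir is a placeholder):

from sentence_transformers.training_args import (
    SentenceTransformerTrainingArguments,
    BatchSamplers,
)

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=1,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)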

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss snomed_triplet_1M_3_4_3-dev_cosine_accuracy
0.0027 100 0.0553 0.0405 0.9199
0.0053 200 0.0412 0.0316 0.9369
0.008 300 0.0277 0.0296 0.9405
0.0107 400 0.0303 0.0282 0.9433
0.0133 500 0.0262 0.0275 0.9450
0.016 600 0.0293 0.0266 0.9466
0.0187 700 0.0301 0.0257 0.9480
0.0213 800 0.0262 0.0249 0.9506
0.024 900 0.0258 0.0240 0.9527
0.0267 1000 0.0286 0.0235 0.9537
0.0293 1100 0.0239 0.0229 0.9547
0.032 1200 0.0211 0.0231 0.9548
0.0347 1300 0.0235 0.0228 0.9555
0.0373 1400 0.0257 0.0225 0.9559
0.04 1500 0.025 0.0217 0.9572
0.0427 1600 0.0216 0.0214 0.9581
0.0453 1700 0.0247 0.0214 0.9580
0.048 1800 0.0229 0.0212 0.9588
0.0507 1900 0.0207 0.0211 0.9585
0.0533 2000 0.0224 0.0214 0.9585
0.056 2100 0.0237 0.0209 0.9587
0.0587 2200 0.0205 0.0205 0.9591
0.0613 2300 0.0218 0.0208 0.9590
0.064 2400 0.0209 0.0204 0.9601
0.0667 2500 0.0225 0.0207 0.9591
0.0693 2600 0.021 0.0206 0.9604
0.072 2700 0.0222 0.0197 0.9622
0.0747 2800 0.0214 0.0198 0.9615
0.0773 2900 0.0204 0.0200 0.9611
0.08 3000 0.026 0.0197 0.9622
0.0827 3100 0.0181 0.0197 0.9617
0.0853 3200 0.023 0.0195 0.9612
0.088 3300 0.0198 0.0195 0.9620
0.0907 3400 0.0205 0.0198 0.9611
0.0933 3500 0.0208 0.0194 0.9622
0.096 3600 0.0205 0.0205 0.9592
0.0987 3700 0.0242 0.0196 0.9619
0.1013 3800 0.0178 0.0191 0.9634
0.104 3900 0.0189 0.0189 0.9629
0.1067 4000 0.0249 0.0188 0.9637
0.1093 4100 0.0201 0.0186 0.9634
0.112 4200 0.0198 0.0185 0.9636
0.1147 4300 0.0208 0.0186 0.9639
0.1173 4400 0.019 0.0185 0.9639
0.12 4500 0.0203 0.0188 0.9638
0.1227 4600 0.0205 0.0191 0.9633
0.1253 4700 0.0183 0.0194 0.9623
0.128 4800 0.022 0.0183 0.9643
0.1307 4900 0.0193 0.0182 0.9649
0.1333 5000 0.0192 0.0178 0.9659
0.136 5100 0.0212 0.0185 0.9650
0.1387 5200 0.0181 0.0183 0.9639
0.1413 5300 0.0189 0.0177 0.9656
0.144 5400 0.0209 0.0179 0.9658
0.1467 5500 0.0216 0.0175 0.9665
0.1493 5600 0.0178 0.0176 0.9665
0.152 5700 0.019 0.0178 0.9658
0.1547 5800 0.0215 0.0180 0.9655
0.1573 5900 0.0194 0.0176 0.9663
0.16 6000 0.0182 0.0181 0.9651
0.1627 6100 0.0186 0.0185 0.9640
0.1653 6200 0.019 0.0178 0.9650
0.168 6300 0.019 0.0172 0.9667
0.1707 6400 0.0186 0.0178 0.9654
0.1733 6500 0.0192 0.0172 0.9669
0.176 6600 0.0185 0.0171 0.9670
0.1787 6700 0.019 0.0169 0.9674
0.1813 6800 0.0183 0.0170 0.9671
0.184 6900 0.0199 0.0168 0.9675
0.1867 7000 0.0186 0.0169 0.9673
0.1893 7100 0.016 0.0169 0.9676
0.192 7200 0.0158 0.0174 0.9663
0.1947 7300 0.0205 0.0169 0.9681
0.1973 7400 0.0189 0.0169 0.9669
0.2 7500 0.0188 0.0170 0.9672
0.2027 7600 0.0193 0.0168 0.9674
0.2053 7700 0.0202 0.0168 0.9673
0.208 7800 0.0184 0.0165 0.9676
0.2107 7900 0.0196 0.0162 0.9687
0.2133 8000 0.0186 0.0161 0.9688
0.216 8100 0.0174 0.0166 0.9670
0.2187 8200 0.0178 0.0166 0.9676
0.2213 8300 0.0187 0.0172 0.9664
0.224 8400 0.0175 0.0162 0.9685
0.2267 8500 0.0165 0.0163 0.9674
0.2293 8600 0.018 0.0164 0.9678
0.232 8700 0.0192 0.0165 0.9680
0.2347 8800 0.0182 0.0164 0.9680
0.2373 8900 0.0191 0.0162 0.9689
0.24 9000 0.0173 0.0161 0.9683
0.2427 9100 0.022 0.0159 0.9685
0.2453 9200 0.0182 0.0161 0.9685
0.248 9300 0.0174 0.0165 0.9684
0.2507 9400 0.0181 0.0168 0.9667
0.2533 9500 0.0159 0.0163 0.9684
0.256 9600 0.0176 0.0162 0.9685
0.2587 9700 0.0155 0.0170 0.9668
0.2613 9800 0.0183 0.0162 0.9679
0.264 9900 0.0183 0.0156 0.9693
0.2667 10000 0.019 0.0156 0.9695
0.2693 10100 0.0167 0.0162 0.9683
0.272 10200 0.0202 0.0156 0.9695
0.2747 10300 0.0174 0.0157 0.9694
0.2773 10400 0.0165 0.0155 0.9694
0.28 10500 0.0176 0.0155 0.9700
0.2827 10600 0.0181 0.0153 0.9699
0.2853 10700 0.0184 0.0154 0.9697
0.288 10800 0.0172 0.0155 0.9692
0.2907 10900 0.0153 0.0156 0.9694
0.2933 11000 0.0169 0.0154 0.9700
0.296 11100 0.0181 0.0153 0.9698
0.2987 11200 0.0164 0.0154 0.9700
0.3013 11300 0.0177 0.0158 0.9691
0.304 11400 0.0154 0.0153 0.9700
0.3067 11500 0.0159 0.0153 0.9700
0.3093 11600 0.0162 0.0152 0.9699
0.312 11700 0.0172 0.0150 0.9710
0.3147 11800 0.0151 0.0153 0.9696
0.3173 11900 0.0157 0.0153 0.9697
0.32 12000 0.0145 0.0150 0.9705
0.3227 12100 0.0184 0.0153 0.9701
0.3253 12200 0.0173 0.0151 0.9706
0.328 12300 0.0158 0.0151 0.971
0.3307 12400 0.0154 0.0154 0.9697
0.3333 12500 0.0126 0.0153 0.9697
0.336 12600 0.0151 0.0150 0.9704
0.3387 12700 0.0152 0.0152 0.9698
0.3413 12800 0.0176 0.0150 0.9707
0.344 12900 0.0172 0.0149 0.9705
0.3467 13000 0.0149 0.0151 0.9704
0.3493 13100 0.0154 0.0151 0.9701
0.352 13200 0.0138 0.0148 0.9705
0.3547 13300 0.0195 0.0149 0.9705
0.3573 13400 0.0162 0.0151 0.9707
0.36 13500 0.0137 0.0150 0.9708
0.3627 13600 0.0153 0.0151 0.9704
0.3653 13700 0.0143 0.0150 0.9705
0.368 13800 0.0161 0.0149 0.9709
0.3707 13900 0.0136 0.0149 0.9712
0.3733 14000 0.0161 0.0150 0.9709
0.376 14100 0.0171 0.0148 0.9718
0.3787 14200 0.0168 0.0147 0.9717
0.3813 14300 0.0159 0.0147 0.9718
0.384 14400 0.0167 0.0145 0.9721
0.3867 14500 0.0158 0.0147 0.9715
0.3893 14600 0.0153 0.0146 0.9713
0.392 14700 0.0131 0.0145 0.9717
0.3947 14800 0.0166 0.0144 0.9722
0.3973 14900 0.0164 0.0142 0.9720
0.4 15000 0.0166 0.0143 0.9720
0.4027 15100 0.0168 0.0143 0.9726
0.4053 15200 0.0145 0.0143 0.9723
0.408 15300 0.0149 0.0144 0.9717
0.4107 15400 0.0152 0.0141 0.9729
0.4133 15500 0.0147 0.0140 0.9734
0.416 15600 0.0141 0.0140 0.9731
0.4187 15700 0.0147 0.0140 0.9731
0.4213 15800 0.0158 0.0139 0.9734
0.424 15900 0.0177 0.0141 0.9728
0.4267 16000 0.0151 0.0137 0.9734
0.4293 16100 0.0148 0.0145 0.9724
0.432 16200 0.0135 0.0144 0.9721
0.4347 16300 0.0167 0.0138 0.9736
0.4373 16400 0.0153 0.0138 0.9739
0.44 16500 0.014 0.0139 0.9731
0.4427 16600 0.0168 0.0139 0.9734
0.4453 16700 0.0125 0.0139 0.9734
0.448 16800 0.0163 0.0139 0.9733
0.4507 16900 0.0179 0.0137 0.9742
0.4533 17000 0.0162 0.0136 0.9738
0.456 17100 0.0148 0.0137 0.9734
0.4587 17200 0.0154 0.0137 0.9737
0.4613 17300 0.0178 0.0139 0.9732
0.464 17400 0.0176 0.0138 0.9731
0.4667 17500 0.012 0.0135 0.9738
0.4693 17600 0.0136 0.0137 0.9731
0.472 17700 0.0156 0.0133 0.9740
0.4747 17800 0.0151 0.0136 0.9738
0.4773 17900 0.0145 0.0135 0.9741
0.48 18000 0.0176 0.0136 0.9735
0.4827 18100 0.0143 0.0133 0.9744
0.4853 18200 0.0144 0.0133 0.9742
0.488 18300 0.0139 0.0135 0.9738
0.4907 18400 0.0134 0.0134 0.9740
0.4933 18500 0.0135 0.0134 0.9738
0.496 18600 0.0144 0.0134 0.9738
0.4987 18700 0.0143 0.0135 0.9744
0.5013 18800 0.0165 0.0133 0.9748
0.504 18900 0.0147 0.0133 0.9742
0.5067 19000 0.0159 0.0133 0.9743
0.5093 19100 0.013 0.0132 0.9746
0.512 19200 0.0145 0.0133 0.9744
0.5147 19300 0.0147 0.0134 0.9743
0.5173 19400 0.0151 0.0131 0.9748
0.52 19500 0.0134 0.0132 0.9742
0.5227 19600 0.0148 0.0135 0.9740
0.5253 19700 0.0142 0.0134 0.9744
0.528 19800 0.0158 0.0132 0.9746
0.5307 19900 0.015 0.0134 0.9748
0.5333 20000 0.0146 0.0132 0.9745
0.536 20100 0.0136 0.0130 0.9752
0.5387 20200 0.0142 0.0131 0.9750
0.5413 20300 0.0137 0.0130 0.9749
0.544 20400 0.0118 0.0132 0.9741
0.5467 20500 0.0129 0.0131 0.9750
0.5493 20600 0.015 0.0131 0.9749
0.552 20700 0.0154 0.0132 0.9743
0.5547 20800 0.0165 0.0132 0.9747
0.5573 20900 0.0158 0.0131 0.9751
0.56 21000 0.014 0.0130 0.9746
0.5627 21100 0.0157 0.0129 0.9755
0.5653 21200 0.014 0.0129 0.9754
0.568 21300 0.0149 0.0129 0.9751
0.5707 21400 0.0114 0.0129 0.9754
0.5733 21500 0.0116 0.0128 0.9755
0.576 21600 0.0114 0.0132 0.9743
0.5787 21700 0.0164 0.0127 0.9759
0.5813 21800 0.0137 0.0127 0.9754
0.584 21900 0.0118 0.0129 0.9745
0.5867 22000 0.0126 0.0129 0.9752
0.5893 22100 0.0153 0.0126 0.9758
0.592 22200 0.0128 0.0126 0.9759
0.5947 22300 0.0161 0.0128 0.9755
0.5973 22400 0.0121 0.0128 0.9754
0.6 22500 0.0144 0.0126 0.9758
0.6027 22600 0.0138 0.0127 0.9754
0.6053 22700 0.0114 0.0125 0.9757
0.608 22800 0.0163 0.0126 0.9755
0.6107 22900 0.0127 0.0125 0.9757
0.6133 23000 0.0139 0.0126 0.9752
0.616 23100 0.015 0.0126 0.9754
0.6187 23200 0.0128 0.0124 0.9759
0.6213 23300 0.0127 0.0126 0.9758
0.624 23400 0.0137 0.0126 0.9755
0.6267 23500 0.0171 0.0125 0.9760
0.6293 23600 0.0154 0.0123 0.9761
0.632 23700 0.0133 0.0125 0.9757
0.6347 23800 0.0147 0.0122 0.9762
0.6373 23900 0.012 0.0123 0.9759
0.64 24000 0.0121 0.0124 0.9762
0.6427 24100 0.0156 0.0122 0.9768
0.6453 24200 0.0135 0.0122 0.9763
0.648 24300 0.0111 0.0123 0.9762
0.6507 24400 0.0131 0.0121 0.9766
0.6533 24500 0.0166 0.0120 0.9766
0.656 24600 0.0145 0.0121 0.9764
0.6587 24700 0.0138 0.0122 0.9763
0.6613 24800 0.0127 0.0120 0.9766
0.664 24900 0.0142 0.0120 0.9767
0.6667 25000 0.0119 0.0122 0.9764
0.6693 25100 0.0157 0.0120 0.9768
0.672 25200 0.0126 0.0119 0.9769
0.6747 25300 0.0113 0.0119 0.9772
0.6773 25400 0.0138 0.0121 0.9767
0.68 25500 0.0135 0.0124 0.9759
0.6827 25600 0.0147 0.0120 0.9765
0.6853 25700 0.0119 0.0120 0.9764
0.688 25800 0.0167 0.0120 0.9765
0.6907 25900 0.0132 0.0120 0.9767
0.6933 26000 0.0144 0.0118 0.9768
0.696 26100 0.0135 0.0118 0.9771
0.6987 26200 0.0156 0.0119 0.9769
0.7013 26300 0.0132 0.0119 0.9769
0.704 26400 0.0139 0.0120 0.9769
0.7067 26500 0.014 0.0118 0.9771
0.7093 26600 0.0133 0.0118 0.9770
0.712 26700 0.0142 0.0118 0.9773
0.7147 26800 0.0113 0.0117 0.977
0.7173 26900 0.0142 0.0117 0.977
0.72 27000 0.0112 0.0117 0.9771
0.7227 27100 0.012 0.0118 0.9768
0.7253 27200 0.0135 0.0117 0.9768
0.728 27300 0.0126 0.0116 0.9769
0.7307 27400 0.0136 0.0117 0.9767
0.7333 27500 0.013 0.0116 0.9770
0.736 27600 0.0131 0.0117 0.9767
0.7387 27700 0.0127 0.0116 0.9772
0.7413 27800 0.0124 0.0116 0.9770
0.744 27900 0.011 0.0116 0.9771
0.7467 28000 0.0159 0.0116 0.9770
0.7493 28100 0.0118 0.0116 0.9770
0.752 28200 0.0146 0.0115 0.9773
0.7547 28300 0.0112 0.0116 0.9772
0.7573 28400 0.0116 0.0115 0.9776
0.76 28500 0.0115 0.0115 0.9775
0.7627 28600 0.0137 0.0115 0.9779
0.7653 28700 0.0106 0.0115 0.9777
0.768 28800 0.011 0.0116 0.9774
0.7707 28900 0.0132 0.0115 0.9774
0.7733 29000 0.0119 0.0114 0.9776
0.776 29100 0.0121 0.0114 0.9779
0.7787 29200 0.0136 0.0113 0.9780
0.7813 29300 0.0114 0.0114 0.9779
0.784 29400 0.0122 0.0115 0.9778
0.7867 29500 0.0117 0.0114 0.9780
0.7893 29600 0.0119 0.0114 0.9778
0.792 29700 0.0145 0.0114 0.9778
0.7947 29800 0.0098 0.0113 0.9779
0.7973 29900 0.015 0.0114 0.9777
0.8 30000 0.0123 0.0113 0.9779
0.8027 30100 0.0111 0.0115 0.9774
0.8053 30200 0.0126 0.0114 0.9778
0.808 30300 0.0131 0.0113 0.9783
0.8107 30400 0.0131 0.0113 0.9784
0.8133 30500 0.0113 0.0113 0.9783
0.816 30600 0.0131 0.0113 0.9783
0.8187 30700 0.0137 0.0113 0.9782
0.8213 30800 0.0119 0.0112 0.9784
0.824 30900 0.0127 0.0113 0.9782
0.8267 31000 0.0114 0.0112 0.9787
0.8293 31100 0.0116 0.0111 0.9784
0.832 31200 0.0117 0.0112 0.9784
0.8347 31300 0.0128 0.0112 0.9782
0.8373 31400 0.0125 0.0112 0.9782
0.84 31500 0.0136 0.0111 0.9787
0.8427 31600 0.0121 0.0111 0.9785
0.8453 31700 0.0137 0.0112 0.9785
0.848 31800 0.0115 0.0111 0.9786
0.8507 31900 0.0111 0.0111 0.9784
0.8533 32000 0.012 0.0111 0.9786
0.856 32100 0.0115 0.0111 0.9787
0.8587 32200 0.0125 0.0111 0.9785
0.8613 32300 0.0111 0.0111 0.9788
0.864 32400 0.0127 0.0111 0.9788
0.8667 32500 0.0126 0.0110 0.9788
0.8693 32600 0.012 0.0111 0.9788
0.872 32700 0.0117 0.0111 0.9787
0.8747 32800 0.0136 0.0110 0.9787
0.8773 32900 0.0118 0.0110 0.9788
0.88 33000 0.015 0.0110 0.9789
0.8827 33100 0.0105 0.0110 0.9788
0.8853 33200 0.0135 0.0110 0.9786
0.888 33300 0.0099 0.0110 0.9790
0.8907 33400 0.013 0.0109 0.9787
0.8933 33500 0.0149 0.0109 0.9788
0.896 33600 0.012 0.0109 0.9789
0.8987 33700 0.01 0.0110 0.9788
0.9013 33800 0.0132 0.0110 0.9788
0.904 33900 0.0138 0.0109 0.9791
0.9067 34000 0.0107 0.0109 0.9789
0.9093 34100 0.0133 0.0109 0.9789
0.912 34200 0.0124 0.0109 0.9788
0.9147 34300 0.0119 0.0109 0.9788
0.9173 34400 0.0101 0.0109 0.9787
0.92 34500 0.0135 0.0109 0.9790
0.9227 34600 0.0116 0.0109 0.9789
0.9253 34700 0.0116 0.0109 0.9791
0.928 34800 0.0082 0.0108 0.9791
0.9307 34900 0.0129 0.0108 0.9791
0.9333 35000 0.0129 0.0108 0.9792
0.936 35100 0.0147 0.0108 0.9791
0.9387 35200 0.0112 0.0108 0.9790
0.9413 35300 0.0108 0.0108 0.9790
0.944 35400 0.0114 0.0108 0.9791
0.9467 35500 0.0096 0.0108 0.9792
0.9493 35600 0.0111 0.0108 0.9790
0.952 35700 0.0131 0.0108 0.9790
0.9547 35800 0.0147 0.0108 0.9792
0.9573 35900 0.0121 0.0108 0.9792
0.96 36000 0.0105 0.0108 0.9791
0.9627 36100 0.0081 0.0108 0.9791
0.9653 36200 0.013 0.0108 0.9791
0.968 36300 0.0121 0.0108 0.9792
0.9707 36400 0.0122 0.0108 0.9792
0.9733 36500 0.0121 0.0108 0.9792
0.976 36600 0.011 0.0108 0.9792
0.9787 36700 0.0109 0.0107 0.9792
0.9813 36800 0.0114 0.0107 0.9792
0.984 36900 0.0113 0.0107 0.9793
0.9867 37000 0.0111 0.0107 0.9794
0.9893 37100 0.0097 0.0107 0.9793
0.992 37200 0.0127 0.0107 0.9793
0.9947 37300 0.0143 0.0107 0.9794
0.9973 37400 0.0103 0.0107 0.9794
1.0 37500 0.014 0.0107 0.9780

Framework Versions

  • Python: 3.11.1
  • Sentence Transformers: 3.3.1
  • Transformers: 4.47.0
  • PyTorch: 2.1.1+cu121
  • Accelerate: 1.2.0
  • Datasets: 2.18.0
  • Tokenizers: 0.21.0
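
To approximate this environment, the listed versions can be pinned (a CUDA build of PyTorch 2.1.1 may additionally require the matching extra index URL):

pip install sentence-transformers==3.3.1 transformers==4.47.0 accelerate==1.2.0 datasets==2.18.0 tokenizers==0.21.0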

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

TripletLoss

@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}