SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2 on the parquet dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset:
    • parquet

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Model on Hugging Face: https://huggingface.co/yyzheng00/snomed_triplet_800k

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
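
For reference, the same three-module stack can be assembled by hand with the sentence_transformers.models API; a minimal sketch mirroring the configuration printed above:

from sentence_transformers import SentenceTransformer, models

# Transformer backbone: BERT-based MiniLM, truncating inputs at 256 tokens
word_embedding = models.Transformer(
    "sentence-transformers/all-MiniLM-L6-v2", max_seq_length=256
)
# Mean pooling over the 384-dimensional token embeddings
pooling = models.Pooling(
    word_embedding.get_word_embedding_dimension(),  # 384
    pooling_mode="mean",
)
# L2-normalization, so dot product and cosine similarity coincide
model = SentenceTransformer(modules=[word_embedding, pooling, models.Normalize()])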

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("yyzheng00/snomed_triplet_800k")
# Run inference
sentences = [
    '|Adverse reaction caused by drug| : { |Causative agent| = |Digestant| }',
    'Adverse reaction caused by digestant (disorder)',
    'Ureteroscopic division of stricture of ureter (procedure)',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
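
Because the embeddings are L2-normalized, the model also works directly for semantic search. A minimal sketch using util.semantic_search against the embeddings computed above (the query string here is a made-up example):

from sentence_transformers import util

# Rank the corpus embeddings from the snippet above against a free-text query
query_emb = model.encode("adverse reaction to digestant")
hits = util.semantic_search(query_emb, embeddings, top_k=2)
print(hits[0])  # list of {'corpus_id': ..., 'score': ...}, best match first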

Evaluation

Metrics

Triplet

Evaluated with TripletEvaluator

Metric          Value
cosine_accuracy 0.9983

Triplet

Evaluated with TripletEvaluator

Metric          Value
cosine_accuracy 0.9984
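
These accuracies measure the fraction of triplets where the anchor is closer to the positive than to the negative. A minimal sketch of the evaluation, assuming anchors, positives, and negatives are lists of strings from a held-out split (hypothetical variable names; the card does not publish the split itself):

from sentence_transformers.evaluation import TripletEvaluator

# `anchors`, `positives`, `negatives`: hypothetical lists of strings
evaluator = TripletEvaluator(anchors=anchors, positives=positives, negatives=negatives)
results = evaluator(model)
print(results)  # dict of accuracies, including a cosine accuracy entry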

Training Details

Training Dataset

parquet

  • Dataset: parquet
  • Size: 800,000 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:

                 anchor         positive       negative
    type         string         string         string
    min tokens   19             4              4
    mean tokens  59.07          12.65          12.57
    max tokens   256            44             51
  • Samples:
    anchor positive negative
    Product containing lercanidipine +
    Product containing trovafloxacin +
    Product containing carboxylic acid and/or carboxylic acid derivative +
  • Loss: TripletLoss with these parameters:
    {
        "distance_metric": "TripletDistanceMetric.COSINE",
        "triplet_margin": 0.2
    }
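
The TripletLoss configuration listed above can be instantiated directly; a minimal sketch, assuming model is the SentenceTransformer being trained:

from sentence_transformers.losses import TripletLoss, TripletDistanceMetric

# Cosine distance with a 0.2 margin, as configured above
loss = TripletLoss(
    model=model,
    distance_metric=TripletDistanceMetric.COSINE,
    triplet_margin=0.2,
)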
    

Evaluation Dataset

parquet

  • Dataset: parquet
  • Size: 800,000 evaluation samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:

                 anchor         positive       negative
    type         string         string         string
    min tokens   19             4              3
    mean tokens  58.9           12.41          12.43
    max tokens   253            49             38
  • Samples:
    anchor positive negative
    Hodgkin lymphoma, nodular lymphocyte predominance +
    Product containing bexarotene +
    Disease caused by Haemogregarinidae : {
  • Loss: TripletLoss with these parameters:
    {
        "distance_metric": "TripletDistanceMetric.COSINE",
        "triplet_margin": 0.2
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True
  • batch_sampler: no_duplicates
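
A hedged end-to-end sketch wiring these non-default values into SentenceTransformerTrainer; the parquet file names are assumptions, since the card only states that a parquet dataset with anchor/positive/negative columns was used:

from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import TripletLoss, TripletDistanceMetric
from sentence_transformers.training_args import (
    SentenceTransformerTrainingArguments,
    BatchSamplers,
)

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Hypothetical file names; the actual parquet files are not published in the card
data = load_dataset("parquet", data_files={"train": "train.parquet", "eval": "eval.parquet"})

args = SentenceTransformerTrainingArguments(
    output_dir="snomed_triplet_800k",
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=1,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=data["train"],
    eval_dataset=data["eval"],
    loss=TripletLoss(model, distance_metric=TripletDistanceMetric.COSINE, triplet_margin=0.2),
)
trainer.train()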

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss snomed_triplet_800k_3_4_3-dev_cosine_accuracy
0.0033 100 0.0207 0.0115 0.9820
0.0067 200 0.0121 0.0093 0.9852
0.01 300 0.013 0.0082 0.9867
0.0133 400 0.0086 0.0076 0.9879
0.0167 500 0.0078 0.0072 0.9886
0.02 600 0.0084 0.0068 0.9893
0.0233 700 0.0066 0.0064 0.9901
0.0267 800 0.0077 0.0059 0.9905
0.03 900 0.0061 0.0059 0.9903
0.0333 1000 0.0069 0.0059 0.9906
0.0367 1100 0.0063 0.0055 0.9911
0.04 1200 0.0054 0.0055 0.9913
0.0433 1300 0.0062 0.0054 0.9917
0.0467 1400 0.0055 0.0053 0.9915
0.05 1500 0.0064 0.0051 0.9926
0.0533 1600 0.0062 0.0050 0.9928
0.0567 1700 0.0055 0.0047 0.9932
0.06 1800 0.0048 0.0052 0.9918
0.0633 1900 0.0056 0.0050 0.9929
0.0667 2000 0.006 0.0051 0.9926
0.07 2100 0.0061 0.0046 0.9932
0.0733 2200 0.0075 0.0045 0.9936
0.0767 2300 0.0055 0.0049 0.9923
0.08 2400 0.0043 0.0046 0.9935
0.0833 2500 0.006 0.0049 0.9923
0.0867 2600 0.0052 0.0048 0.9928
0.09 2700 0.006 0.0047 0.9927
0.0933 2800 0.0062 0.0042 0.9938
0.0967 2900 0.0056 0.0043 0.9942
0.1 3000 0.0049 0.0046 0.9929
0.1033 3100 0.0037 0.0043 0.9935
0.1067 3200 0.0061 0.0045 0.9929
0.11 3300 0.0045 0.0043 0.9934
0.1133 3400 0.0047 0.0047 0.9925
0.1167 3500 0.0068 0.0043 0.9938
0.12 3600 0.0054 0.0041 0.9935
0.1233 3700 0.0047 0.0041 0.9935
0.1267 3800 0.0062 0.0041 0.9938
0.13 3900 0.0029 0.0043 0.9935
0.1333 4000 0.0045 0.0039 0.9939
0.1367 4100 0.0046 0.0039 0.9943
0.14 4200 0.0048 0.0046 0.9928
0.1433 4300 0.0045 0.0043 0.9940
0.1467 4400 0.0058 0.0043 0.9935
0.15 4500 0.0045 0.0043 0.9930
0.1533 4600 0.0046 0.0040 0.9940
0.1567 4700 0.0049 0.0039 0.9944
0.16 4800 0.0047 0.0039 0.9940
0.1633 4900 0.0053 0.0047 0.9938
0.1667 5000 0.0042 0.0041 0.9934
0.17 5100 0.004 0.0039 0.9939
0.1733 5200 0.0037 0.0036 0.9944
0.1767 5300 0.005 0.0037 0.9942
0.18 5400 0.005 0.0035 0.9947
0.1833 5500 0.0047 0.0037 0.9944
0.1867 5600 0.0047 0.0035 0.9947
0.19 5700 0.0046 0.0037 0.9942
0.1933 5800 0.005 0.0036 0.9944
0.1967 5900 0.0052 0.0037 0.9945
0.2 6000 0.0044 0.0035 0.9952
0.2033 6100 0.0043 0.0034 0.9952
0.2067 6200 0.0046 0.0035 0.9947
0.21 6300 0.0059 0.0034 0.9948
0.2133 6400 0.0051 0.0035 0.9948
0.2167 6500 0.0032 0.0032 0.9950
0.22 6600 0.0031 0.0032 0.9951
0.2233 6700 0.003 0.0033 0.9951
0.2267 6800 0.0048 0.0034 0.9950
0.23 6900 0.0028 0.0038 0.9940
0.2333 7000 0.0035 0.0032 0.9951
0.2367 7100 0.0032 0.0032 0.9956
0.24 7200 0.0039 0.0032 0.9952
0.2433 7300 0.0046 0.0032 0.9951
0.2467 7400 0.0042 0.0033 0.9949
0.25 7500 0.0041 0.0032 0.9955
0.2533 7600 0.0043 0.0032 0.9954
0.2567 7700 0.0054 0.0031 0.9957
0.26 7800 0.0037 0.0033 0.9952
0.2633 7900 0.0042 0.0035 0.9948
0.2667 8000 0.0038 0.0031 0.9956
0.27 8100 0.0036 0.0031 0.9954
0.2733 8200 0.0037 0.0031 0.9956
0.2767 8300 0.0043 0.0030 0.9952
0.28 8400 0.004 0.0030 0.9957
0.2833 8500 0.0024 0.0030 0.9955
0.2867 8600 0.0032 0.0028 0.9959
0.29 8700 0.0045 0.0029 0.9957
0.2933 8800 0.0038 0.0030 0.9955
0.2967 8900 0.0057 0.0027 0.9960
0.3 9000 0.0043 0.0029 0.9957
0.3033 9100 0.0041 0.0029 0.9960
0.3067 9200 0.0028 0.0027 0.9962
0.31 9300 0.0025 0.0026 0.9962
0.3133 9400 0.0031 0.0029 0.9956
0.3167 9500 0.0041 0.0033 0.9949
0.32 9600 0.003 0.0028 0.9962
0.3233 9700 0.0029 0.0028 0.9960
0.3267 9800 0.0029 0.0029 0.9957
0.33 9900 0.0027 0.0028 0.9960
0.3333 10000 0.0036 0.0028 0.9963
0.3367 10100 0.0039 0.0027 0.9962
0.34 10200 0.0038 0.0029 0.9958
0.3433 10300 0.0047 0.0026 0.9962
0.3467 10400 0.0045 0.0027 0.9962
0.35 10500 0.0023 0.0026 0.9962
0.3533 10600 0.0035 0.0027 0.9963
0.3567 10700 0.0028 0.0028 0.9960
0.36 10800 0.0025 0.0027 0.9963
0.3633 10900 0.0035 0.0029 0.9957
0.3667 11000 0.0028 0.0028 0.9962
0.37 11100 0.0045 0.0027 0.9962
0.3733 11200 0.0032 0.0026 0.9965
0.3767 11300 0.0035 0.0026 0.9962
0.38 11400 0.005 0.0025 0.9965
0.3833 11500 0.0025 0.0025 0.9965
0.3867 11600 0.0034 0.0026 0.9963
0.39 11700 0.0035 0.0026 0.9963
0.3933 11800 0.0024 0.0029 0.9956
0.3967 11900 0.0034 0.0025 0.9965
0.4 12000 0.0036 0.0024 0.9968
0.4033 12100 0.003 0.0025 0.9968
0.4067 12200 0.0029 0.0025 0.9964
0.41 12300 0.0036 0.0025 0.9965
0.4133 12400 0.0016 0.0024 0.9966
0.4167 12500 0.0029 0.0025 0.9965
0.42 12600 0.0037 0.0024 0.9969
0.4233 12700 0.0025 0.0023 0.9968
0.4267 12800 0.0039 0.0022 0.9972
0.43 12900 0.0024 0.0022 0.9972
0.4333 13000 0.0038 0.0023 0.9968
0.4367 13100 0.0034 0.0022 0.9969
0.44 13200 0.0024 0.0023 0.9967
0.4433 13300 0.0026 0.0025 0.9964
0.4467 13400 0.0028 0.0024 0.9966
0.45 13500 0.0036 0.0024 0.9965
0.4533 13600 0.0025 0.0024 0.9965
0.4567 13700 0.0035 0.0024 0.9967
0.46 13800 0.0018 0.0023 0.9966
0.4633 13900 0.0028 0.0023 0.9968
0.4667 14000 0.0033 0.0022 0.9970
0.47 14100 0.0018 0.0022 0.9970
0.4733 14200 0.003 0.0021 0.9971
0.4767 14300 0.0021 0.0021 0.9971
0.48 14400 0.0029 0.0021 0.9971
0.4833 14500 0.0027 0.0023 0.9969
0.4867 14600 0.0023 0.0021 0.9971
0.49 14700 0.0026 0.0021 0.9972
0.4933 14800 0.0019 0.0021 0.9969
0.4967 14900 0.0024 0.0022 0.9968
0.5 15000 0.0025 0.0021 0.9968
0.5033 15100 0.0026 0.0021 0.9970
0.5067 15200 0.0019 0.0022 0.997
0.51 15300 0.0029 0.0023 0.9968
0.5133 15400 0.0026 0.0021 0.9970
0.5167 15500 0.0027 0.0021 0.9969
0.52 15600 0.0022 0.0022 0.9971
0.5233 15700 0.0026 0.0020 0.9973
0.5267 15800 0.0026 0.0021 0.9973
0.53 15900 0.0022 0.0020 0.9974
0.5333 16000 0.0039 0.0020 0.9975
0.5367 16100 0.0017 0.0020 0.9975
0.54 16200 0.0022 0.0020 0.9975
0.5433 16300 0.002 0.0019 0.9974
0.5467 16400 0.0033 0.0019 0.9975
0.55 16500 0.0032 0.0019 0.9974
0.5533 16600 0.0019 0.0020 0.9975
0.5567 16700 0.0027 0.0019 0.9974
0.56 16800 0.0027 0.0019 0.9973
0.5633 16900 0.0023 0.0018 0.9976
0.5667 17000 0.002 0.0018 0.9976
0.57 17100 0.0024 0.0019 0.9975
0.5733 17200 0.0021 0.0020 0.9973
0.5767 17300 0.0038 0.0019 0.9973
0.58 17400 0.0018 0.0018 0.9975
0.5833 17500 0.0031 0.0018 0.9977
0.5867 17600 0.0021 0.0018 0.9976
0.59 17700 0.0023 0.0019 0.9974
0.5933 17800 0.0031 0.0018 0.9975
0.5967 17900 0.002 0.0019 0.9975
0.6 18000 0.002 0.0018 0.9975
0.6033 18100 0.003 0.0019 0.9976
0.6067 18200 0.0023 0.0018 0.9977
0.61 18300 0.0029 0.0019 0.9975
0.6133 18400 0.0023 0.0018 0.9977
0.6167 18500 0.0017 0.0018 0.9977
0.62 18600 0.0022 0.0018 0.9977
0.6233 18700 0.0023 0.0018 0.9977
0.6267 18800 0.0021 0.0017 0.9978
0.63 18900 0.002 0.0017 0.9978
0.6333 19000 0.0028 0.0018 0.9978
0.6367 19100 0.0024 0.0017 0.9978
0.64 19200 0.0029 0.0017 0.9977
0.6433 19300 0.003 0.0017 0.9979
0.6467 19400 0.0027 0.0017 0.9978
0.65 19500 0.0032 0.0017 0.9978
0.6533 19600 0.0025 0.0017 0.9978
0.6567 19700 0.002 0.0017 0.9978
0.66 19800 0.0018 0.0017 0.9978
0.6633 19900 0.002 0.0018 0.9975
0.6667 20000 0.0029 0.0017 0.9978
0.67 20100 0.0012 0.0017 0.9978
0.6733 20200 0.0024 0.0016 0.9980
0.6767 20300 0.0027 0.0016 0.9980
0.68 20400 0.0023 0.0016 0.9980
0.6833 20500 0.0025 0.0016 0.9981
0.6867 20600 0.0018 0.0016 0.9981
0.69 20700 0.0015 0.0016 0.9981
0.6933 20800 0.0017 0.0015 0.9980
0.6967 20900 0.0027 0.0015 0.9981
0.7 21000 0.0016 0.0015 0.9982
0.7033 21100 0.002 0.0015 0.9983
0.7067 21200 0.002 0.0015 0.9983
0.71 21300 0.0022 0.0015 0.9984
0.7133 21400 0.0019 0.0015 0.9983
0.7167 21500 0.0025 0.0015 0.9981
0.72 21600 0.0025 0.0015 0.9981
0.7233 21700 0.0021 0.0015 0.9982
0.7267 21800 0.002 0.0015 0.9981
0.73 21900 0.0025 0.0015 0.9982
0.7333 22000 0.0021 0.0015 0.9982
0.7367 22100 0.0017 0.0015 0.9983
0.74 22200 0.0021 0.0015 0.9982
0.7433 22300 0.0026 0.0015 0.9981
0.7467 22400 0.0016 0.0015 0.9981
0.75 22500 0.0021 0.0014 0.9981
0.7533 22600 0.002 0.0015 0.9981
0.7567 22700 0.002 0.0014 0.9981
0.76 22800 0.0025 0.0014 0.9982
0.7633 22900 0.0022 0.0015 0.998
0.7667 23000 0.0022 0.0014 0.9981
0.77 23100 0.0017 0.0014 0.9982
0.7733 23200 0.0024 0.0014 0.9983
0.7767 23300 0.0021 0.0014 0.9981
0.78 23400 0.0018 0.0014 0.9982
0.7833 23500 0.0025 0.0014 0.9981
0.7867 23600 0.0025 0.0014 0.9981
0.79 23700 0.0015 0.0014 0.9981
0.7933 23800 0.0023 0.0014 0.9982
0.7967 23900 0.0028 0.0014 0.9981
0.8 24000 0.0022 0.0014 0.9981
0.8033 24100 0.0019 0.0014 0.9983
0.8067 24200 0.0021 0.0014 0.9982
0.81 24300 0.002 0.0013 0.9982
0.8133 24400 0.0015 0.0013 0.9982
0.8167 24500 0.0021 0.0013 0.9984
0.82 24600 0.0016 0.0013 0.9983
0.8233 24700 0.0016 0.0013 0.9983
0.8267 24800 0.0016 0.0014 0.9982
0.83 24900 0.0016 0.0013 0.9983
0.8333 25000 0.0012 0.0013 0.9982
0.8367 25100 0.0019 0.0013 0.9983
0.84 25200 0.0014 0.0013 0.9983
0.8433 25300 0.0024 0.0013 0.9983
0.8467 25400 0.0014 0.0013 0.9983
0.85 25500 0.0013 0.0013 0.9983
0.8533 25600 0.0017 0.0014 0.9983
0.8567 25700 0.0019 0.0014 0.9981
0.86 25800 0.003 0.0013 0.9983
0.8633 25900 0.0012 0.0013 0.9983
0.8667 26000 0.0023 0.0013 0.9983
0.87 26100 0.0017 0.0013 0.9983
0.8733 26200 0.0017 0.0013 0.9982
0.8767 26300 0.002 0.0013 0.9983
0.88 26400 0.0017 0.0013 0.9983
0.8833 26500 0.0017 0.0013 0.9982
0.8867 26600 0.0017 0.0013 0.9983
0.89 26700 0.0005 0.0013 0.9984
0.8933 26800 0.0014 0.0013 0.9983
0.8967 26900 0.0018 0.0013 0.9983
0.9 27000 0.0011 0.0013 0.9983
0.9033 27100 0.0012 0.0013 0.9983
0.9067 27200 0.0012 0.0013 0.9983
0.91 27300 0.0015 0.0013 0.9982
0.9133 27400 0.0015 0.0013 0.9983
0.9167 27500 0.0016 0.0013 0.9983
0.92 27600 0.0015 0.0012 0.9984
0.9233 27700 0.0015 0.0013 0.9984
0.9267 27800 0.0013 0.0012 0.9984
0.93 27900 0.0021 0.0012 0.9984
0.9333 28000 0.0008 0.0013 0.9984
0.9367 28100 0.002 0.0013 0.9983
0.94 28200 0.0024 0.0012 0.9984
0.9433 28300 0.0018 0.0012 0.9984
0.9467 28400 0.001 0.0012 0.9984
0.95 28500 0.001 0.0012 0.9983
0.9533 28600 0.002 0.0012 0.9984
0.9567 28700 0.0019 0.0012 0.9984
0.96 28800 0.0012 0.0012 0.9984
0.9633 28900 0.0017 0.0012 0.9984
0.9667 29000 0.0018 0.0012 0.9984
0.97 29100 0.0015 0.0012 0.9984
0.9733 29200 0.0012 0.0012 0.9983
0.9767 29300 0.0021 0.0012 0.9984
0.98 29400 0.0015 0.0012 0.9983
0.9833 29500 0.0013 0.0012 0.9983
0.9867 29600 0.0012 0.0012 0.9983
0.99 29700 0.0017 0.0012 0.9983
0.9933 29800 0.0016 0.0012 0.9983
0.9967 29900 0.0011 0.0012 0.9983
1.0 30000 0.0017 0.0012 0.9984

Framework Versions

  • Python: 3.11.1
  • Sentence Transformers: 3.3.1
  • Transformers: 4.47.0
  • PyTorch: 2.1.1+cu121
  • Accelerate: 1.2.0
  • Datasets: 2.18.0
  • Tokenizers: 0.21.0
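
To approximate this environment, the listed libraries can be pinned at install time (a sketch; the CUDA 12.1 build of PyTorch 2.1.1 additionally requires the matching wheel index):

pip install torch==2.1.1 sentence-transformers==3.3.1 transformers==4.47.0 accelerate==1.2.0 datasets==2.18.0 tokenizers==0.21.0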

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

TripletLoss

@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}