SentenceTransformer based on cambridgeltl/SapBERT-from-PubMedBERT-fulltext
This is a sentence-transformers model finetuned from cambridgeltl/SapBERT-from-PubMedBERT-fulltext. It maps sentences & paragraphs to a 1536-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: cambridgeltl/SapBERT-from-PubMedBERT-fulltext
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 1536 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
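
The 1536-dimensional output follows from the pooling configuration: both CLS-token pooling and mean-token pooling are enabled, and the Pooling module concatenates the two 768-dimensional vectors (768 + 768 = 1536). As a minimal sketch, the same architecture can be assembled from the base model (loading the published checkpoint, as shown under Usage below, is the normal route):

from sentence_transformers import SentenceTransformer, models

# Rebuild the architecture above: BERT encoder + pooling that concatenates
# the CLS vector with the mean of the token vectors (768 + 768 = 1536)
word_embedding = models.Transformer("cambridgeltl/SapBERT-from-PubMedBERT-fulltext", max_seq_length=512)
pooling = models.Pooling(
    word_embedding.get_word_embedding_dimension(),  # 768
    pooling_mode_cls_token=True,
    pooling_mode_mean_tokens=True,
)
model = SentenceTransformer(modules=[word_embedding, pooling])
print(model.get_sentence_embedding_dimension())  # 1536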
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("yyzheng00/sapbert_lora_triplet_rank16")
# Run inference
sentences = [
'|Product containing tripotassium dicitratobismuthate (medicinal product)| + |Product manufactured as oral dose form (product)| : |Has manufactured dose form (attribute)| = |Oral dose form (dose form)|, { |Has active ingredient (attribute)| = |Tripotassium dicitratobismuthate (substance)| }',
'Tripotassium dicitratobismuthate in oral dosage form (medicinal product form)',
'Product containing piracetam in oral dose form (medicinal product form)',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1536]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
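
The same embeddings support retrieval-style use, for example ranking the candidate sentences above against a query. A minimal sketch, reusing model, sentences, and embeddings from the snippet above (the query string is illustrative, not drawn from the training data):

# Rank the candidate sentences against an illustrative query
query_embedding = model.encode(["bismuth-containing oral medicinal product"])
scores = model.similarity(query_embedding, embeddings)  # shape [1, 3]
print(sentences[scores.argmax().item()])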
Training Details
Training Dataset
Unnamed Dataset
- Size: 800,000 training samples
- Columns: sentence_0, sentence_1, and sentence_2
- Approximate statistics based on the first 1000 samples:

|  | sentence_0 | sentence_1 | sentence_2 |
|---|---|---|---|
| type | string | string | string |
| details | min: 6 tokens, mean: 46.56 tokens, max: 304 tokens | min: 6 tokens, mean: 12.3 tokens, max: 33 tokens | min: 6 tokens, mean: 20.93 tokens, max: 177 tokens |
- Samples (cells marked … are truncated in the source):

| sentence_0 | sentence_1 | sentence_2 |
|---|---|---|
| Product containing precisely perindopril erbumine 8 milligram/1 each conventional release oral tablet (clinical drug) | Product containing only perindopril erbumine 8 mg/1 each oral tablet (clinical drug) | Avascular necrosis of bone of pelvis caused by drug (disorder) |
| Product containing clemastine (medicinal product) + … | … | … |
| Internal hemorrhoids (disorder) + … | … | … |

- Loss: TripletLoss with these parameters:

  {
      "distance_metric": "TripletDistanceMetric.COSINE",
      "triplet_margin": 0.2
  }
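
With these parameters the objective is the standard triplet margin loss over cosine distance: for an (anchor, positive, negative) triple, loss = max(d(a, p) - d(a, n) + 0.2, 0) with d(x, y) = 1 - cos_sim(x, y). A minimal PyTorch sketch of that computation on toy embeddings (shapes and values are illustrative):

import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=0.2):
    # Cosine distance d(x, y) = 1 - cosine similarity
    d_ap = 1 - F.cosine_similarity(anchor, positive)
    d_an = 1 - F.cosine_similarity(anchor, negative)
    # Penalize triplets where the negative is not at least `margin` farther away
    return F.relu(d_ap - d_an + margin).mean()

# Toy batch of four random 1536-d embeddings per role, just to show shapes
a, p, n = (torch.randn(4, 1536) for _ in range(3))
print(triplet_loss(a, p, n))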
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- num_train_epochs: 2
- fp16: True
- multi_dataset_batch_sampler: round_robin
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 8
- per_device_eval_batch_size: 8
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 2
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin
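
Putting these settings together, a condensed training sketch consistent with the non-default hyperparameters above. The triplet strings, dataset, and output path are placeholders, and the sketch omits any parameter-efficient (LoRA) setup implied by the model name; the exact training script is not part of this card:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import TripletLoss, TripletDistanceMetric

# Placeholder triplets using the column layout described above
# (sentence_0 = anchor, sentence_1 = positive, sentence_2 = negative)
train_dataset = Dataset.from_dict({
    "sentence_0": ["anchor expression ..."],
    "sentence_1": ["positive description ..."],
    "sentence_2": ["negative description ..."],
})

model = SentenceTransformer("cambridgeltl/SapBERT-from-PubMedBERT-fulltext")
loss = TripletLoss(model, distance_metric=TripletDistanceMetric.COSINE, triplet_margin=0.2)

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder
    num_train_epochs=2,
    per_device_train_batch_size=8,
    fp16=True,
)

SentenceTransformerTrainer(model=model, args=args, train_dataset=train_dataset, loss=loss).train()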
Training Logs
Epoch | Step | Training Loss |
---|---|---|
0.005 | 500 | 0.0318 |
0.01 | 1000 | 0.0218 |
0.015 | 1500 | 0.019 |
0.02 | 2000 | 0.0194 |
0.025 | 2500 | 0.0184 |
0.03 | 3000 | 0.0157 |
0.035 | 3500 | 0.017 |
0.04 | 4000 | 0.0164 |
0.045 | 4500 | 0.0148 |
0.05 | 5000 | 0.016 |
0.055 | 5500 | 0.0174 |
0.06 | 6000 | 0.018 |
0.065 | 6500 | 0.0166 |
0.07 | 7000 | 0.0147 |
0.075 | 7500 | 0.0158 |
0.08 | 8000 | 0.0155 |
0.085 | 8500 | 0.0145 |
0.09 | 9000 | 0.0153 |
0.095 | 9500 | 0.0151 |
0.1 | 10000 | 0.0142 |
0.105 | 10500 | 0.0158 |
0.11 | 11000 | 0.0153 |
0.115 | 11500 | 0.0151 |
0.12 | 12000 | 0.0157 |
0.125 | 12500 | 0.015 |
0.13 | 13000 | 0.016 |
0.135 | 13500 | 0.0156 |
0.14 | 14000 | 0.0144 |
0.145 | 14500 | 0.0127 |
0.15 | 15000 | 0.0148 |
0.155 | 15500 | 0.0135 |
0.16 | 16000 | 0.0133 |
0.165 | 16500 | 0.0149 |
0.17 | 17000 | 0.0147 |
0.175 | 17500 | 0.0155 |
0.18 | 18000 | 0.0147 |
0.185 | 18500 | 0.0148 |
0.19 | 19000 | 0.0138 |
0.195 | 19500 | 0.0158 |
0.2 | 20000 | 0.0134 |
0.205 | 20500 | 0.0154 |
0.21 | 21000 | 0.0144 |
0.215 | 21500 | 0.0161 |
0.22 | 22000 | 0.0156 |
0.225 | 22500 | 0.0144 |
0.23 | 23000 | 0.0147 |
0.235 | 23500 | 0.017 |
0.24 | 24000 | 0.0128 |
0.245 | 24500 | 0.0145 |
0.25 | 25000 | 0.0158 |
0.255 | 25500 | 0.0149 |
0.26 | 26000 | 0.0147 |
0.265 | 26500 | 0.0128 |
0.27 | 27000 | 0.0157 |
0.275 | 27500 | 0.0154 |
0.28 | 28000 | 0.0149 |
0.285 | 28500 | 0.0131 |
0.29 | 29000 | 0.0167 |
0.295 | 29500 | 0.0167 |
0.3 | 30000 | 0.0149 |
0.305 | 30500 | 0.0151 |
0.31 | 31000 | 0.0154 |
0.315 | 31500 | 0.0149 |
0.32 | 32000 | 0.0161 |
0.325 | 32500 | 0.0146 |
0.33 | 33000 | 0.0158 |
0.335 | 33500 | 0.0151 |
0.34 | 34000 | 0.014 |
0.345 | 34500 | 0.0143 |
0.35 | 35000 | 0.0145 |
0.355 | 35500 | 0.0142 |
0.36 | 36000 | 0.0152 |
0.365 | 36500 | 0.0145 |
0.37 | 37000 | 0.0131 |
0.375 | 37500 | 0.014 |
0.38 | 38000 | 0.0149 |
0.385 | 38500 | 0.0131 |
0.39 | 39000 | 0.0152 |
0.395 | 39500 | 0.0149 |
0.4 | 40000 | 0.0143 |
0.405 | 40500 | 0.0145 |
0.41 | 41000 | 0.0136 |
0.415 | 41500 | 0.0138 |
0.42 | 42000 | 0.0138 |
0.425 | 42500 | 0.013 |
0.43 | 43000 | 0.0151 |
0.435 | 43500 | 0.014 |
0.44 | 44000 | 0.0147 |
0.445 | 44500 | 0.0136 |
0.45 | 45000 | 0.0135 |
0.455 | 45500 | 0.0145 |
0.46 | 46000 | 0.015 |
0.465 | 46500 | 0.0134 |
0.47 | 47000 | 0.0154 |
0.475 | 47500 | 0.0125 |
0.48 | 48000 | 0.0151 |
0.485 | 48500 | 0.0146 |
0.49 | 49000 | 0.0155 |
0.495 | 49500 | 0.0137 |
0.5 | 50000 | 0.0154 |
0.505 | 50500 | 0.0151 |
0.51 | 51000 | 0.0147 |
0.515 | 51500 | 0.0148 |
0.52 | 52000 | 0.0159 |
0.525 | 52500 | 0.0137 |
0.53 | 53000 | 0.0137 |
0.535 | 53500 | 0.0145 |
0.54 | 54000 | 0.0134 |
0.545 | 54500 | 0.0137 |
0.55 | 55000 | 0.0143 |
0.555 | 55500 | 0.014 |
0.56 | 56000 | 0.0149 |
0.565 | 56500 | 0.0136 |
0.57 | 57000 | 0.0137 |
0.575 | 57500 | 0.0151 |
0.58 | 58000 | 0.0149 |
0.585 | 58500 | 0.0128 |
0.59 | 59000 | 0.0142 |
0.595 | 59500 | 0.0124 |
0.6 | 60000 | 0.0152 |
0.605 | 60500 | 0.0139 |
0.61 | 61000 | 0.016 |
0.615 | 61500 | 0.0135 |
0.62 | 62000 | 0.0139 |
0.625 | 62500 | 0.0143 |
0.63 | 63000 | 0.0139 |
0.635 | 63500 | 0.0132 |
0.64 | 64000 | 0.0129 |
0.645 | 64500 | 0.012 |
0.65 | 65000 | 0.0132 |
0.655 | 65500 | 0.0144 |
0.66 | 66000 | 0.0135 |
0.665 | 66500 | 0.0141 |
0.67 | 67000 | 0.0126 |
0.675 | 67500 | 0.0134 |
0.68 | 68000 | 0.0129 |
0.685 | 68500 | 0.0152 |
0.69 | 69000 | 0.0135 |
0.695 | 69500 | 0.0135 |
0.7 | 70000 | 0.013 |
0.705 | 70500 | 0.0121 |
0.71 | 71000 | 0.0122 |
0.715 | 71500 | 0.0131 |
0.72 | 72000 | 0.0137 |
0.725 | 72500 | 0.0136 |
0.73 | 73000 | 0.0137 |
0.735 | 73500 | 0.0127 |
0.74 | 74000 | 0.0147 |
0.745 | 74500 | 0.0129 |
0.75 | 75000 | 0.0123 |
0.755 | 75500 | 0.0116 |
0.76 | 76000 | 0.0138 |
0.765 | 76500 | 0.013 |
0.77 | 77000 | 0.0127 |
0.775 | 77500 | 0.0131 |
0.78 | 78000 | 0.0143 |
0.785 | 78500 | 0.0129 |
0.79 | 79000 | 0.0129 |
0.795 | 79500 | 0.0132 |
0.8 | 80000 | 0.0133 |
0.805 | 80500 | 0.014 |
0.81 | 81000 | 0.0124 |
0.815 | 81500 | 0.0147 |
0.82 | 82000 | 0.013 |
0.825 | 82500 | 0.0137 |
0.83 | 83000 | 0.0128 |
0.835 | 83500 | 0.0138 |
0.84 | 84000 | 0.012 |
0.845 | 84500 | 0.0148 |
0.85 | 85000 | 0.0136 |
0.855 | 85500 | 0.0141 |
0.86 | 86000 | 0.0135 |
0.865 | 86500 | 0.0132 |
0.87 | 87000 | 0.0132 |
0.875 | 87500 | 0.0116 |
0.88 | 88000 | 0.0137 |
0.885 | 88500 | 0.0133 |
0.89 | 89000 | 0.0115 |
0.895 | 89500 | 0.0148 |
0.9 | 90000 | 0.0123 |
0.905 | 90500 | 0.0122 |
0.91 | 91000 | 0.0128 |
0.915 | 91500 | 0.0129 |
0.92 | 92000 | 0.0139 |
0.925 | 92500 | 0.014 |
0.93 | 93000 | 0.014 |
0.935 | 93500 | 0.0117 |
0.94 | 94000 | 0.0131 |
0.945 | 94500 | 0.014 |
0.95 | 95000 | 0.0122 |
0.955 | 95500 | 0.0124 |
0.96 | 96000 | 0.0128 |
0.965 | 96500 | 0.0122 |
0.97 | 97000 | 0.0108 |
0.975 | 97500 | 0.0131 |
0.98 | 98000 | 0.013 |
0.985 | 98500 | 0.0125 |
0.99 | 99000 | 0.0131 |
0.995 | 99500 | 0.012 |
1.0 | 100000 | 0.0135 |
1.005 | 100500 | 0.0133 |
1.01 | 101000 | 0.0133 |
1.015 | 101500 | 0.0111 |
1.02 | 102000 | 0.0118 |
1.025 | 102500 | 0.012 |
1.03 | 103000 | 0.0128 |
1.035 | 103500 | 0.0121 |
1.04 | 104000 | 0.0125 |
1.045 | 104500 | 0.0124 |
1.05 | 105000 | 0.0131 |
1.055 | 105500 | 0.0116 |
1.06 | 106000 | 0.0136 |
1.065 | 106500 | 0.0124 |
1.07 | 107000 | 0.0123 |
1.075 | 107500 | 0.0139 |
1.08 | 108000 | 0.0107 |
1.085 | 108500 | 0.012 |
1.09 | 109000 | 0.0125 |
1.095 | 109500 | 0.0125 |
1.1 | 110000 | 0.012 |
1.105 | 110500 | 0.0127 |
1.11 | 111000 | 0.0119 |
1.115 | 111500 | 0.0125 |
1.12 | 112000 | 0.0119 |
1.125 | 112500 | 0.012 |
1.13 | 113000 | 0.0113 |
1.135 | 113500 | 0.0122 |
1.1400 | 114000 | 0.0118 |
1.145 | 114500 | 0.0124 |
1.15 | 115000 | 0.0127 |
1.155 | 115500 | 0.0111 |
1.16 | 116000 | 0.0124 |
1.165 | 116500 | 0.0108 |
1.17 | 117000 | 0.0112 |
1.175 | 117500 | 0.0141 |
1.18 | 118000 | 0.0113 |
1.185 | 118500 | 0.012 |
1.19 | 119000 | 0.0129 |
1.195 | 119500 | 0.0122 |
1.2 | 120000 | 0.012 |
1.205 | 120500 | 0.0124 |
1.21 | 121000 | 0.0115 |
1.215 | 121500 | 0.0106 |
1.22 | 122000 | 0.0098 |
1.225 | 122500 | 0.0098 |
1.23 | 123000 | 0.0114 |
1.2350 | 123500 | 0.0124 |
1.24 | 124000 | 0.0123 |
1.245 | 124500 | 0.0122 |
1.25 | 125000 | 0.0115 |
1.255 | 125500 | 0.0124 |
1.26 | 126000 | 0.0108 |
1.2650 | 126500 | 0.0118 |
1.27 | 127000 | 0.0122 |
1.275 | 127500 | 0.0108 |
1.28 | 128000 | 0.0126 |
1.285 | 128500 | 0.0117 |
1.29 | 129000 | 0.0105 |
1.295 | 129500 | 0.0115 |
1.3 | 130000 | 0.0114 |
1.305 | 130500 | 0.01 |
1.31 | 131000 | 0.0115 |
1.315 | 131500 | 0.0117 |
1.32 | 132000 | 0.0116 |
1.325 | 132500 | 0.0113 |
1.33 | 133000 | 0.0114 |
1.335 | 133500 | 0.0135 |
1.34 | 134000 | 0.0118 |
1.345 | 134500 | 0.0117 |
1.35 | 135000 | 0.0108 |
1.355 | 135500 | 0.0115 |
1.3600 | 136000 | 0.0124 |
1.365 | 136500 | 0.0122 |
1.37 | 137000 | 0.0107 |
1.375 | 137500 | 0.0112 |
1.38 | 138000 | 0.0108 |
1.385 | 138500 | 0.012 |
1.3900 | 139000 | 0.0102 |
1.395 | 139500 | 0.0117 |
1.4 | 140000 | 0.0101 |
1.405 | 140500 | 0.0114 |
1.41 | 141000 | 0.0105 |
1.415 | 141500 | 0.0114 |
1.42 | 142000 | 0.0106 |
1.425 | 142500 | 0.0115 |
1.43 | 143000 | 0.0112 |
1.435 | 143500 | 0.0108 |
1.44 | 144000 | 0.011 |
1.445 | 144500 | 0.0122 |
1.45 | 145000 | 0.0105 |
1.455 | 145500 | 0.0118 |
1.46 | 146000 | 0.0113 |
1.465 | 146500 | 0.0114 |
1.47 | 147000 | 0.0111 |
1.475 | 147500 | 0.0101 |
1.48 | 148000 | 0.0115 |
1.4850 | 148500 | 0.0102 |
1.49 | 149000 | 0.0105 |
1.495 | 149500 | 0.0101 |
1.5 | 150000 | 0.0096 |
1.505 | 150500 | 0.0099 |
1.51 | 151000 | 0.0108 |
1.5150 | 151500 | 0.0104 |
1.52 | 152000 | 0.0101 |
1.525 | 152500 | 0.0117 |
1.53 | 153000 | 0.0112 |
1.5350 | 153500 | 0.0116 |
1.54 | 154000 | 0.0123 |
1.545 | 154500 | 0.0108 |
1.55 | 155000 | 0.0117 |
1.5550 | 155500 | 0.0111 |
1.56 | 156000 | 0.0114 |
1.565 | 156500 | 0.0114 |
1.5700 | 157000 | 0.0108 |
1.575 | 157500 | 0.0109 |
1.58 | 158000 | 0.0106 |
1.585 | 158500 | 0.0106 |
1.5900 | 159000 | 0.0103 |
1.595 | 159500 | 0.0101 |
1.6 | 160000 | 0.0109 |
1.605 | 160500 | 0.0101 |
1.6100 | 161000 | 0.01 |
1.615 | 161500 | 0.0109 |
1.62 | 162000 | 0.0105 |
1.625 | 162500 | 0.0099 |
1.63 | 163000 | 0.0116 |
1.635 | 163500 | 0.0096 |
1.6400 | 164000 | 0.0083 |
1.645 | 164500 | 0.0098 |
1.65 | 165000 | 0.0107 |
1.655 | 165500 | 0.0104 |
1.6600 | 166000 | 0.0105 |
1.665 | 166500 | 0.0115 |
1.67 | 167000 | 0.0109 |
1.675 | 167500 | 0.0109 |
1.6800 | 168000 | 0.0131 |
1.685 | 168500 | 0.0106 |
1.69 | 169000 | 0.0115 |
1.6950 | 169500 | 0.0092 |
1.7 | 170000 | 0.0094 |
1.705 | 170500 | 0.0093 |
1.71 | 171000 | 0.0098 |
1.7150 | 171500 | 0.0115 |
1.72 | 172000 | 0.0103 |
1.725 | 172500 | 0.0098 |
1.73 | 173000 | 0.0095 |
1.7350 | 173500 | 0.009 |
1.74 | 174000 | 0.0101 |
1.745 | 174500 | 0.0099 |
1.75 | 175000 | 0.0088 |
1.755 | 175500 | 0.0096 |
1.76 | 176000 | 0.0105 |
1.7650 | 176500 | 0.0107 |
1.77 | 177000 | 0.0088 |
1.775 | 177500 | 0.0089 |
1.78 | 178000 | 0.0091 |
1.7850 | 178500 | 0.0104 |
1.79 | 179000 | 0.0112 |
1.795 | 179500 | 0.0103 |
1.8 | 180000 | 0.0087 |
1.8050 | 180500 | 0.0098 |
1.81 | 181000 | 0.0097 |
1.815 | 181500 | 0.0108 |
1.8200 | 182000 | 0.0099 |
1.825 | 182500 | 0.0101 |
1.83 | 183000 | 0.0093 |
1.835 | 183500 | 0.0109 |
1.8400 | 184000 | 0.009 |
1.845 | 184500 | 0.0093 |
1.85 | 185000 | 0.0103 |
1.855 | 185500 | 0.0095 |
1.8600 | 186000 | 0.0105 |
1.865 | 186500 | 0.0101 |
1.87 | 187000 | 0.009 |
1.875 | 187500 | 0.0103 |
1.88 | 188000 | 0.0109 |
1.885 | 188500 | 0.0115 |
1.8900 | 189000 | 0.0098 |
1.895 | 189500 | 0.0084 |
1.9 | 190000 | 0.0089 |
1.905 | 190500 | 0.011 |
1.9100 | 191000 | 0.0091 |
1.915 | 191500 | 0.0102 |
1.92 | 192000 | 0.009 |
1.925 | 192500 | 0.0093 |
1.9300 | 193000 | 0.0099 |
1.935 | 193500 | 0.0097 |
1.94 | 194000 | 0.0088 |
1.9450 | 194500 | 0.0089 |
1.95 | 195000 | 0.0098 |
1.955 | 195500 | 0.0099 |
1.96 | 196000 | 0.0094 |
1.9650 | 196500 | 0.0092 |
1.97 | 197000 | 0.0102 |
1.975 | 197500 | 0.0092 |
1.98 | 198000 | 0.0096 |
1.9850 | 198500 | 0.0101 |
1.99 | 199000 | 0.0104 |
1.995 | 199500 | 0.0093 |
2.0 | 200000 | 0.0096 |
Framework Versions
- Python: 3.11.1
- Sentence Transformers: 4.1.0
- Transformers: 4.47.0
- PyTorch: 2.1.1+cu121
- Accelerate: 1.2.0
- Datasets: 2.18.0
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
TripletLoss
@misc{hermans2017defense,
title={In Defense of the Triplet Loss for Person Re-Identification},
author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
year={2017},
eprint={1703.07737},
archivePrefix={arXiv},
primaryClass={cs.CV}
}