SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. Since it was finetuned on natural-language query / Python-code pairs (see Training Details below), it is particularly suited to code search.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Model Size: 22.7M parameters (F32)

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
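
The Pooling module averages the token embeddings produced by the underlying BertModel (mean pooling), and the Normalize module L2-normalizes the result, so cosine similarity reduces to a dot product. For intuition, here is a minimal sketch of that pipeline using plain transformers; it assumes the Hub repo exposes the underlying BERT weights at its root, as sentence-transformers repos normally do, and is illustrative rather than the recommended API:

import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Devy1/MiniLM-cosqa-16")
bert = AutoModel.from_pretrained("Devy1/MiniLM-cosqa-16")

encoded = tokenizer(
    ["bottom 5 rows in python"],
    padding=True, truncation=True, max_length=256, return_tensors="pt",
)
with torch.no_grad():
    token_embeddings = bert(**encoded).last_hidden_state  # (batch, seq_len, 384)

# Mean pooling: average the token embeddings, ignoring padding positions.
mask = encoded["attention_mask"].unsqueeze(-1).float()
embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# L2-normalize so cosine similarity is just a dot product.
embedding = F.normalize(embedding, p=2, dim=1)
print(embedding.shape)  # torch.Size([1, 384])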

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Devy1/MiniLM-cosqa-16")
# Run inference
sentences = [
    'bottom 5 rows in python',
    'def table_top_abs(self):\n        """Returns the absolute position of table top"""\n        table_height = np.array([0, 0, self.table_full_size[2]])\n        return string_to_array(self.floor.get("pos")) + table_height',
    'def refresh(self, document):\n\t\t""" Load a new copy of a document from the database.  does not\n\t\t\treplace the old one """\n\t\ttry:\n\t\t\told_cache_size = self.cache_size\n\t\t\tself.cache_size = 0\n\t\t\tobj = self.query(type(document)).filter_by(mongo_id=document.mongo_id).one()\n\t\tfinally:\n\t\t\tself.cache_size = old_cache_size\n\t\tself.cache_write(obj)\n\t\treturn obj',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[ 1.0000,  0.5117, -0.0480],
#         [ 0.5117,  1.0000, -0.0416],
#         [-0.0480, -0.0416,  1.0000]])
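
Because the training pairs are natural-language queries matched to code, the model also works for retrieval over a code corpus. Below is a minimal sketch using the library's semantic_search utility; the two-snippet corpus is hypothetical:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Devy1/MiniLM-cosqa-16")

# Hypothetical corpus of code snippets to search over.
corpus = [
    "def tail(df, n=5):\n    return df.iloc[-n:]",
    "def head(df, n=5):\n    return df.iloc[:n]",
]
query = "bottom 5 rows in python"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank every corpus entry by cosine similarity to the query.
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(f"{hit['score']:.4f}  {corpus[hit['corpus_id']]!r}")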

Training Details

Training Dataset

Unnamed Dataset

  • Size: 9,020 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor: string; min 6, mean 9.67, max 21 tokens
    • positive: string; min 40, mean 86.17, max 256 tokens
  • Samples:

    anchor: 1d array in char datatype in python
    positive:

        def _convert_to_array(array_like, dtype):
            """
            Convert Matrix attributes which are array-like or buffer to array.
            """
            if isinstance(array_like, bytes):
                return np.frombuffer(array_like, dtype=dtype)
            return np.asarray(array_like, dtype=dtype)

    anchor: python condition non none
    positive:

        def _not(condition=None, **kwargs):
            """
            Return the opposite of input condition.

            :param condition: condition to process.

            :result: not condition.
            :rtype: bool
            """

            result = True

            if condition is not None:
                result = not run(condition, **kwargs)

            return result

    anchor: accessing a column from a matrix in python
    positive:

        def get_column(self, X, column):
            """Return a column of the given matrix.

            Args:
                X: numpy.ndarray or pandas.DataFrame.
                column: int or str.

            Returns:
                np.ndarray: Selected column.
            """
            if isinstance(X, pd.DataFrame):
                return X[column].values

            return X[:, column]
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "gather_across_devices": false
    }
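
For intuition, MultipleNegativesRankingLoss treats each in-batch (anchor, positive) pair as the correct match and every other positive in the batch as a negative: it is cross-entropy over scaled cosine similarities. A schematic re-implementation (illustration only; training used the library class above):

import torch
import torch.nn.functional as F

def mnr_loss(anchors: torch.Tensor, positives: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    """Schematic MultipleNegativesRankingLoss with in-batch negatives."""
    # Cosine similarity between every anchor and every positive, scaled by 20.
    scores = scale * (F.normalize(anchors, dim=1) @ F.normalize(positives, dim=1).T)
    # The i-th anchor's true positive sits on the diagonal (index i).
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)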
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 16
  • fp16: True
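
With only those two settings changed from their defaults, a comparable run can be sketched with the sentence-transformers trainer. This is a hedged reconstruction rather than the exact training script; the one-row dataset stands in for the actual 9,020 anchor/positive pairs:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Stand-in for the real query/code training data (columns: anchor, positive).
train_dataset = Dataset.from_dict({
    "anchor": ["bottom 5 rows in python"],
    "positive": ["def tail(df, n=5):\n    return df.iloc[-n:]"],
})

args = SentenceTransformerTrainingArguments(
    output_dir="MiniLM-cosqa-16",
    num_train_epochs=3,              # matches the hyperparameters listed below
    per_device_train_batch_size=16,  # non-default
    fp16=True,                       # non-default
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MultipleNegativesRankingLoss(model),
)
trainer.train()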

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.0018 1 0.2852
0.0035 2 0.3503
0.0053 3 0.0422
0.0071 4 0.1306
0.0089 5 0.1997
0.0106 6 0.0307
0.0124 7 0.1194
0.0142 8 0.114
0.0160 9 0.0128
0.0177 10 0.0523
0.0195 11 0.0228
0.0213 12 0.056
0.0230 13 0.2108
0.0248 14 0.0856
0.0266 15 0.058
0.0284 16 0.0311
0.0301 17 0.015
0.0319 18 0.0098
0.0337 19 0.3063
0.0355 20 0.0124
0.0372 21 0.0151
0.0390 22 0.2788
0.0408 23 0.0228
0.0426 24 0.0634
0.0443 25 0.0482
0.0461 26 0.0647
0.0479 27 0.0045
0.0496 28 0.0315
0.0514 29 0.0316
0.0532 30 0.1654
0.0550 31 0.1931
0.0567 32 0.0157
0.0585 33 0.286
0.0603 34 0.1894
0.0621 35 0.0308
0.0638 36 0.0181
0.0656 37 0.126
0.0674 38 0.0258
0.0691 39 0.0669
0.0709 40 0.0979
0.0727 41 0.1078
0.0745 42 0.3883
0.0762 43 0.0341
0.0780 44 0.0439
0.0798 45 0.0733
0.0816 46 0.399
0.0833 47 0.1246
0.0851 48 0.0095
0.0869 49 0.3253
0.0887 50 0.0405
0.0904 51 0.1117
0.0922 52 0.0389
0.0940 53 0.1124
0.0957 54 0.118
0.0975 55 0.2116
0.0993 56 0.0721
0.1011 57 0.1326
0.1028 58 0.1217
0.1046 59 0.0216
0.1064 60 0.0798
0.1082 61 0.1676
0.1099 62 0.0314
0.1117 63 0.045
0.1135 64 0.0325
0.1152 65 0.0624
0.1170 66 0.0282
0.1188 67 0.0164
0.1206 68 0.0632
0.1223 69 0.1402
0.1241 70 0.0271
0.1259 71 0.0449
0.1277 72 0.0107
0.1294 73 0.0531
0.1312 74 0.0489
0.1330 75 0.1134
0.1348 76 0.0657
0.1365 77 0.0383
0.1383 78 0.0288
0.1401 79 0.0514
0.1418 80 0.0173
0.1436 81 0.1886
0.1454 82 0.0532
0.1472 83 0.0024
0.1489 84 0.0076
0.1507 85 0.1116
0.1525 86 0.0089
0.1543 87 0.4592
0.1560 88 0.0552
0.1578 89 0.0327
0.1596 90 0.1102
0.1613 91 0.0077
0.1631 92 0.0048
0.1649 93 0.003
0.1667 94 0.0042
0.1684 95 0.1047
0.1702 96 0.0212
0.1720 97 0.0287
0.1738 98 0.0233
0.1755 99 0.0319
0.1773 100 0.0206
0.1791 101 0.018
0.1809 102 0.059
0.1826 103 0.172
0.1844 104 0.1555
0.1862 105 0.0479
0.1879 106 0.0336
0.1897 107 0.0889
0.1915 108 0.0094
0.1933 109 0.053
0.1950 110 0.0944
0.1968 111 0.2731
0.1986 112 0.0315
0.2004 113 0.162
0.2021 114 0.1024
0.2039 115 0.013
0.2057 116 0.0998
0.2074 117 0.0035
0.2092 118 0.0035
0.2110 119 0.136
0.2128 120 0.0626
0.2145 121 0.0597
0.2163 122 0.1202
0.2181 123 0.1017
0.2199 124 0.0241
0.2216 125 0.0527
0.2234 126 0.0158
0.2252 127 0.005
0.2270 128 0.3728
0.2287 129 0.0049
0.2305 130 0.0426
0.2323 131 0.1093
0.2340 132 0.0607
0.2358 133 0.0387
0.2376 134 0.0672
0.2394 135 0.0187
0.2411 136 0.1737
0.2429 137 0.042
0.2447 138 0.0934
0.2465 139 0.0135
0.2482 140 0.1649
0.25 141 0.1029
0.2518 142 0.0183
0.2535 143 0.1689
0.2553 144 0.6752
0.2571 145 0.076
0.2589 146 0.0961
0.2606 147 0.127
0.2624 148 0.1866
0.2642 149 0.0652
0.2660 150 0.029
0.2677 151 0.0175
0.2695 152 0.0034
0.2713 153 0.2149
0.2730 154 0.0564
0.2748 155 0.0205
0.2766 156 0.0193
0.2784 157 0.1054
0.2801 158 0.0209
0.2819 159 0.1948
0.2837 160 0.0176
0.2855 161 0.1101
0.2872 162 0.003
0.2890 163 0.0373
0.2908 164 0.1793
0.2926 165 0.0878
0.2943 166 0.0346
0.2961 167 0.0051
0.2979 168 0.2891
0.2996 169 0.2409
0.3014 170 0.0056
0.3032 171 0.0051
0.3050 172 0.1651
0.3067 173 0.0802
0.3085 174 0.1191
0.3103 175 0.0453
0.3121 176 0.0972
0.3138 177 0.0157
0.3156 178 0.0339
0.3174 179 0.0759
0.3191 180 0.196
0.3209 181 0.1043
0.3227 182 0.0603
0.3245 183 0.0163
0.3262 184 0.0115
0.3280 185 0.1027
0.3298 186 0.0726
0.3316 187 0.089
0.3333 188 0.0385
0.3351 189 0.0082
0.3369 190 0.1135
0.3387 191 0.074
0.3404 192 0.1149
0.3422 193 0.1642
0.3440 194 0.0166
0.3457 195 0.0105
0.3475 196 0.0313
0.3493 197 0.1255
0.3511 198 0.0471
0.3528 199 0.067
0.3546 200 0.0227
0.3564 201 0.1239
0.3582 202 0.237
0.3599 203 0.0141
0.3617 204 0.0077
0.3635 205 0.0073
0.3652 206 0.0417
0.3670 207 0.0297
0.3688 208 0.0752
0.3706 209 0.0155
0.3723 210 0.0536
0.3741 211 0.0034
0.3759 212 0.0273
0.3777 213 0.2597
0.3794 214 0.0574
0.3812 215 0.0554
0.3830 216 0.0806
0.3848 217 0.018
0.3865 218 0.215
0.3883 219 0.0527
0.3901 220 0.0025
0.3918 221 0.0459
0.3936 222 0.0074
0.3954 223 0.0603
0.3972 224 0.0092
0.3989 225 0.0832
0.4007 226 0.0144
0.4025 227 0.1483
0.4043 228 0.4177
0.4060 229 0.0061
0.4078 230 0.0034
0.4096 231 0.0917
0.4113 232 0.0039
0.4131 233 0.0369
0.4149 234 0.0619
0.4167 235 0.1598
0.4184 236 0.0699
0.4202 237 0.0641
0.4220 238 0.0162
0.4238 239 0.1175
0.4255 240 0.0043
0.4273 241 0.0171
0.4291 242 0.005
0.4309 243 0.169
0.4326 244 0.0124
0.4344 245 0.1141
0.4362 246 0.0467
0.4379 247 0.0074
0.4397 248 0.2058
0.4415 249 0.0186
0.4433 250 0.0112
0.4450 251 0.2977
0.4468 252 0.0384
0.4486 253 0.1525
0.4504 254 0.2781
0.4521 255 0.1463
0.4539 256 0.1352
0.4557 257 0.0789
0.4574 258 0.013
0.4592 259 0.2722
0.4610 260 0.0701
0.4628 261 0.036
0.4645 262 0.0363
0.4663 263 0.1835
0.4681 264 0.2061
0.4699 265 0.0639
0.4716 266 0.0007
0.4734 267 0.0107
0.4752 268 0.1097
0.4770 269 0.2531
0.4787 270 0.0205
0.4805 271 0.1076
0.4823 272 0.0621
0.4840 273 0.0065
0.4858 274 0.0444
0.4876 275 0.0613
0.4894 276 0.0373
0.4911 277 0.4446
0.4929 278 0.071
0.4947 279 0.0839
0.4965 280 0.2712
0.4982 281 0.3855
0.5 282 0.02
0.5018 283 0.1209
0.5035 284 0.0428
0.5053 285 0.0859
0.5071 286 0.0076
0.5089 287 0.0137
0.5106 288 0.1124
0.5124 289 0.2544
0.5142 290 0.0029
0.5160 291 0.0142
0.5177 292 0.0709
0.5195 293 0.0418
0.5213 294 0.1344
0.5230 295 0.0105
0.5248 296 0.1553
0.5266 297 0.0281
0.5284 298 0.0122
0.5301 299 0.0383
0.5319 300 0.2396
0.5337 301 0.1094
0.5355 302 0.0929
0.5372 303 0.0312
0.5390 304 0.068
0.5408 305 0.0128
0.5426 306 0.127
0.5443 307 0.0414
0.5461 308 0.1497
0.5479 309 0.041
0.5496 310 0.0288
0.5514 311 0.0479
0.5532 312 0.0204
0.5550 313 0.0828
0.5567 314 0.0149
0.5585 315 0.1651
0.5603 316 0.0982
0.5621 317 0.0118
0.5638 318 0.1905
0.5656 319 0.0074
0.5674 320 0.1277
0.5691 321 0.0336
0.5709 322 0.037
0.5727 323 0.0228
0.5745 324 0.5044
0.5762 325 0.2475
0.5780 326 0.0389
0.5798 327 0.0035
0.5816 328 0.0812
0.5833 329 0.1005
0.5851 330 0.3384
0.5869 331 0.0345
0.5887 332 0.0903
0.5904 333 0.0144
0.5922 334 0.0853
0.5940 335 0.1661
0.5957 336 0.0339
0.5975 337 0.0749
0.5993 338 0.2761
0.6011 339 0.0036
0.6028 340 0.0843
0.6046 341 0.0963
0.6064 342 0.0261
0.6082 343 0.0305
0.6099 344 0.0076
0.6117 345 0.006
0.6135 346 0.0034
0.6152 347 0.0278
0.6170 348 0.01
0.6188 349 0.0059
0.6206 350 0.0663
0.6223 351 0.0198
0.6241 352 0.0134
0.6259 353 0.123
0.6277 354 0.0899
0.6294 355 0.0943
0.6312 356 0.011
0.6330 357 0.1238
0.6348 358 0.0283
0.6365 359 0.0248
0.6383 360 0.0365
0.6401 361 0.0349
0.6418 362 0.0183
0.6436 363 0.0106
0.6454 364 0.0523
0.6472 365 0.1742
0.6489 366 0.1366
0.6507 367 0.2887
0.6525 368 0.0802
0.6543 369 0.0532
0.6560 370 0.1194
0.6578 371 0.0648
0.6596 372 0.1022
0.6613 373 0.0596
0.6631 374 0.1083
0.6649 375 0.0121
0.6667 376 0.0448
0.6684 377 0.0261
0.6702 378 0.1448
0.6720 379 0.0822
0.6738 380 0.0141
0.6755 381 0.0187
0.6773 382 0.0639
0.6791 383 0.3279
0.6809 384 0.0084
0.6826 385 0.0256
0.6844 386 0.0886
0.6862 387 0.0671
0.6879 388 0.0365
0.6897 389 0.0112
0.6915 390 0.018
0.6933 391 0.2417
0.6950 392 0.1742
0.6968 393 0.0083
0.6986 394 0.0202
0.7004 395 0.0371
0.7021 396 0.0249
0.7039 397 0.019
0.7057 398 0.0546
0.7074 399 0.0287
0.7092 400 0.0234
0.7110 401 0.005
0.7128 402 0.0089
0.7145 403 0.0097
0.7163 404 0.0545
0.7181 405 0.0079
0.7199 406 0.2158
0.7216 407 0.0216
0.7234 408 0.0033
0.7252 409 0.0313
0.7270 410 0.0527
0.7287 411 0.1268
0.7305 412 0.0025
0.7323 413 0.0597
0.7340 414 0.0291
0.7358 415 0.0219
0.7376 416 0.0818
0.7394 417 0.1946
0.7411 418 0.5806
0.7429 419 0.0348
0.7447 420 0.0138
0.7465 421 0.0445
0.7482 422 0.639
0.75 423 0.028
0.7518 424 0.1737
0.7535 425 0.0038
0.7553 426 0.014
0.7571 427 0.1141
0.7589 428 0.0936
0.7606 429 0.0724
0.7624 430 0.0438
0.7642 431 0.0044
0.7660 432 0.003
0.7677 433 0.0147
0.7695 434 0.1538
0.7713 435 0.0203
0.7730 436 0.0223
0.7748 437 0.0056
0.7766 438 0.0114
0.7784 439 0.0097
0.7801 440 0.0169
0.7819 441 0.0453
0.7837 442 0.1687
0.7855 443 0.1222
0.7872 444 0.0091
0.7890 445 0.0155
0.7908 446 0.1198
0.7926 447 0.0922
0.7943 448 0.017
0.7961 449 0.0853
0.7979 450 0.0946
0.7996 451 0.0558
0.8014 452 0.0229
0.8032 453 0.0062
0.8050 454 0.0175
0.8067 455 0.0339
0.8085 456 0.0445
0.8103 457 0.0411
0.8121 458 0.0037
0.8138 459 0.0244
0.8156 460 0.0358
0.8174 461 0.062
0.8191 462 0.0201
0.8209 463 0.0055
0.8227 464 0.152
0.8245 465 0.0032
0.8262 466 0.2056
0.8280 467 0.0245
0.8298 468 0.0239
0.8316 469 0.0323
0.8333 470 0.2737
0.8351 471 0.0205
0.8369 472 0.0037
0.8387 473 0.2092
0.8404 474 0.0659
0.8422 475 0.0361
0.8440 476 0.0845
0.8457 477 0.015
0.8475 478 0.0055
0.8493 479 0.0012
0.8511 480 0.0241
0.8528 481 0.1986
0.8546 482 0.1794
0.8564 483 0.0477
0.8582 484 0.1216
0.8599 485 0.0423
0.8617 486 0.0124
0.8635 487 0.0724
0.8652 488 0.3665
0.8670 489 0.0338
0.8688 490 0.0327
0.8706 491 0.0875
0.8723 492 0.1198
0.8741 493 0.0959
0.8759 494 0.4752
0.8777 495 0.0248
0.8794 496 0.0955
0.8812 497 0.0988
0.8830 498 0.0053
0.8848 499 0.2546
0.8865 500 0.2137
0.8883 501 0.0013
0.8901 502 0.0053
0.8918 503 0.0021
0.8936 504 0.0357
0.8954 505 0.1408
0.8972 506 0.0475
0.8989 507 0.0041
0.9007 508 0.1138
0.9025 509 0.1568
0.9043 510 0.0094
0.9060 511 0.0015
0.9078 512 0.028
0.9096 513 0.2884
0.9113 514 0.0929
0.9131 515 0.2441
0.9149 516 0.0067
0.9167 517 0.0327
0.9184 518 0.029
0.9202 519 0.0835
0.9220 520 0.006
0.9238 521 0.0103
0.9255 522 0.1339
0.9273 523 0.0084
0.9291 524 0.0101
0.9309 525 0.0053
0.9326 526 0.0236
0.9344 527 0.0927
0.9362 528 0.0636
0.9379 529 0.1854
0.9397 530 0.117
0.9415 531 0.0115
0.9433 532 0.1472
0.9450 533 0.0226
0.9468 534 0.0531
0.9486 535 0.0272
0.9504 536 0.0213
0.9521 537 0.008
0.9539 538 0.0244
0.9557 539 0.0061
0.9574 540 0.0987
0.9592 541 0.021
0.9610 542 0.0556
0.9628 543 0.0214
0.9645 544 0.1886
0.9663 545 0.1871
0.9681 546 0.1497
0.9699 547 0.2943
0.9716 548 0.0207
0.9734 549 0.0032
0.9752 550 0.066
0.9770 551 0.0986
0.9787 552 0.0255
0.9805 553 0.1584
0.9823 554 0.0939
0.9840 555 0.0543
0.9858 556 0.0293
0.9876 557 0.1172
0.9894 558 0.0345
0.9911 559 0.0188
0.9929 560 0.0108
0.9947 561 0.0069
0.9965 562 0.0965
0.9982 563 0.1211
1.0 564 0.0011
1.0018 565 0.002
1.0035 566 0.0409
1.0053 567 0.0062
1.0071 568 0.0074
1.0089 569 0.0012
1.0106 570 0.0454
1.0124 571 0.0017
1.0142 572 0.0727
1.0160 573 0.0096
1.0177 574 0.1944
1.0195 575 0.0129
1.0213 576 0.0077
1.0230 577 0.0203
1.0248 578 0.046
1.0266 579 0.0011
1.0284 580 0.0014
1.0301 581 0.002
1.0319 582 0.0362
1.0337 583 0.0023
1.0355 584 0.0055
1.0372 585 0.1081
1.0390 586 0.1659
1.0408 587 0.012
1.0426 588 0.0225
1.0443 589 0.1943
1.0461 590 0.0045
1.0479 591 0.0024
1.0496 592 0.1368
1.0514 593 0.0895
1.0532 594 0.2384
1.0550 595 0.0842
1.0567 596 0.0669
1.0585 597 0.0039
1.0603 598 0.0031
1.0621 599 0.0044
1.0638 600 0.1103
1.0656 601 0.0232
1.0674 602 0.0644
1.0691 603 0.0104
1.0709 604 0.0383
1.0727 605 0.1454
1.0745 606 0.0123
1.0762 607 0.0094
1.0780 608 0.0247
1.0798 609 0.0473
1.0816 610 0.0212
1.0833 611 0.0506
1.0851 612 0.0854
1.0869 613 0.021
1.0887 614 0.012
1.0904 615 0.012
1.0922 616 0.1787
1.0940 617 0.0229
1.0957 618 0.0123
1.0975 619 0.0381
1.0993 620 0.1896
1.1011 621 0.1764
1.1028 622 0.0046
1.1046 623 0.0075
1.1064 624 0.013
1.1082 625 0.0592
1.1099 626 0.0127
1.1117 627 0.0952
1.1135 628 0.0051
1.1152 629 0.1906
1.1170 630 0.0105
1.1188 631 0.0526
1.1206 632 0.1145
1.1223 633 0.0086
1.1241 634 0.0669
1.1259 635 0.0183
1.1277 636 0.0424
1.1294 637 0.0444
1.1312 638 0.0085
1.1330 639 0.0057
1.1348 640 0.0067
1.1365 641 0.0007
1.1383 642 0.0052
1.1401 643 0.0066
1.1418 644 0.0005
1.1436 645 0.0011
1.1454 646 0.0872
1.1472 647 0.0125
1.1489 648 0.0985
1.1507 649 0.0628
1.1525 650 0.0313
1.1543 651 0.0083
1.1560 652 0.0379
1.1578 653 0.0314
1.1596 654 0.0029
1.1613 655 0.0078
1.1631 656 0.1272
1.1649 657 0.0167
1.1667 658 0.12
1.1684 659 0.0224
1.1702 660 0.0193
1.1720 661 0.0104
1.1738 662 0.022
1.1755 663 0.1915
1.1773 664 0.0466
1.1791 665 0.024
1.1809 666 0.0385
1.1826 667 0.0914
1.1844 668 0.0364
1.1862 669 0.0165
1.1879 670 0.003
1.1897 671 0.0111
1.1915 672 0.0097
1.1933 673 0.0354
1.1950 674 0.0496
1.1968 675 0.0767
1.1986 676 0.0138
1.2004 677 0.0441
1.2021 678 0.0036
1.2039 679 0.0078
1.2057 680 0.0104
1.2074 681 0.0121
1.2092 682 0.1018
1.2110 683 0.0146
1.2128 684 0.0025
1.2145 685 0.0145
1.2163 686 0.0205
1.2181 687 0.124
1.2199 688 0.0165
1.2216 689 0.1345
1.2234 690 0.0104
1.2252 691 0.0056
1.2270 692 0.001
1.2287 693 0.0047
1.2305 694 0.0218
1.2323 695 0.0161
1.2340 696 0.0163
1.2358 697 0.0214
1.2376 698 0.0059
1.2394 699 0.001
1.2411 700 0.0069
1.2429 701 0.0011
1.2447 702 0.0345
1.2465 703 0.0061
1.2482 704 0.1855
1.25 705 0.0193
1.2518 706 0.0076
1.2535 707 0.1165
1.2553 708 0.0278
1.2571 709 0.0039
1.2589 710 0.0241
1.2606 711 0.0419
1.2624 712 0.0079
1.2642 713 0.0148
1.2660 714 0.0333
1.2677 715 0.0133
1.2695 716 0.2561
1.2713 717 0.0353
1.2730 718 0.0035
1.2748 719 0.0142
1.2766 720 0.0843
1.2784 721 0.0074
1.2801 722 0.0117
1.2819 723 0.014
1.2837 724 0.0197
1.2855 725 0.0235
1.2872 726 0.0243
1.2890 727 0.0023
1.2908 728 0.0048
1.2926 729 0.056
1.2943 730 0.0517
1.2961 731 0.0073
1.2979 732 0.2383
1.2996 733 0.0165
1.3014 734 0.0703
1.3032 735 0.0091
1.3050 736 0.0447
1.3067 737 0.0504
1.3085 738 0.0279
1.3103 739 0.257
1.3121 740 0.0372
1.3138 741 0.0111
1.3156 742 0.0229
1.3174 743 0.062
1.3191 744 0.0186
1.3209 745 0.05
1.3227 746 0.0029
1.3245 747 0.0355
1.3262 748 0.097
1.3280 749 0.1409
1.3298 750 0.0811
1.3316 751 0.0475
1.3333 752 0.0023
1.3351 753 0.0034
1.3369 754 0.0022
1.3387 755 0.0307
1.3404 756 0.1478
1.3422 757 0.0311
1.3440 758 0.0016
1.3457 759 0.018
1.3475 760 0.0024
1.3493 761 0.0067
1.3511 762 0.0209
1.3528 763 0.0405
1.3546 764 0.093
1.3564 765 0.0069
1.3582 766 0.0552
1.3599 767 0.011
1.3617 768 0.0035
1.3635 769 0.014
1.3652 770 0.0235
1.3670 771 0.0304
1.3688 772 0.019
1.3706 773 0.0307
1.3723 774 0.0089
1.3741 775 0.0035
1.3759 776 0.0021
1.3777 777 0.0014
1.3794 778 0.0068
1.3812 779 0.0065
1.3830 780 0.0176
1.3848 781 0.0297
1.3865 782 0.0025
1.3883 783 0.0102
1.3901 784 0.0141
1.3918 785 0.0854
1.3936 786 0.0044
1.3954 787 0.0287
1.3972 788 0.0145
1.3989 789 0.0055
1.4007 790 0.0121
1.4025 791 0.0038
1.4043 792 0.1916
1.4060 793 0.0804
1.4078 794 0.1413
1.4096 795 0.0272
1.4113 796 0.0349
1.4131 797 0.0203
1.4149 798 0.0053
1.4167 799 0.0008
1.4184 800 0.0259
1.4202 801 0.0209
1.4220 802 0.1249
1.4238 803 0.4471
1.4255 804 0.012
1.4273 805 0.1615
1.4291 806 0.0353
1.4309 807 0.0295
1.4326 808 0.0089
1.4344 809 0.0033
1.4362 810 0.0012
1.4379 811 0.0091
1.4397 812 0.0327
1.4415 813 0.0829
1.4433 814 0.1153
1.4450 815 0.013
1.4468 816 0.041
1.4486 817 0.003
1.4504 818 0.2116
1.4521 819 0.0278
1.4539 820 0.0026
1.4557 821 0.1155
1.4574 822 0.0901
1.4592 823 0.0081
1.4610 824 0.0013
1.4628 825 0.0867
1.4645 826 0.0798
1.4663 827 0.0015
1.4681 828 0.0025
1.4699 829 0.0063
1.4716 830 0.0102
1.4734 831 0.0041
1.4752 832 0.021
1.4770 833 0.0392
1.4787 834 0.0058
1.4805 835 0.0086
1.4823 836 0.0084
1.4840 837 0.0568
1.4858 838 0.0127
1.4876 839 0.0653
1.4894 840 0.0042
1.4911 841 0.0164
1.4929 842 0.026
1.4947 843 0.0515
1.4965 844 0.0074
1.4982 845 0.0254
1.5 846 0.0906
1.5018 847 0.0311
1.5035 848 0.0096
1.5053 849 0.0909
1.5071 850 0.0124
1.5089 851 0.0373
1.5106 852 0.001
1.5124 853 0.0202
1.5142 854 0.1159
1.5160 855 0.0006
1.5177 856 0.0211
1.5195 857 0.0173
1.5213 858 0.0029
1.5230 859 0.0107
1.5248 860 0.0249
1.5266 861 0.0071
1.5284 862 0.0392
1.5301 863 0.0051
1.5319 864 0.0157
1.5337 865 0.2098
1.5355 866 0.1102
1.5372 867 0.0141
1.5390 868 0.0158
1.5408 869 0.0014
1.5426 870 0.0045
1.5443 871 0.0085
1.5461 872 0.0184
1.5479 873 0.0147
1.5496 874 0.0018
1.5514 875 0.0235
1.5532 876 0.0464
1.5550 877 0.0249
1.5567 878 0.0027
1.5585 879 0.0209
1.5603 880 0.0672
1.5621 881 0.0032
1.5638 882 0.0032
1.5656 883 0.0297
1.5674 884 0.0121
1.5691 885 0.0192
1.5709 886 0.0153
1.5727 887 0.0016
1.5745 888 0.041
1.5762 889 0.099
1.5780 890 0.1625
1.5798 891 0.0037
1.5816 892 0.1435
1.5833 893 0.2743
1.5851 894 0.0027
1.5869 895 0.01
1.5887 896 0.0556
1.5904 897 0.0019
1.5922 898 0.0127
1.5940 899 0.0183
1.5957 900 0.0128
1.5975 901 0.0136
1.5993 902 0.0423
1.6011 903 0.0053
1.6028 904 0.0356
1.6046 905 0.1253
1.6064 906 0.0055
1.6082 907 0.0966
1.6099 908 0.0426
1.6117 909 0.1751
1.6135 910 0.0049
1.6152 911 0.0591
1.6170 912 0.0198
1.6188 913 0.2293
1.6206 914 0.0449
1.6223 915 0.0107
1.6241 916 0.0974
1.6259 917 0.001
1.6277 918 0.0063
1.6294 919 0.0022
1.6312 920 0.1739
1.6330 921 0.005
1.6348 922 0.0028
1.6365 923 0.1195
1.6383 924 0.0656
1.6401 925 0.0033
1.6418 926 0.0253
1.6436 927 0.0222
1.6454 928 0.0102
1.6472 929 0.0006
1.6489 930 0.0021
1.6507 931 0.0111
1.6525 932 0.0087
1.6543 933 0.0154
1.6560 934 0.0225
1.6578 935 0.0215
1.6596 936 0.004
1.6613 937 0.0041
1.6631 938 0.0129
1.6649 939 0.0356
1.6667 940 0.0339
1.6684 941 0.0185
1.6702 942 0.0157
1.6720 943 0.0585
1.6738 944 0.0961
1.6755 945 0.0031
1.6773 946 0.004
1.6791 947 0.0169
1.6809 948 0.0555
1.6826 949 0.0052
1.6844 950 0.0065
1.6862 951 0.126
1.6879 952 0.0052
1.6897 953 0.0045
1.6915 954 0.0806
1.6933 955 0.0513
1.6950 956 0.1021
1.6968 957 0.0233
1.6986 958 0.0068
1.7004 959 0.0019
1.7021 960 0.0256
1.7039 961 0.06
1.7057 962 0.0452
1.7074 963 0.102
1.7092 964 0.0588
1.7110 965 0.1179
1.7128 966 0.0052
1.7145 967 0.0545
1.7163 968 0.0028
1.7181 969 0.0215
1.7199 970 0.0136
1.7216 971 0.0204
1.7234 972 0.0246
1.7252 973 0.0024
1.7270 974 0.1334
1.7287 975 0.0071
1.7305 976 0.001
1.7323 977 0.0013
1.7340 978 0.0065
1.7358 979 0.009
1.7376 980 0.0033
1.7394 981 0.0055
1.7411 982 0.0028
1.7429 983 0.0052
1.7447 984 0.0182
1.7465 985 0.0459
1.7482 986 0.0023
1.75 987 0.0823
1.7518 988 0.0758
1.7535 989 0.0186
1.7553 990 0.0198
1.7571 991 0.0043
1.7589 992 0.0077
1.7606 993 0.0606
1.7624 994 0.0368
1.7642 995 0.0061
1.7660 996 0.0142
1.7677 997 0.0049
1.7695 998 0.0074
1.7713 999 0.093
1.7730 1000 0.1129
1.7748 1001 0.0008
1.7766 1002 0.1378
1.7784 1003 0.0116
1.7801 1004 0.0024
1.7819 1005 0.0235
1.7837 1006 0.0134
1.7855 1007 0.0087
1.7872 1008 0.0445
1.7890 1009 0.0089
1.7908 1010 0.0395
1.7926 1011 0.001
1.7943 1012 0.0072
1.7961 1013 0.215
1.7979 1014 0.0008
1.7996 1015 0.0047
1.8014 1016 0.0195
1.8032 1017 0.0041
1.8050 1018 0.0934
1.8067 1019 0.0008
1.8085 1020 0.0302
1.8103 1021 0.1175
1.8121 1022 0.0717
1.8138 1023 0.0009
1.8156 1024 0.0016
1.8174 1025 0.2146
1.8191 1026 0.0139
1.8209 1027 0.0067
1.8227 1028 0.054
1.8245 1029 0.0097
1.8262 1030 0.0353
1.8280 1031 0.029
1.8298 1032 0.093
1.8316 1033 0.0028
1.8333 1034 0.1996
1.8351 1035 0.0838
1.8369 1036 0.0651
1.8387 1037 0.3878
1.8404 1038 0.0232
1.8422 1039 0.0141
1.8440 1040 0.0039
1.8457 1041 0.0456
1.8475 1042 0.0093
1.8493 1043 0.0142
1.8511 1044 0.0092
1.8528 1045 0.0492
1.8546 1046 0.0503
1.8564 1047 0.035
1.8582 1048 0.1337
1.8599 1049 0.0038
1.8617 1050 0.003
1.8635 1051 0.0156
1.8652 1052 0.0141
1.8670 1053 0.1854
1.8688 1054 0.0029
1.8706 1055 0.0523
1.8723 1056 0.0313
1.8741 1057 0.0539
1.8759 1058 0.0044
1.8777 1059 0.1037
1.8794 1060 0.1125
1.8812 1061 0.031
1.8830 1062 0.0187
1.8848 1063 0.1745
1.8865 1064 0.0048
1.8883 1065 0.0138
1.8901 1066 0.0112
1.8918 1067 0.0005
1.8936 1068 0.0133
1.8954 1069 0.0411
1.8972 1070 0.0063
1.8989 1071 0.0007
1.9007 1072 0.063
1.9025 1073 0.343
1.9043 1074 0.0014
1.9060 1075 0.0194
1.9078 1076 0.0085
1.9096 1077 0.0067
1.9113 1078 0.0204
1.9131 1079 0.0094
1.9149 1080 0.2565
1.9167 1081 0.0456
1.9184 1082 0.0695
1.9202 1083 0.0047
1.9220 1084 0.0246
1.9238 1085 0.0033
1.9255 1086 0.0121
1.9273 1087 0.0148
1.9291 1088 0.0058
1.9309 1089 0.0019
1.9326 1090 0.0012
1.9344 1091 0.0093
1.9362 1092 0.0081
1.9379 1093 0.2302
1.9397 1094 0.0187
1.9415 1095 0.0013
1.9433 1096 0.0545
1.9450 1097 0.0121
1.9468 1098 0.008
1.9486 1099 0.0114
1.9504 1100 0.0938
1.9521 1101 0.0557
1.9539 1102 0.0522
1.9557 1103 0.2804
1.9574 1104 0.0126
1.9592 1105 0.0515
1.9610 1106 0.0458
1.9628 1107 0.0226
1.9645 1108 0.009
1.9663 1109 0.0154
1.9681 1110 0.0059
1.9699 1111 0.0013
1.9716 1112 0.0274
1.9734 1113 0.0194
1.9752 1114 0.0015
1.9770 1115 0.0013
1.9787 1116 0.0509
1.9805 1117 0.0038
1.9823 1118 0.0144
1.9840 1119 0.0009
1.9858 1120 0.0161
1.9876 1121 0.0494
1.9894 1122 0.0037
1.9911 1123 0.0084
1.9929 1124 0.0304
1.9947 1125 0.1233
1.9965 1126 0.0128
1.9982 1127 0.0031
2.0 1128 0.0021
2.0018 1129 0.0326
2.0035 1130 0.0091
2.0053 1131 0.0197
2.0071 1132 0.0184
2.0089 1133 0.0785
2.0106 1134 0.0013
2.0124 1135 0.0203
2.0142 1136 0.0527
2.0160 1137 0.2003
2.0177 1138 0.0256
2.0195 1139 0.0348
2.0213 1140 0.0064
2.0230 1141 0.0192
2.0248 1142 0.0011
2.0266 1143 0.0166
2.0284 1144 0.0069
2.0301 1145 0.0012
2.0319 1146 0.0021
2.0337 1147 0.0111
2.0355 1148 0.0307
2.0372 1149 0.0553
2.0390 1150 0.0178
2.0408 1151 0.0214
2.0426 1152 0.0115
2.0443 1153 0.0836
2.0461 1154 0.0008
2.0479 1155 0.002
2.0496 1156 0.0013
2.0514 1157 0.1271
2.0532 1158 0.0169
2.0550 1159 0.0895
2.0567 1160 0.1264
2.0585 1161 0.0126
2.0603 1162 0.0033
2.0621 1163 0.0056
2.0638 1164 0.0095
2.0656 1165 0.0561
2.0674 1166 0.001
2.0691 1167 0.0119
2.0709 1168 0.0016
2.0727 1169 0.0184
2.0745 1170 0.1006
2.0762 1171 0.2481
2.0780 1172 0.0295
2.0798 1173 0.0054
2.0816 1174 0.0028
2.0833 1175 0.0251
2.0851 1176 0.0066
2.0869 1177 0.0915
2.0887 1178 0.0259
2.0904 1179 0.0038
2.0922 1180 0.0351
2.0940 1181 0.0073
2.0957 1182 0.0009
2.0975 1183 0.0026
2.0993 1184 0.0013
2.1011 1185 0.1223
2.1028 1186 0.0057
2.1046 1187 0.0056
2.1064 1188 0.004
2.1082 1189 0.0064
2.1099 1190 0.0951
2.1117 1191 0.0328
2.1135 1192 0.0422
2.1152 1193 0.003
2.1170 1194 0.0199
2.1188 1195 0.0024
2.1206 1196 0.0493
2.1223 1197 0.0532
2.1241 1198 0.0006
2.1259 1199 0.0039
2.1277 1200 0.0067
2.1294 1201 0.0169
2.1312 1202 0.0012
2.1330 1203 0.002
2.1348 1204 0.0787
2.1365 1205 0.032
2.1383 1206 0.0018
2.1401 1207 0.0014
2.1418 1208 0.0073
2.1436 1209 0.0256
2.1454 1210 0.0073
2.1472 1211 0.0006
2.1489 1212 0.0112
2.1507 1213 0.0116
2.1525 1214 0.0044
2.1543 1215 0.0033
2.1560 1216 0.0094
2.1578 1217 0.0823
2.1596 1218 0.0064
2.1613 1219 0.0052
2.1631 1220 0.0056
2.1649 1221 0.0205
2.1667 1222 0.0508
2.1684 1223 0.0069
2.1702 1224 0.0813
2.1720 1225 0.022
2.1738 1226 0.0254
2.1755 1227 0.0119
2.1773 1228 0.001
2.1791 1229 0.0074
2.1809 1230 0.0104
2.1826 1231 0.0034
2.1844 1232 0.003
2.1862 1233 0.0026
2.1879 1234 0.0005
2.1897 1235 0.0021
2.1915 1236 0.0034
2.1933 1237 0.1037
2.1950 1238 0.0067
2.1968 1239 0.0349
2.1986 1240 0.0699
2.2004 1241 0.0201
2.2021 1242 0.0079
2.2039 1243 0.0335
2.2057 1244 0.0465
2.2074 1245 0.0144
2.2092 1246 0.1061
2.2110 1247 0.0078
2.2128 1248 0.0027
2.2145 1249 0.0019
2.2163 1250 0.0019
2.2181 1251 0.0109
2.2199 1252 0.0029
2.2216 1253 0.0032
2.2234 1254 0.0039
2.2252 1255 0.0082
2.2270 1256 0.0157
2.2287 1257 0.0027
2.2305 1258 0.0025
2.2323 1259 0.0301
2.2340 1260 0.1471
2.2358 1261 0.0021
2.2376 1262 0.0087
2.2394 1263 0.0109
2.2411 1264 0.2735
2.2429 1265 0.0109
2.2447 1266 0.0042
2.2465 1267 0.0301
2.2482 1268 0.1398
2.25 1269 0.0137
2.2518 1270 0.0059
2.2535 1271 0.0076
2.2553 1272 0.0023
2.2571 1273 0.0281
2.2589 1274 0.0012
2.2606 1275 0.0032
2.2624 1276 0.0151
2.2642 1277 0.0021
2.2660 1278 0.001
2.2677 1279 0.0258
2.2695 1280 0.26
2.2713 1281 0.0036
2.2730 1282 0.0005
2.2748 1283 0.0038
2.2766 1284 0.0016
2.2784 1285 0.0401
2.2801 1286 0.0028
2.2819 1287 0.008
2.2837 1288 0.0077
2.2855 1289 0.0133
2.2872 1290 0.0578
2.2890 1291 0.0008
2.2908 1292 0.0051
2.2926 1293 0.0036
2.2943 1294 0.047
2.2961 1295 0.0026
2.2979 1296 0.0109
2.2996 1297 0.0432
2.3014 1298 0.0184
2.3032 1299 0.0483
2.3050 1300 0.0101
2.3067 1301 0.0098
2.3085 1302 0.0232
2.3103 1303 0.0105
2.3121 1304 0.0062
2.3138 1305 0.0541
2.3156 1306 0.0646
2.3174 1307 0.0084
2.3191 1308 0.0313
2.3209 1309 0.0081
2.3227 1310 0.012
2.3245 1311 0.0036
2.3262 1312 0.0518
2.3280 1313 0.0018
2.3298 1314 0.0044
2.3316 1315 0.0495
2.3333 1316 0.0733
2.3351 1317 0.0478
2.3369 1318 0.0408
2.3387 1319 0.0657
2.3404 1320 0.0007
2.3422 1321 0.0286
2.3440 1322 0.0145
2.3457 1323 0.0028
2.3475 1324 0.013
2.3493 1325 0.0088
2.3511 1326 0.0091
2.3528 1327 0.2375
2.3546 1328 0.0332
2.3564 1329 0.1036
2.3582 1330 0.0073
2.3599 1331 0.0177
2.3617 1332 0.0008
2.3635 1333 0.011
2.3652 1334 0.0228
2.3670 1335 0.0183
2.3688 1336 0.0011
2.3706 1337 0.0178
2.3723 1338 0.2155
2.3741 1339 0.0048
2.3759 1340 0.0854
2.3777 1341 0.0146
2.3794 1342 0.0034
2.3812 1343 0.0105
2.3830 1344 0.0181
2.3848 1345 0.0126
2.3865 1346 0.0555
2.3883 1347 0.1284
2.3901 1348 0.0071
2.3918 1349 0.0007
2.3936 1350 0.003
2.3954 1351 0.013
2.3972 1352 0.0023
2.3989 1353 0.0083
2.4007 1354 0.0217
2.4025 1355 0.2555
2.4043 1356 0.0171
2.4060 1357 0.0028
2.4078 1358 0.0796
2.4096 1359 0.0054
2.4113 1360 0.1113
2.4131 1361 0.0291
2.4149 1362 0.0186
2.4167 1363 0.0248
2.4184 1364 0.0281
2.4202 1365 0.0386
2.4220 1366 0.0049
2.4238 1367 0.0023
2.4255 1368 0.0229
2.4273 1369 0.0043
2.4291 1370 0.0351
2.4309 1371 0.003
2.4326 1372 0.0593
2.4344 1373 0.0746
2.4362 1374 0.1464
2.4379 1375 0.0143
2.4397 1376 0.0871
2.4415 1377 0.034
2.4433 1378 0.0096
2.4450 1379 0.0507
2.4468 1380 0.0248
2.4486 1381 0.0131
2.4504 1382 0.0123
2.4521 1383 0.0303
2.4539 1384 0.0013
2.4557 1385 0.0902
2.4574 1386 0.0375
2.4592 1387 0.0978
2.4610 1388 0.0151
2.4628 1389 0.0139
2.4645 1390 0.1327
2.4663 1391 0.0248
2.4681 1392 0.0086
2.4699 1393 0.0006
2.4716 1394 0.0153
2.4734 1395 0.3766
2.4752 1396 0.0252
2.4770 1397 0.1675
2.4787 1398 0.0018
2.4805 1399 0.0526
2.4823 1400 0.0191
2.4840 1401 0.0077
2.4858 1402 0.0011
2.4876 1403 0.0261
2.4894 1404 0.0028
2.4911 1405 0.0012
2.4929 1406 0.0011
2.4947 1407 0.0015
2.4965 1408 0.0183
2.4982 1409 0.0376
2.5 1410 0.0343
2.5018 1411 0.0184
2.5035 1412 0.0068
2.5053 1413 0.0044
2.5071 1414 0.04
2.5089 1415 0.1035
2.5106 1416 0.0018
2.5124 1417 0.0578
2.5142 1418 0.0039
2.5160 1419 0.0002
2.5177 1420 0.0022
2.5195 1421 0.0005
2.5213 1422 0.0064
2.5230 1423 0.0239
2.5248 1424 0.0209
2.5266 1425 0.0026
2.5284 1426 0.0019
2.5301 1427 0.1177
2.5319 1428 0.0007
2.5337 1429 0.0173
2.5355 1430 0.0744
2.5372 1431 0.0078
2.5390 1432 0.0025
2.5408 1433 0.003
2.5426 1434 0.0116
2.5443 1435 0.0016
2.5461 1436 0.0018
2.5479 1437 0.0636
2.5496 1438 0.0021
2.5514 1439 0.0008
2.5532 1440 0.0048
2.5550 1441 0.0116
2.5567 1442 0.0701
2.5585 1443 0.003
2.5603 1444 0.0051
2.5621 1445 0.0265
2.5638 1446 0.0297
2.5656 1447 0.0062
2.5674 1448 0.0193
2.5691 1449 0.0042
2.5709 1450 0.0075
2.5727 1451 0.0033
2.5745 1452 0.0078
2.5762 1453 0.0662
2.5780 1454 0.0103
2.5798 1455 0.0138
2.5816 1456 0.0049
2.5833 1457 0.0023
2.5851 1458 0.0463
2.5869 1459 0.0539
2.5887 1460 0.0112
2.5904 1461 0.0088
2.5922 1462 0.0096
2.5940 1463 0.0063
2.5957 1464 0.004
2.5975 1465 0.0753
2.5993 1466 0.0013
2.6011 1467 0.0052
2.6028 1468 0.0162
2.6046 1469 0.0015
2.6064 1470 0.0194
2.6082 1471 0.0166
2.6099 1472 0.0015
2.6117 1473 0.0045
2.6135 1474 0.0275
2.6152 1475 0.192
2.6170 1476 0.0113
2.6188 1477 0.0165
2.6206 1478 0.0037
2.6223 1479 0.0031
2.6241 1480 0.0522
2.6259 1481 0.0251
2.6277 1482 0.0531
2.6294 1483 0.0165
2.6312 1484 0.0087
2.6330 1485 0.0982
2.6348 1486 0.0813
2.6365 1487 0.0023
2.6383 1488 0.0656
2.6401 1489 0.0128
2.6418 1490 0.053
2.6436 1491 0.0023
2.6454 1492 0.0314
2.6472 1493 0.0018
2.6489 1494 0.2133
2.6507 1495 0.02
2.6525 1496 0.0149
2.6543 1497 0.0045
2.6560 1498 0.2646
2.6578 1499 0.007
2.6596 1500 0.0031
2.6613 1501 0.0681
2.6631 1502 0.0075
2.6649 1503 0.0009
2.6667 1504 0.0212
2.6684 1505 0.0013
2.6702 1506 0.0118
2.6720 1507 0.0002
2.6738 1508 0.0069
2.6755 1509 0.0119
2.6773 1510 0.0193
2.6791 1511 0.0015
2.6809 1512 0.0486
2.6826 1513 0.156
2.6844 1514 0.02
2.6862 1515 0.0225
2.6879 1516 0.0024
2.6897 1517 0.0272
2.6915 1518 0.0115
2.6933 1519 0.0141
2.6950 1520 0.0155
2.6968 1521 0.0239
2.6986 1522 0.0088
2.7004 1523 0.0131
2.7021 1524 0.0035
2.7039 1525 0.3601
2.7057 1526 0.0384
2.7074 1527 0.0054
2.7092 1528 0.0023
2.7110 1529 0.0008
2.7128 1530 0.0622
2.7145 1531 0.0068
2.7163 1532 0.005
2.7181 1533 0.0466
2.7199 1534 0.0025
2.7216 1535 0.0124
2.7234 1536 0.0059
2.7252 1537 0.0068
2.7270 1538 0.0418
2.7287 1539 0.0108
2.7305 1540 0.0112
2.7323 1541 0.0085
2.7340 1542 0.0032
2.7358 1543 0.052
2.7376 1544 0.0423
2.7394 1545 0.0096
2.7411 1546 0.0291
2.7429 1547 0.0444
2.7447 1548 0.0047
2.7465 1549 0.0273
2.7482 1550 0.0106
2.75 1551 0.1274
2.7518 1552 0.0065
2.7535 1553 0.0033
2.7553 1554 0.0012
2.7571 1555 0.009
2.7589 1556 0.1048
2.7606 1557 0.0149
2.7624 1558 0.0807
2.7642 1559 0.0807
2.7660 1560 0.0103
2.7677 1561 0.038
2.7695 1562 0.0068
2.7713 1563 0.0529
2.7730 1564 0.1415
2.7748 1565 0.0168
2.7766 1566 0.0016
2.7784 1567 0.0017
2.7801 1568 0.0223
2.7819 1569 0.0137
2.7837 1570 0.0051
2.7855 1571 0.0054
2.7872 1572 0.0206
2.7890 1573 0.0465
2.7908 1574 0.0031
2.7926 1575 0.0006
2.7943 1576 0.0047
2.7961 1577 0.0086
2.7979 1578 0.0443
2.7996 1579 0.0099
2.8014 1580 0.0878
2.8032 1581 0.0042
2.8050 1582 0.1406
2.8067 1583 0.0034
2.8085 1584 0.0085
2.8103 1585 0.0118
2.8121 1586 0.0182
2.8138 1587 0.0013
2.8156 1588 0.0049
2.8174 1589 0.0104
2.8191 1590 0.0068
2.8209 1591 0.0017
2.8227 1592 0.004
2.8245 1593 0.0048
2.8262 1594 0.0253
2.8280 1595 0.0672
2.8298 1596 0.0008
2.8316 1597 0.0086
2.8333 1598 0.01
2.8351 1599 0.0165
2.8369 1600 0.1176
2.8387 1601 0.0025
2.8404 1602 0.0068
2.8422 1603 0.0829
2.8440 1604 0.0037
2.8457 1605 0.0347
2.8475 1606 0.0046
2.8493 1607 0.0129
2.8511 1608 0.0325
2.8528 1609 0.0039
2.8546 1610 0.0414
2.8564 1611 0.102
2.8582 1612 0.0935
2.8599 1613 0.0031
2.8617 1614 0.0125
2.8635 1615 0.0011
2.8652 1616 0.0041
2.8670 1617 0.0411
2.8688 1618 0.0029
2.8706 1619 0.1064
2.8723 1620 0.0229
2.8741 1621 0.0142
2.8759 1622 0.0847
2.8777 1623 0.0743
2.8794 1624 0.0019
2.8812 1625 0.0194
2.8830 1626 0.0019
2.8848 1627 0.0143
2.8865 1628 0.0011
2.8883 1629 0.0144
2.8901 1630 0.154
2.8918 1631 0.0092
2.8936 1632 0.0086
2.8954 1633 0.0052
2.8972 1634 0.1818
2.8989 1635 0.0022
2.9007 1636 0.003
2.9025 1637 0.0021
2.9043 1638 0.0091
2.9060 1639 0.0369
2.9078 1640 0.0007
2.9096 1641 0.007
2.9113 1642 0.0071
2.9131 1643 0.0345
2.9149 1644 0.0068
2.9167 1645 0.0063
2.9184 1646 0.0039
2.9202 1647 0.0262
2.9220 1648 0.0653
2.9238 1649 0.0144
2.9255 1650 0.014
2.9273 1651 0.0014
2.9291 1652 0.011
2.9309 1653 0.0104
2.9326 1654 0.0073
2.9344 1655 0.0245
2.9362 1656 0.1735
2.9379 1657 0.0188
2.9397 1658 0.0149
2.9415 1659 0.0186
2.9433 1660 0.0397
2.9450 1661 0.0529
2.9468 1662 0.0345
2.9486 1663 0.0121
2.9504 1664 0.0802
2.9521 1665 0.0051
2.9539 1666 0.0734
2.9557 1667 0.0739
2.9574 1668 0.0191
2.9592 1669 0.0362
2.9610 1670 0.007
2.9628 1671 0.0064
2.9645 1672 0.2386
2.9663 1673 0.0224
2.9681 1674 0.0007
2.9699 1675 0.0019
2.9716 1676 0.0333
2.9734 1677 0.0067
2.9752 1678 0.0052
2.9770 1679 0.0028
2.9787 1680 0.0462
2.9805 1681 0.0072
2.9823 1682 0.0023
2.9840 1683 0.01
2.9858 1684 0.0208
2.9876 1685 0.0189
2.9894 1686 0.002
2.9911 1687 0.0021
2.9929 1688 0.0479
2.9947 1689 0.0159
2.9965 1690 0.0618
2.9982 1691 0.0267
3.0 1692 0.3215

Framework Versions

  • Python: 3.10.14
  • Sentence Transformers: 5.1.1
  • Transformers: 4.56.2
  • PyTorch: 2.8.0+cu128
  • Accelerate: 1.10.1
  • Datasets: 4.1.1
  • Tokenizers: 0.22.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}