SentenceTransformer based on BAAI/bge-large-en-v1.5

This is a sentence-transformers model finetuned from BAAI/bge-large-en-v1.5. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-large-en-v1.5
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
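
In this pipeline, the BertModel backbone produces token embeddings, the Pooling module keeps only the [CLS] token (pooling_mode_cls_token: True), and Normalize() L2-normalizes the result, so dot products between embeddings equal cosine similarities. As a rough sketch, the same computation in plain transformers might look like the code below (it assumes the repository exposes the underlying BertModel and tokenizer at its root, the usual layout for sentence-transformers checkpoints):

import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("cyberbabooshka/bge_large_ft1")
bert = AutoModel.from_pretrained("cyberbabooshka/bge_large_ft1")

# (0) Transformer: tokenize (lowercasing is handled by the tokenizer config)
batch = tokenizer(["An example sentence."], padding=True, truncation=True,
                  max_length=512, return_tensors="pt")
with torch.no_grad():
    token_embeddings = bert(**batch).last_hidden_state  # (batch, seq, 1024)

# (1) Pooling with pooling_mode_cls_token=True: keep the [CLS] token
cls_embedding = token_embeddings[:, 0]
# (2) Normalize(): unit-length vectors, so dot product == cosine similarity
embedding = F.normalize(cls_embedding, p=2, dim=1)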

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("cyberbabooshka/bge_large_ft1")
# Run inference
sentences = [
    'What is the relationship between the smallest perturbation of a matrix and its rank, as established in theorems regarding matrix perturbations?',
    '"Suppose $A \\in C^{m \\times n}$ has full column rank (= n). Then $\\min _{\\Delta \\in \\mathbb{C}^{m \\times n}}\\left\\{\\|\\Delta\\|_{2} \\mid A+\\Delta \\text { has rank }<n\\right\\}=\\sigma_{n}(A)$."',
    '"If a beam of light enters and then exits the elevator, the observer on Earth and the one accelerating in empty space must observe the same thing, since they cannot distinguish between being on Earth or accelerating in space. The observer in space, who is accelerating, will observe that the beam of light bends as it crosses the elevator... that means that if the path of a beam of light is curved near Earth, it must be because space itself is curved in the presence of a gravitational field!"',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
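
Since the model is aimed at semantic search, a typical workflow embeds a corpus once and ranks it against each query embedding. A small illustrative sketch (the corpus and query below are made up):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("cyberbabooshka/bge_large_ft1")

corpus = [
    "A coloring is called proper if adjacent vertices receive different colors.",
    "Light bends near massive bodies because spacetime itself is curved.",
]
query = "How is a proper graph coloring defined?"

corpus_embeddings = model.encode(corpus)
query_embedding = model.encode([query])

# Embeddings are unit-normalized, so cosine similarity ranks the corpus.
scores = model.similarity(query_embedding, corpus_embeddings)  # shape [1, 2]
best = scores.argmax().item()
print(corpus[best])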

Evaluation

Metrics

Information Retrieval

Metric Value
cosine_accuracy@1 0.6048
cosine_accuracy@3 0.7357
cosine_accuracy@5 0.7833
cosine_accuracy@10 0.8286
cosine_precision@1 0.6048
cosine_precision@3 0.2452
cosine_precision@5 0.1567
cosine_precision@10 0.0829
cosine_recall@1 0.6048
cosine_recall@3 0.7357
cosine_recall@5 0.7833
cosine_recall@10 0.8286
cosine_ndcg@10 0.7135
cosine_mrr@10 0.6768
cosine_map@100 0.6824
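
The accuracy@k and recall@k columns coincide at every k, which is expected when each query has exactly one relevant document. Metrics of this form are typically produced with sentence-transformers' InformationRetrievalEvaluator; a minimal sketch, with placeholder queries, corpus, and relevance judgments:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("cyberbabooshka/bge_large_ft1")

# Placeholders: query id -> text, doc id -> text, query id -> relevant doc ids
queries = {"q1": "How is a proper coloring of a graph defined?"}
corpus = {
    "d1": "A coloring is called proper if adjacent vertices differ in color.",
    "d2": "The p orbitals are similar to the first excited state of the box.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs)
results = evaluator(model)  # dict with accuracy@k, precision@k, recall@k, ndcg, mrr, map
print(results)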

Training Details

Training Dataset

Unnamed Dataset

  • Size: 1,760 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor: string, min 9 / mean 24.87 / max 70 tokens
    • positive: string, min 11 / mean 68.37 / max 500 tokens
  • Samples:
    • anchor: How is a proper coloring of a graph defined in the context of vertices and edges?
      positive: "A coloring is called proper if for each edge joining two distinct vertices, the two vertices it joins have different colors."
    • anchor: What is the relationship between the first excited state of the box model and the p orbitals in a hydrogen atom?
      positive: "The p orbitals are similar to the first excited state of the box, i.e. $(n_{x},n_{y},n_{z})=(2,1,1)$ is similar to a $p_{x}$ orbital, $(n_{x},n_{y},n_{z})=(1,2,1)$ is similar to a $p_{y}$ orbital and $(n_{x},n_{y},n_{z})=(1,1,2)$ is similar to a $p_{z}$ orbital."
    • anchor: How can the behavior of the derivative $f'(x)$ indicate the presence of a local maximum or minimum at a critical point $x=a$?
      positive: "If there is a local maximum when $x=a$, the function must be lower near $x=a$ than it is right at $x=a$. If the derivative exists near $x=a$, this means $f'(x)>0$ when $x$ is near $a$ and $x<a$, because the function must 'slope up' just to the left of $a$. Similarly, $f'(x)<0$ when $x$ is near $a$ and $x>a$, because $f$ slopes down from the local maximum as we move to the right. Using the same reasoning, if there is a local minimum at $x=a$, the derivative of $f$ must be negative just to the left of $a$ and positive just to the right."
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
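
With these parameters, the loss scores each anchor against every positive in the batch using cosine similarity scaled by 20.0; the matching positive sits on the diagonal of the score matrix, and all other in-batch positives act as negatives for a cross-entropy objective. A self-contained sketch of that computation:

import torch
import torch.nn.functional as F

def mnr_loss(anchors: torch.Tensor, positives: torch.Tensor,
             scale: float = 20.0) -> torch.Tensor:
    # Entry (i, j) compares anchor i with positive j; the true pair for
    # row i is column i, every other column is an in-batch negative.
    scores = F.cosine_similarity(anchors.unsqueeze(1),
                                 positives.unsqueeze(0), dim=-1) * scale
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)

loss = mnr_loss(torch.randn(16, 1024), torch.randn(16, 1024))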
    

Evaluation Dataset

Unnamed Dataset

  • Size: 420 evaluation samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 420 samples:
    • anchor: string, min 12 / mean 24.97 / max 66 tokens
    • positive: string, min 7 / mean 68.52 / max 452 tokens
  • Samples:
    • anchor: What are the two central classes mentioned in the FileSystem framework and what do they represent?
      positive: "The class FileReference is the most important entry point to the framework." and "FileSystem is a powerful and elegant library to manipulate files."
    • anchor: What is the significance of Turing's work in the context of PDE-based models for self-organization of complex systems?
      positive: "Turing’s monumental work on the chemical basis of morphogenesis played an important role in igniting researchers’ attention to the PDE-based continuous field models as a mathematical framework to study self-organization of complex systems."
    • anchor: What are the two options for reducing accelerations as discussed in the passage?
      positive: "From the above definitions we see that there are really two options for reducing accelerations. We can reduce the amount that velocity changes, or we can increase the time over which the velocity changes (or both)."
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • learning_rate: 2e-05
  • weight_decay: 0.05
  • num_train_epochs: 10
  • warmup_ratio: 0.1
  • fp16: True
  • eval_on_start: True

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.05
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: True
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss eval_cosine_ndcg@10
0 0 - 0.1387 0.6755
0.0091 1 0.1723 - -
0.0182 2 0.1424 - -
0.0273 3 0.094 - -
0.0364 4 0.171 - -
0.0455 5 0.1704 - -
0.0545 6 0.2255 - -
0.0636 7 0.0829 - -
0.0727 8 0.1805 - -
0.0818 9 0.2143 - -
0.0909 10 0.1342 - -
0.1 11 0.1318 - -
0.1091 12 0.1247 - -
0.1182 13 0.1459 - -
0.1273 14 0.1489 - -
0.1364 15 0.0667 - -
0.1455 16 0.1029 - -
0.1545 17 0.0576 - -
0.1636 18 0.0617 - -
0.1727 19 0.0622 - -
0.1818 20 0.1624 - -
0.1909 21 0.1393 - -
0.2 22 0.1337 - -
0.2091 23 0.0386 - -
0.2182 24 0.0573 - -
0.2273 25 0.095 - -
0.2364 26 0.2023 - -
0.2455 27 0.0318 - -
0.2545 28 0.1544 - -
0.2636 29 0.0148 - -
0.2727 30 0.0235 - -
0.2818 31 0.319 - -
0.2909 32 0.0533 - -
0.3 33 0.1831 - -
0.3091 34 0.0278 - -
0.3182 35 0.0693 - -
0.3273 36 0.0296 - -
0.3364 37 0.0131 - -
0.3455 38 0.0279 - -
0.3545 39 0.0193 - -
0.3636 40 0.0193 - -
0.3727 41 0.0884 - -
0.3818 42 0.0248 - -
0.3909 43 0.0223 - -
0.4 44 0.0185 - -
0.4091 45 0.0243 - -
0.4182 46 0.1678 - -
0.4273 47 0.0174 - -
0.4364 48 0.1237 - -
0.4455 49 0.0601 - -
0.4545 50 0.0213 - -
0.4636 51 0.0171 - -
0.4727 52 0.0724 - -
0.4818 53 0.0798 - -
0.4909 54 0.0051 - -
0.5 55 0.0035 - -
0.5091 56 0.0143 - -
0.5182 57 0.0113 - -
0.5273 58 0.0393 - -
0.5364 59 0.0325 - -
0.5455 60 0.0287 - -
0.5545 61 0.0143 - -
0.5636 62 0.0124 - -
0.5727 63 0.0055 - -
0.5818 64 0.0215 - -
0.5909 65 0.1446 - -
0.6 66 0.3691 - -
0.6091 67 0.0713 - -
0.6182 68 0.03 - -
0.6273 69 0.0348 - -
0.6364 70 0.0091 - -
0.6455 71 0.0321 - -
0.6545 72 0.1088 - -
0.6636 73 0.0982 - -
0.6727 74 0.0311 - -
0.6818 75 0.0727 - -
0.6909 76 0.015 - -
0.7 77 0.1231 - -
0.7091 78 0.0963 - -
0.7182 79 0.0232 - -
0.7273 80 0.0256 - -
0.7364 81 0.0176 - -
0.7455 82 0.0494 - -
0.7545 83 0.0306 - -
0.7636 84 0.0427 - -
0.7727 85 0.1109 - -
0.7818 86 0.0317 - -
0.7909 87 0.024 - -
0.8 88 0.0093 - -
0.8091 89 0.0224 - -
0.8182 90 0.0283 - -
0.8273 91 0.1035 - -
0.8364 92 0.0067 - -
0.8455 93 0.0066 - -
0.8545 94 0.0649 - -
0.8636 95 0.0054 - -
0.8727 96 0.008 - -
0.8818 97 0.0114 - -
0.8909 98 0.0083 - -
0.9 99 0.0472 - -
0.9091 100 0.159 - -
0.9182 101 0.032 - -
0.9273 102 0.0335 - -
0.9364 103 0.0059 - -
0.9455 104 0.0038 - -
0.9545 105 0.0194 - -
0.9636 106 0.0233 - -
0.9727 107 0.0699 - -
0.9818 108 0.0042 - -
0.9909 109 0.0168 - -
1.0 110 0.0175 0.0277 0.7131
1.0091 111 0.0185 - -
1.0182 112 0.0133 - -
1.0273 113 0.0046 - -
1.0364 114 0.0074 - -
1.0455 115 0.0063 - -
1.0545 116 0.0082 - -
1.0636 117 0.0745 - -
1.0727 118 0.0117 - -
1.0818 119 0.0123 - -
1.0909 120 0.0226 - -
1.1 121 0.0031 - -
1.1091 122 0.0031 - -
1.1182 123 0.0064 - -
1.1273 124 0.0048 - -
1.1364 125 0.0037 - -
1.1455 126 0.04 - -
1.1545 127 0.0026 - -
1.1636 128 0.0043 - -
1.1727 129 0.1271 - -
1.1818 130 0.0026 - -
1.1909 131 0.0027 - -
1.2 132 0.0699 - -
1.2091 133 0.0321 - -
1.2182 134 0.0026 - -
1.2273 135 0.1767 - -
1.2364 136 0.0323 - -
1.2455 137 0.0115 - -
1.2545 138 0.0026 - -
1.2636 139 0.0355 - -
1.2727 140 0.0036 - -
1.2818 141 0.0036 - -
1.2909 142 0.003 - -
1.3 143 0.0043 - -
1.3091 144 0.0027 - -
1.3182 145 0.0031 - -
1.3273 146 0.0224 - -
1.3364 147 0.0013 - -
1.3455 148 0.0841 - -
1.3545 149 0.0045 - -
1.3636 150 0.0069 - -
1.3727 151 0.0048 - -
1.3818 152 0.0023 - -
1.3909 153 0.0041 - -
1.4 154 0.0106 - -
1.4091 155 0.0015 - -
1.4182 156 0.0015 - -
1.4273 157 0.001 - -
1.4364 158 0.005 - -
1.4455 159 0.0693 - -
1.4545 160 0.0921 - -
1.4636 161 0.0033 - -
1.4727 162 0.0176 - -
1.4818 163 0.0038 - -
1.4909 164 0.0024 - -
1.5 165 0.01 - -
1.5091 166 0.0009 - -
1.5182 167 0.0121 - -
1.5273 168 0.0109 - -
1.5364 169 0.0099 - -
1.5455 170 0.0079 - -
1.5545 171 0.0111 - -
1.5636 172 0.0113 - -
1.5727 173 0.0078 - -
1.5818 174 0.0023 - -
1.5909 175 0.0036 - -
1.6 176 0.0233 - -
1.6091 177 0.0188 - -
1.6182 178 0.0024 - -
1.6273 179 0.0008 - -
1.6364 180 0.0023 - -
1.6455 181 0.0008 - -
1.6545 182 0.0037 - -
1.6636 183 0.0004 - -
1.6727 184 0.0026 - -
1.6818 185 0.0045 - -
1.6909 186 0.0032 - -
1.7 187 0.0405 - -
1.7091 188 0.0146 - -
1.7182 189 0.009 - -
1.7273 190 0.0067 - -
1.7364 191 0.01 - -
1.7455 192 0.0099 - -
1.7545 193 0.1142 - -
1.7636 194 0.0213 - -
1.7727 195 0.0037 - -
1.7818 196 0.0073 - -
1.7909 197 0.0236 - -
1.8 198 0.0165 - -
1.8091 199 0.002 - -
1.8182 200 0.0016 - -
1.8273 201 0.0044 - -
1.8364 202 0.0151 - -
1.8455 203 0.0115 - -
1.8545 204 0.0023 - -
1.8636 205 0.007 - -
1.8727 206 0.0426 - -
1.8818 207 0.0019 - -
1.8909 208 0.0421 - -
1.9 209 0.0107 - -
1.9091 210 0.004 - -
1.9182 211 0.005 - -
1.9273 212 0.0038 - -
1.9364 213 0.0159 - -
1.9455 214 0.0039 - -
1.9545 215 0.0032 - -
1.9636 216 0.0038 - -
1.9727 217 0.0042 - -
1.9818 218 0.0077 - -
1.9909 219 0.0041 - -
2.0 220 0.0035 0.0231 0.7277
2.0091 221 0.0252 - -
2.0182 222 0.0031 - -
2.0273 223 0.0156 - -
2.0364 224 0.0029 - -
2.0455 225 0.0027 - -
2.0545 226 0.004 - -
2.0636 227 0.0013 - -
2.0727 228 0.0021 - -
2.0818 229 0.0616 - -
2.0909 230 0.0011 - -
2.1 231 0.0008 - -
2.1091 232 0.0014 - -
2.1182 233 0.0068 - -
2.1273 234 0.0045 - -
2.1364 235 0.0009 - -
2.1455 236 0.0009 - -
2.1545 237 0.0025 - -
2.1636 238 0.0074 - -
2.1727 239 0.0021 - -
2.1818 240 0.0021 - -
2.1909 241 0.0032 - -
2.2 242 0.0005 - -
2.2091 243 0.0025 - -
2.2182 244 0.001 - -
2.2273 245 0.0512 - -
2.2364 246 0.0015 - -
2.2455 247 0.0023 - -
2.2545 248 0.0058 - -
2.2636 249 0.0025 - -
2.2727 250 0.0016 - -
2.2818 251 0.0053 - -
2.2909 252 0.0043 - -
2.3 253 0.0015 - -
2.3091 254 0.0022 - -
2.3182 255 0.0019 - -
2.3273 256 0.001 - -
2.3364 257 0.0019 - -
2.3455 258 0.002 - -
2.3545 259 0.0037 - -
2.3636 260 0.0012 - -
2.3727 261 0.0294 - -
2.3818 262 0.0306 - -
2.3909 263 0.0014 - -
2.4 264 0.001 - -
2.4091 265 0.0013 - -
2.4182 266 0.0048 - -
2.4273 267 0.0071 - -
2.4364 268 0.0072 - -
2.4455 269 0.0031 - -
2.4545 270 0.001 - -
2.4636 271 0.0009 - -
2.4727 272 0.0014 - -
2.4818 273 0.0007 - -
2.4909 274 0.0035 - -
2.5 275 0.0124 - -
2.5091 276 0.0014 - -
2.5182 277 0.0011 - -
2.5273 278 0.0007 - -
2.5364 279 0.0055 - -
2.5455 280 0.0141 - -
2.5545 281 0.0013 - -
2.5636 282 0.0203 - -
2.5727 283 0.0026 - -
2.5818 284 0.0006 - -
2.5909 285 0.0026 - -
2.6 286 0.001 - -
2.6091 287 0.0649 - -
2.6182 288 0.0083 - -
2.6273 289 0.0011 - -
2.6364 290 0.002 - -
2.6455 291 0.0013 - -
2.6545 292 0.0086 - -
2.6636 293 0.1389 - -
2.6727 294 0.0006 - -
2.6818 295 0.0024 - -
2.6909 296 0.0013 - -
2.7 297 0.003 - -
2.7091 298 0.0016 - -
2.7182 299 0.001 - -
2.7273 300 0.0016 - -
2.7364 301 0.022 - -
2.7455 302 0.0022 - -
2.7545 303 0.0014 - -
2.7636 304 0.0013 - -
2.7727 305 0.0047 - -
2.7818 306 0.0015 - -
2.7909 307 0.0009 - -
2.8 308 0.0006 - -
2.8091 309 0.0029 - -
2.8182 310 0.0009 - -
2.8273 311 0.0098 - -
2.8364 312 0.0077 - -
2.8455 313 0.0018 - -
2.8545 314 0.0093 - -
2.8636 315 0.0072 - -
2.8727 316 0.0013 - -
2.8818 317 0.0169 - -
2.8909 318 0.0014 - -
2.9 319 0.0007 - -
2.9091 320 0.0013 - -
2.9182 321 0.001 - -
2.9273 322 0.0111 - -
2.9364 323 0.0039 - -
2.9455 324 0.0014 - -
2.9545 325 0.0013 - -
2.9636 326 0.0008 - -
2.9727 327 0.006 - -
2.9818 328 0.0006 - -
2.9909 329 0.0008 - -
3.0 330 0.0019 0.0281 0.7094
3.0091 331 0.0007 - -
3.0182 332 0.0013 - -
3.0273 333 0.0008 - -
3.0364 334 0.0033 - -
3.0455 335 0.0007 - -
3.0545 336 0.0011 - -
3.0636 337 0.0828 - -
3.0727 338 0.0023 - -
3.0818 339 0.0004 - -
3.0909 340 0.001 - -
3.1 341 0.0003 - -
3.1091 342 0.0085 - -
3.1182 343 0.0009 - -
3.1273 344 0.0162 - -
3.1364 345 0.0022 - -
3.1455 346 0.0049 - -
3.1545 347 0.0018 - -
3.1636 348 0.0017 - -
3.1727 349 0.0061 - -
3.1818 350 0.0432 - -
3.1909 351 0.0064 - -
3.2 352 0.0016 - -
3.2091 353 0.002 - -
3.2182 354 0.0157 - -
3.2273 355 0.0004 - -
3.2364 356 0.0153 - -
3.2455 357 0.0016 - -
3.2545 358 0.0009 - -
3.2636 359 0.0025 - -
3.2727 360 0.0049 - -
3.2818 361 0.0036 - -
3.2909 362 0.0024 - -
3.3 363 0.0008 - -
3.3091 364 0.0042 - -
3.3182 365 0.0014 - -
3.3273 366 0.0044 - -
3.3364 367 0.0547 - -
3.3455 368 0.0015 - -
3.3545 369 0.0072 - -
3.3636 370 0.0012 - -
3.3727 371 0.0017 - -
3.3818 372 0.0014 - -
3.3909 373 0.0038 - -
3.4 374 0.0071 - -
3.4091 375 0.0004 - -
3.4182 376 0.0007 - -
3.4273 377 0.0014 - -
3.4364 378 0.0053 - -
3.4455 379 0.0043 - -
3.4545 380 0.0017 - -
3.4636 381 0.0011 - -
3.4727 382 0.0004 - -
3.4818 383 0.0034 - -
3.4909 384 0.0016 - -
3.5 385 0.0015 - -
3.5091 386 0.0009 - -
3.5182 387 0.0045 - -
3.5273 388 0.0067 - -
3.5364 389 0.0011 - -
3.5455 390 0.0024 - -
3.5545 391 0.0025 - -
3.5636 392 0.0007 - -
3.5727 393 0.0008 - -
3.5818 394 0.0006 - -
3.5909 395 0.0036 - -
3.6 396 0.0004 - -
3.6091 397 0.001 - -
3.6182 398 0.0009 - -
3.6273 399 0.0014 - -
3.6364 400 0.0006 - -
3.6455 401 0.0053 - -
3.6545 402 0.0008 - -
3.6636 403 0.0008 - -
3.6727 404 0.0016 - -
3.6818 405 0.0029 - -
3.6909 406 0.0046 - -
3.7 407 0.0009 - -
3.7091 408 0.0107 - -
3.7182 409 0.001 - -
3.7273 410 0.005 - -
3.7364 411 0.0111 - -
3.7455 412 0.0026 - -
3.7545 413 0.0003 - -
3.7636 414 0.0074 - -
3.7727 415 0.0008 - -
3.7818 416 0.0007 - -
3.7909 417 0.0006 - -
3.8 418 0.0021 - -
3.8091 419 0.0009 - -
3.8182 420 0.0373 - -
3.8273 421 0.0009 - -
3.8364 422 0.0008 - -
3.8455 423 0.0004 - -
3.8545 424 0.0026 - -
3.8636 425 0.0022 - -
3.8727 426 0.0016 - -
3.8818 427 0.0867 - -
3.8909 428 0.002 - -
3.9 429 0.0022 - -
3.9091 430 0.013 - -
3.9182 431 0.005 - -
3.9273 432 0.0007 - -
3.9364 433 0.0079 - -
3.9455 434 0.0015 - -
3.9545 435 0.0003 - -
3.9636 436 0.0009 - -
3.9727 437 0.0005 - -
3.9818 438 0.0015 - -
3.9909 439 0.0113 - -
4.0 440 0.164 0.0230 0.7213
4.0091 441 0.0032 - -
4.0182 442 0.0006 - -
4.0273 443 0.0105 - -
4.0364 444 0.0007 - -
4.0455 445 0.0046 - -
4.0545 446 0.002 - -
4.0636 447 0.0008 - -
4.0727 448 0.0008 - -
4.0818 449 0.0019 - -
4.0909 450 0.0321 - -
4.1 451 0.0253 - -
4.1091 452 0.0008 - -
4.1182 453 0.0011 - -
4.1273 454 0.0008 - -
4.1364 455 0.0047 - -
4.1455 456 0.0004 - -
4.1545 457 0.0008 - -
4.1636 458 0.0085 - -
4.1727 459 0.0012 - -
4.1818 460 0.0013 - -
4.1909 461 0.001 - -
4.2 462 0.0008 - -
4.2091 463 0.0259 - -
4.2182 464 0.0015 - -
4.2273 465 0.0008 - -
4.2364 466 0.0043 - -
4.2455 467 0.0013 - -
4.2545 468 0.0009 - -
4.2636 469 0.002 - -
4.2727 470 0.0041 - -
4.2818 471 0.0008 - -
4.2909 472 0.0776 - -
4.3 473 0.0016 - -
4.3091 474 0.0014 - -
4.3182 475 0.0039 - -
4.3273 476 0.0299 - -
4.3364 477 0.0017 - -
4.3455 478 0.002 - -
4.3545 479 0.001 - -
4.3636 480 0.0052 - -
4.3727 481 0.0012 - -
4.3818 482 0.0005 - -
4.3909 483 0.0014 - -
4.4 484 0.0107 - -
4.4091 485 0.0014 - -
4.4182 486 0.0008 - -
4.4273 487 0.0025 - -
4.4364 488 0.0016 - -
4.4455 489 0.0006 - -
4.4545 490 0.0007 - -
4.4636 491 0.0009 - -
4.4727 492 0.0012 - -
4.4818 493 0.001 - -
4.4909 494 0.0013 - -
4.5 495 0.0015 - -
4.5091 496 0.001 - -
4.5182 497 0.0008 - -
4.5273 498 0.0008 - -
4.5364 499 0.0022 - -
4.5455 500 0.0008 - -
4.5545 501 0.0006 - -
4.5636 502 0.005 - -
4.5727 503 0.0017 - -
4.5818 504 0.0012 - -
4.5909 505 0.0021 - -
4.6 506 0.012 - -
4.6091 507 0.0021 - -
4.6182 508 0.0058 - -
4.6273 509 0.007 - -
4.6364 510 0.0006 - -
4.6455 511 0.0021 - -
4.6545 512 0.0019 - -
4.6636 513 0.0031 - -
4.6727 514 0.005 - -
4.6818 515 0.0004 - -
4.6909 516 0.0008 - -
4.7 517 0.0042 - -
4.7091 518 0.0011 - -
4.7182 519 0.0003 - -
4.7273 520 0.0007 - -
4.7364 521 0.0007 - -
4.7455 522 0.0006 - -
4.7545 523 0.0012 - -
4.7636 524 0.003 - -
4.7727 525 0.0007 - -
4.7818 526 0.0005 - -
4.7909 527 0.0012 - -
4.8 528 0.0013 - -
4.8091 529 0.0028 - -
4.8182 530 0.0004 - -
4.8273 531 0.0011 - -
4.8364 532 0.0054 - -
4.8455 533 0.0006 - -
4.8545 534 0.0352 - -
4.8636 535 0.001 - -
4.8727 536 0.0003 - -
4.8818 537 0.0017 - -
4.8909 538 0.0016 - -
4.9 539 0.0006 - -
4.9091 540 0.0007 - -
4.9182 541 0.0006 - -
4.9273 542 0.0042 - -
4.9364 543 0.0012 - -
4.9455 544 0.0016 - -
4.9545 545 0.0014 - -
4.9636 546 0.0004 - -
4.9727 547 0.0006 - -
4.9818 548 0.0023 - -
4.9909 549 0.0003 - -
5.0 550 0.0038 0.0314 0.7022
5.0091 551 0.0006 - -
5.0182 552 0.0005 - -
5.0273 553 0.0011 - -
5.0364 554 0.0024 - -
5.0455 555 0.0039 - -
5.0545 556 0.0003 - -
5.0636 557 0.001 - -
5.0727 558 0.0017 - -
5.0818 559 0.0004 - -
5.0909 560 0.0005 - -
5.1 561 0.0014 - -
5.1091 562 0.0029 - -
5.1182 563 0.0004 - -
5.1273 564 0.0004 - -
5.1364 565 0.0009 - -
5.1455 566 0.007 - -
5.1545 567 0.0018 - -
5.1636 568 0.0004 - -
5.1727 569 0.0003 - -
5.1818 570 0.0299 - -
5.1909 571 0.0008 - -
5.2 572 0.0126 - -
5.2091 573 0.0002 - -
5.2182 574 0.0303 - -
5.2273 575 0.0003 - -
5.2364 576 0.0005 - -
5.2455 577 0.0012 - -
5.2545 578 0.0009 - -
5.2636 579 0.0074 - -
5.2727 580 0.0007 - -
5.2818 581 0.0009 - -
5.2909 582 0.0007 - -
5.3 583 0.0063 - -
5.3091 584 0.0008 - -
5.3182 585 0.0009 - -
5.3273 586 0.002 - -
5.3364 587 0.0003 - -
5.3455 588 0.0026 - -
5.3545 589 0.0008 - -
5.3636 590 0.0008 - -
5.3727 591 0.0008 - -
5.3818 592 0.0011 - -
5.3909 593 0.0003 - -
5.4 594 0.0009 - -
5.4091 595 0.0017 - -
5.4182 596 0.0018 - -
5.4273 597 0.0004 - -
5.4364 598 0.001 - -
5.4455 599 0.0139 - -
5.4545 600 0.0006 - -
5.4636 601 0.001 - -
5.4727 602 0.0011 - -
5.4818 603 0.0004 - -
5.4909 604 0.0004 - -
5.5 605 0.0005 - -
5.5091 606 0.0061 - -
5.5182 607 0.0004 - -
5.5273 608 0.0005 - -
5.5364 609 0.0045 - -
5.5455 610 0.0025 - -
5.5545 611 0.0011 - -
5.5636 612 0.0019 - -
5.5727 613 0.0004 - -
5.5818 614 0.0049 - -
5.5909 615 0.0006 - -
5.6 616 0.0017 - -
5.6091 617 0.0017 - -
5.6182 618 0.0005 - -
5.6273 619 0.0003 - -
5.6364 620 0.0019 - -
5.6455 621 0.0005 - -
5.6545 622 0.0009 - -
5.6636 623 0.0049 - -
5.6727 624 0.0005 - -
5.6818 625 0.0006 - -
5.6909 626 0.0006 - -
5.7 627 0.0019 - -
5.7091 628 0.0037 - -
5.7182 629 0.0005 - -
5.7273 630 0.0009 - -
5.7364 631 0.0013 - -
5.7455 632 0.0008 - -
5.7545 633 0.0056 - -
5.7636 634 0.0007 - -
5.7727 635 0.0003 - -
5.7818 636 0.001 - -
5.7909 637 0.0009 - -
5.8 638 0.0003 - -
5.8091 639 0.0005 - -
5.8182 640 0.0006 - -
5.8273 641 0.003 - -
5.8364 642 0.0003 - -
5.8455 643 0.0016 - -
5.8545 644 0.0015 - -
5.8636 645 0.001 - -
5.8727 646 0.0006 - -
5.8818 647 0.0004 - -
5.8909 648 0.0399 - -
5.9 649 0.0004 - -
5.9091 650 0.0003 - -
5.9182 651 0.0006 - -
5.9273 652 0.0015 - -
5.9364 653 0.0012 - -
5.9455 654 0.0435 - -
5.9545 655 0.0007 - -
5.9636 656 0.0011 - -
5.9727 657 0.0019 - -
5.9818 658 0.0023 - -
5.9909 659 0.0003 - -
6.0 660 0.0004 0.0282 0.7136
6.0091 661 0.0002 - -
6.0182 662 0.0019 - -
6.0273 663 0.0004 - -
6.0364 664 0.0005 - -
6.0455 665 0.0018 - -
6.0545 666 0.0009 - -
6.0636 667 0.0098 - -
6.0727 668 0.0003 - -
6.0818 669 0.0014 - -
6.0909 670 0.001 - -
6.1 671 0.0004 - -
6.1091 672 0.0006 - -
6.1182 673 0.0085 - -
6.1273 674 0.0015 - -
6.1364 675 0.0007 - -
6.1455 676 0.0082 - -
6.1545 677 0.0047 - -
6.1636 678 0.0014 - -
6.1727 679 0.0042 - -
6.1818 680 0.0006 - -
6.1909 681 0.0013 - -
6.2 682 0.0003 - -
6.2091 683 0.0007 - -
6.2182 684 0.0007 - -
6.2273 685 0.0009 - -
6.2364 686 0.0017 - -
6.2455 687 0.0007 - -
6.2545 688 0.0017 - -
6.2636 689 0.0006 - -
6.2727 690 0.0005 - -
6.2818 691 0.0009 - -
6.2909 692 0.0275 - -
6.3 693 0.0008 - -
6.3091 694 0.0011 - -
6.3182 695 0.0012 - -
6.3273 696 0.0012 - -
6.3364 697 0.0002 - -
6.3455 698 0.0003 - -
6.3545 699 0.0009 - -
6.3636 700 0.0136 - -
6.3727 701 0.0008 - -
6.3818 702 0.0003 - -
6.3909 703 0.0046 - -
6.4 704 0.0011 - -
6.4091 705 0.0049 - -
6.4182 706 0.0005 - -
6.4273 707 0.0004 - -
6.4364 708 0.0006 - -
6.4455 709 0.0024 - -
6.4545 710 0.0108 - -
6.4636 711 0.0005 - -
6.4727 712 0.0081 - -
6.4818 713 0.0004 - -
6.4909 714 0.0015 - -
6.5 715 0.0006 - -
6.5091 716 0.0007 - -
6.5182 717 0.0019 - -
6.5273 718 0.0003 - -
6.5364 719 0.0006 - -
6.5455 720 0.0003 - -
6.5545 721 0.0003 - -
6.5636 722 0.0005 - -
6.5727 723 0.0013 - -
6.5818 724 0.001 - -
6.5909 725 0.0083 - -
6.6 726 0.0002 - -
6.6091 727 0.0006 - -
6.6182 728 0.0073 - -
6.6273 729 0.0004 - -
6.6364 730 0.0009 - -
6.6455 731 0.0004 - -
6.6545 732 0.0018 - -
6.6636 733 0.0002 - -
6.6727 734 0.0006 - -
6.6818 735 0.0007 - -
6.6909 736 0.0061 - -
6.7 737 0.001 - -
6.7091 738 0.0008 - -
6.7182 739 0.0005 - -
6.7273 740 0.0021 - -
6.7364 741 0.0012 - -
6.7455 742 0.0005 - -
6.7545 743 0.0013 - -
6.7636 744 0.0013 - -
6.7727 745 0.0003 - -
6.7818 746 0.0017 - -
6.7909 747 0.0002 - -
6.8 748 0.0004 - -
6.8091 749 0.0027 - -
6.8182 750 0.0005 - -
6.8273 751 0.0004 - -
6.8364 752 0.0005 - -
6.8455 753 0.0004 - -
6.8545 754 0.0005 - -
6.8636 755 0.001 - -
6.8727 756 0.0007 - -
6.8818 757 0.0003 - -
6.8909 758 0.001 - -
6.9 759 0.0012 - -
6.9091 760 0.0009 - -
6.9182 761 0.0025 - -
6.9273 762 0.0006 - -
6.9364 763 0.0011 - -
6.9455 764 0.0006 - -
6.9545 765 0.0005 - -
6.9636 766 0.0014 - -
6.9727 767 0.0016 - -
6.9818 768 0.0061 - -
6.9909 769 0.0111 - -
7.0 770 0.0003 0.0269 0.7149
7.0091 771 0.0004 - -
7.0182 772 0.0015 - -
7.0273 773 0.0007 - -
7.0364 774 0.0014 - -
7.0455 775 0.076 - -
7.0545 776 0.0002 - -
7.0636 777 0.0379 - -
7.0727 778 0.0002 - -
7.0818 779 0.0006 - -
7.0909 780 0.0007 - -
7.1 781 0.0017 - -
7.1091 782 0.0003 - -
7.1182 783 0.0005 - -
7.1273 784 0.0062 - -
7.1364 785 0.0012 - -
7.1455 786 0.0017 - -
7.1545 787 0.0003 - -
7.1636 788 0.0008 - -
7.1727 789 0.0008 - -
7.1818 790 0.0008 - -
7.1909 791 0.0004 - -
7.2 792 0.0003 - -
7.2091 793 0.0012 - -
7.2182 794 0.0007 - -
7.2273 795 0.0004 - -
7.2364 796 0.0024 - -
7.2455 797 0.0003 - -
7.2545 798 0.002 - -
7.2636 799 0.0068 - -
7.2727 800 0.0008 - -
7.2818 801 0.0003 - -
7.2909 802 0.0031 - -
7.3 803 0.0089 - -
7.3091 804 0.0005 - -
7.3182 805 0.0041 - -
7.3273 806 0.002 - -
7.3364 807 0.0005 - -
7.3455 808 0.0003 - -
7.3545 809 0.0005 - -
7.3636 810 0.0006 - -
7.3727 811 0.0008 - -
7.3818 812 0.0003 - -
7.3909 813 0.0014 - -
7.4 814 0.0007 - -
7.4091 815 0.0005 - -
7.4182 816 0.0015 - -
7.4273 817 0.0007 - -
7.4364 818 0.0006 - -
7.4455 819 0.022 - -
7.4545 820 0.0034 - -
7.4636 821 0.0016 - -
7.4727 822 0.0006 - -
7.4818 823 0.0061 - -
7.4909 824 0.0025 - -
7.5 825 0.0005 - -
7.5091 826 0.0018 - -
7.5182 827 0.0003 - -
7.5273 828 0.0007 - -
7.5364 829 0.0004 - -
7.5455 830 0.0006 - -
7.5545 831 0.0003 - -
7.5636 832 0.001 - -
7.5727 833 0.0007 - -
7.5818 834 0.0007 - -
7.5909 835 0.0002 - -
7.6 836 0.0633 - -
7.6091 837 0.0003 - -
7.6182 838 0.0006 - -
7.6273 839 0.0007 - -
7.6364 840 0.0007 - -
7.6455 841 0.0011 - -
7.6545 842 0.0005 - -
7.6636 843 0.0009 - -
7.6727 844 0.0002 - -
7.6818 845 0.0037 - -
7.6909 846 0.0031 - -
7.7 847 0.0005 - -
7.7091 848 0.0005 - -
7.7182 849 0.0011 - -
7.7273 850 0.0003 - -
7.7364 851 0.0002 - -
7.7455 852 0.0006 - -
7.7545 853 0.0013 - -
7.7636 854 0.0003 - -
7.7727 855 0.0013 - -
7.7818 856 0.0002 - -
7.7909 857 0.0007 - -
7.8 858 0.0002 - -
7.8091 859 0.0186 - -
7.8182 860 0.0008 - -
7.8273 861 0.0003 - -
7.8364 862 0.0012 - -
7.8455 863 0.0002 - -
7.8545 864 0.0004 - -
7.8636 865 0.0015 - -
7.8727 866 0.0038 - -
7.8818 867 0.0006 - -
7.8909 868 0.0008 - -
7.9 869 0.0007 - -
7.9091 870 0.0005 - -
7.9182 871 0.0008 - -
7.9273 872 0.0008 - -
7.9364 873 0.0007 - -
7.9455 874 0.0027 - -
7.9545 875 0.0011 - -
7.9636 876 0.0003 - -
7.9727 877 0.0007 - -
7.9818 878 0.0008 - -
7.9909 879 0.0006 - -
8.0 880 0.0017 0.0299 0.7149
8.0091 881 0.0041 - -
8.0182 882 0.0012 - -
8.0273 883 0.0019 - -
8.0364 884 0.0007 - -
8.0455 885 0.0376 - -
8.0545 886 0.0007 - -
8.0636 887 0.0005 - -
8.0727 888 0.0022 - -
8.0818 889 0.0013 - -
8.0909 890 0.0006 - -
8.1 891 0.0009 - -
8.1091 892 0.0007 - -
8.1182 893 0.0006 - -
8.1273 894 0.0019 - -
8.1364 895 0.0019 - -
8.1455 896 0.0004 - -
8.1545 897 0.0004 - -
8.1636 898 0.0004 - -
8.1727 899 0.0008 - -
8.1818 900 0.0006 - -
8.1909 901 0.0007 - -
8.2 902 0.0005 - -
8.2091 903 0.0008 - -
8.2182 904 0.0002 - -
8.2273 905 0.0003 - -
8.2364 906 0.0006 - -
8.2455 907 0.0005 - -
8.2545 908 0.0002 - -
8.2636 909 0.0024 - -
8.2727 910 0.0006 - -
8.2818 911 0.0004 - -
8.2909 912 0.0017 - -
8.3 913 0.0006 - -
8.3091 914 0.0005 - -
8.3182 915 0.0007 - -
8.3273 916 0.0007 - -
8.3364 917 0.0016 - -
8.3455 918 0.0003 - -
8.3545 919 0.0008 - -
8.3636 920 0.0006 - -
8.3727 921 0.0016 - -
8.3818 922 0.0002 - -
8.3909 923 0.0003 - -
8.4 924 0.0178 - -
8.4091 925 0.0005 - -
8.4182 926 0.0017 - -
8.4273 927 0.0007 - -
8.4364 928 0.0004 - -
8.4455 929 0.0005 - -
8.4545 930 0.0039 - -
8.4636 931 0.0002 - -
8.4727 932 0.0008 - -
8.4818 933 0.0005 - -
8.4909 934 0.0013 - -
8.5 935 0.0006 - -
8.5091 936 0.0003 - -
8.5182 937 0.0003 - -
8.5273 938 0.0006 - -
8.5364 939 0.0009 - -
8.5455 940 0.0005 - -
8.5545 941 0.0003 - -
8.5636 942 0.0006 - -
8.5727 943 0.0004 - -
8.5818 944 0.0002 - -
8.5909 945 0.0002 - -
8.6 946 0.0015 - -
8.6091 947 0.0008 - -
8.6182 948 0.0004 - -
8.6273 949 0.0006 - -
8.6364 950 0.0002 - -
8.6455 951 0.0031 - -
8.6545 952 0.0004 - -
8.6636 953 0.0006 - -
8.6727 954 0.0026 - -
8.6818 955 0.004 - -
8.6909 956 0.0006 - -
8.7 957 0.0015 - -
8.7091 958 0.002 - -
8.7182 959 0.0004 - -
8.7273 960 0.0001 - -
8.7364 961 0.0021 - -
8.7455 962 0.0038 - -
8.7545 963 0.0012 - -
8.7636 964 0.001 - -
8.7727 965 0.0014 - -
8.7818 966 0.004 - -
8.7909 967 0.0011 - -
8.8 968 0.0002 - -
8.8091 969 0.0005 - -
8.8182 970 0.0006 - -
8.8273 971 0.0005 - -
8.8364 972 0.0012 - -
8.8455 973 0.0004 - -
8.8545 974 0.0006 - -
8.8636 975 0.0006 - -
8.8727 976 0.0004 - -
8.8818 977 0.0002 - -
8.8909 978 0.0006 - -
8.9 979 0.0007 - -
8.9091 980 0.0013 - -
8.9182 981 0.001 - -
8.9273 982 0.0046 - -
8.9364 983 0.0124 - -
8.9455 984 0.0004 - -
8.9545 985 0.0005 - -
8.9636 986 0.0102 - -
8.9727 987 0.0017 - -
8.9818 988 0.0004 - -
8.9909 989 0.0014 - -
9.0 990 0.0208 0.0292 0.7135
9.0091 991 0.0004 - -
9.0182 992 0.0005 - -
9.0273 993 0.0004 - -
9.0364 994 0.0001 - -
9.0455 995 0.0008 - -
9.0545 996 0.0066 - -
9.0636 997 0.0006 - -
9.0727 998 0.0006 - -
9.0818 999 0.0002 - -
9.0909 1000 0.0006 - -
9.1 1001 0.0002 - -
9.1091 1002 0.0006 - -
9.1182 1003 0.0002 - -
9.1273 1004 0.0002 - -
9.1364 1005 0.0017 - -
9.1455 1006 0.0056 - -
9.1545 1007 0.0015 - -
9.1636 1008 0.0002 - -
9.1727 1009 0.0005 - -
9.1818 1010 0.0003 - -
9.1909 1011 0.0011 - -
9.2 1012 0.0018 - -
9.2091 1013 0.0008 - -
9.2182 1014 0.0004 - -
9.2273 1015 0.0024 - -
9.2364 1016 0.0003 - -
9.2455 1017 0.0005 - -
9.2545 1018 0.0003 - -
9.2636 1019 0.0004 - -
9.2727 1020 0.0003 - -
9.2818 1021 0.0013 - -
9.2909 1022 0.0004 - -
9.3 1023 0.0002 - -
9.3091 1024 0.0003 - -
9.3182 1025 0.0007 - -
9.3273 1026 0.0011 - -
9.3364 1027 0.0006 - -
9.3455 1028 0.0002 - -
9.3545 1029 0.1605 - -
9.3636 1030 0.0096 - -
9.3727 1031 0.0003 - -
9.3818 1032 0.0006 - -
9.3909 1033 0.0004 - -
9.4 1034 0.0014 - -
9.4091 1035 0.0004 - -
9.4182 1036 0.0006 - -
9.4273 1037 0.0616 - -
9.4364 1038 0.0359 - -
9.4455 1039 0.0003 - -
9.4545 1040 0.0008 - -
9.4636 1041 0.0004 - -
9.4727 1042 0.0004 - -
9.4818 1043 0.0005 - -
9.4909 1044 0.0001 - -
9.5 1045 0.0004 - -
9.5091 1046 0.0006 - -
9.5182 1047 0.0008 - -
9.5273 1048 0.0002 - -
9.5364 1049 0.0005 - -
9.5455 1050 0.0004 - -
9.5545 1051 0.0009 - -
9.5636 1052 0.0006 - -
9.5727 1053 0.0004 - -
9.5818 1054 0.001 - -
9.5909 1055 0.0007 - -
9.6 1056 0.0004 - -
9.6091 1057 0.0002 - -
9.6182 1058 0.0005 - -
9.6273 1059 0.0004 - -
9.6364 1060 0.0572 - -
9.6455 1061 0.0002 - -
9.6545 1062 0.0007 - -
9.6636 1063 0.0004 - -
9.6727 1064 0.0072 - -
9.6818 1065 0.0004 - -
9.6909 1066 0.0007 - -
9.7 1067 0.0007 - -
9.7091 1068 0.0008 - -
9.7182 1069 0.0003 - -
9.7273 1070 0.0005 - -
9.7364 1071 0.0019 - -
9.7455 1072 0.0059 - -
9.7545 1073 0.0012 - -
9.7636 1074 0.0011 - -
9.7727 1075 0.0004 - -
9.7818 1076 0.0006 - -
9.7909 1077 0.0009 - -
9.8 1078 0.0031 - -
9.8091 1079 0.0005 - -
9.8182 1080 0.0014 - -
9.8273 1081 0.0013 - -
9.8364 1082 0.0015 - -
9.8455 1083 0.0002 - -
9.8545 1084 0.0006 - -
9.8636 1085 0.0003 - -
9.8727 1086 0.0003 - -
9.8818 1087 0.0003 - -
9.8909 1088 0.0018 - -
9.9 1089 0.0007 - -
9.9091 1090 0.0006 - -
9.9182 1091 0.0003 - -
9.9273 1092 0.0004 - -
9.9364 1093 0.001 - -
9.9455 1094 0.0012 - -
9.9545 1095 0.0002 - -
9.9636 1096 0.0003 - -
9.9727 1097 0.0002 - -
9.9818 1098 0.0011 - -
9.9909 1099 0.0016 - -
10.0 1100 0.0005 - -

Framework Versions

  • Python: 3.12.9
  • Sentence Transformers: 4.1.0
  • Transformers: 4.52.3
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.7.0
  • Datasets: 3.6.0
  • Tokenizers: 0.21.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}