CoCondenser trained on MS MARCO
This is a SPLADE Sparse Encoder model finetuned from Luyu/co-condenser-marco using the sentence-transformers library. It maps sentences & paragraphs to a 30522-dimensional sparse vector space and can be used for semantic search and sparse retrieval.
Model Details
Model Description
- Model Type: SPLADE Sparse Encoder
- Base model: Luyu/co-condenser-marco
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 30522 dimensions
- Similarity Function: Dot Product
- Language: en
- License: apache-2.0
Model Sources
- Documentation: Sentence Transformers Documentation
- Documentation: Sparse Encoder Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sparse Encoders on Hugging Face
Full Model Architecture
SparseEncoder(
(0): MLMTransformer({'max_seq_length': 512, 'do_lower_case': False}) with MLMTransformer model: BertForMaskedLM
(1): SpladePooling({'pooling_strategy': 'max', 'activation_function': 'relu', 'word_embedding_dimension': 30522})
)
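The MLMTransformer produces a masked-language-model logit for every vocabulary token at every input position, and SpladePooling turns these into a single 30522-dimensional sparse vector by applying log(1 + ReLU(logit)) and max-pooling over the sequence. The snippet below is a minimal sketch of that pooling step (not the library's internal implementation), assuming logits of shape [batch, seq_len, 30522] and a matching attention mask:

import torch

def splade_pool(logits: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Saturated token-importance weights: log(1 + ReLU(logit))
    weights = torch.log1p(torch.relu(logits))
    # Zero out padding positions before pooling
    weights = weights * attention_mask.unsqueeze(-1)
    # Max-pool over the sequence dimension -> one sparse vector per input text
    return weights.max(dim=1).values  # shape: [batch, 30522]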
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SparseEncoder
# Download from the 🤗 Hub
model = SparseEncoder("tomaarsen/splade-cocondenser-msmarco-margin-mse-minilm")
# Run inference
queries = [
"what causes aging fast",
]
documents = [
'UV-A light, specifically, is what mainly causes tanning, skin aging, and cataracts, UV-B causes sunburn, skin aging and skin cancer, and UV-C is the strongest, and therefore most effective at killing microorganisms. Again – single words and multiple bullets.',
"Answers from Ronald Petersen, M.D. Yes, Alzheimer's disease usually worsens slowly. But its speed of progression varies, depending on a person's genetic makeup, environmental factors, age at diagnosis and other medical conditions. Still, anyone diagnosed with Alzheimer's whose symptoms seem to be progressing quickly — or who experiences a sudden decline — should see his or her doctor.",
"Bell's palsy and Extreme tiredness and Extreme fatigue (2 causes) Bell's palsy and Extreme tiredness and Hepatitis (2 causes) Bell's palsy and Extreme tiredness and Liver pain (2 causes) Bell's palsy and Extreme tiredness and Lymph node swelling in children (2 causes)",
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 30522] [3, 30522]
# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[11.2444, 10.6804, 4.3465]])
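Because every dimension corresponds to a vocabulary token, the embeddings are directly interpretable. The sketch below lists the highest-weighted tokens of the first query embedding; it assumes the encoder exposes its underlying Hugging Face tokenizer as model.tokenizer and that the returned embeddings can be densified with to_dense():

# Inspect which vocabulary tokens the first query activates
dense_query = query_embeddings.to_dense()[0]
nonzero_ids = dense_query.nonzero().squeeze(-1)
tokens = model.tokenizer.convert_ids_to_tokens(nonzero_ids.tolist())
weights = dense_query[nonzero_ids].tolist()
# Print the ten strongest (token, weight) pairs
for token, weight in sorted(zip(tokens, weights), key=lambda pair: -pair[1])[:10]:
    print(f"{token}: {weight:.2f}")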
Evaluation
Metrics
Sparse Information Retrieval
- Datasets: NanoMSMARCO, NanoNFCorpus and NanoNQ
- Evaluated with SparseInformationRetrievalEvaluator
Metric | NanoMSMARCO | NanoNFCorpus | NanoNQ |
---|---|---|---|
dot_accuracy@1 | 0.42 | 0.44 | 0.48 |
dot_accuracy@3 | 0.66 | 0.64 | 0.74 |
dot_accuracy@5 | 0.76 | 0.64 | 0.8 |
dot_accuracy@10 | 0.84 | 0.68 | 0.88 |
dot_precision@1 | 0.42 | 0.44 | 0.48 |
dot_precision@3 | 0.22 | 0.3933 | 0.2533 |
dot_precision@5 | 0.152 | 0.336 | 0.168 |
dot_precision@10 | 0.084 | 0.27 | 0.094 |
dot_recall@1 | 0.42 | 0.0439 | 0.46 |
dot_recall@3 | 0.66 | 0.0987 | 0.7 |
dot_recall@5 | 0.76 | 0.1141 | 0.76 |
dot_recall@10 | 0.84 | 0.1401 | 0.84 |
dot_ndcg@10 | 0.6312 | 0.3445 | 0.664 |
dot_mrr@10 | 0.5637 | 0.5322 | 0.6205 |
dot_map@100 | 0.5721 | 0.1566 | 0.6042 |
query_active_dims | 21.1 | 17.92 | 25.1 |
query_sparsity_ratio | 0.9993 | 0.9994 | 0.9992 |
corpus_active_dims | 157.6907 | 311.426 | 194.1861 |
corpus_sparsity_ratio | 0.9948 | 0.9898 | 0.9936 |
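The sparsity rows are tied to the 30522-token vocabulary: query_active_dims is the mean number of nonzero dimensions per encoded query, and the reported sparsity ratio is consistent with 1 - active_dims / 30522. A quick check against the NanoMSMARCO column:

vocab_size = 30522
query_active_dims = 21.1        # NanoMSMARCO, from the table above
print(1 - query_active_dims / vocab_size)   # ~0.99931, reported as 0.9993
corpus_active_dims = 157.6907
print(1 - corpus_active_dims / vocab_size)  # ~0.99483, reported as 0.9948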
Sparse Nano BEIR
- Dataset: NanoBEIR_mean
- Evaluated with SparseNanoBEIREvaluator with these parameters:
  { "dataset_names": ["msmarco", "nfcorpus", "nq"] }
Metric | Value |
---|---|
dot_accuracy@1 | 0.4467 |
dot_accuracy@3 | 0.68 |
dot_accuracy@5 | 0.7333 |
dot_accuracy@10 | 0.8 |
dot_precision@1 | 0.4467 |
dot_precision@3 | 0.2889 |
dot_precision@5 | 0.2187 |
dot_precision@10 | 0.1493 |
dot_recall@1 | 0.308 |
dot_recall@3 | 0.4862 |
dot_recall@5 | 0.5447 |
dot_recall@10 | 0.6067 |
dot_ndcg@10 | 0.5466 |
dot_mrr@10 | 0.5721 |
dot_map@100 | 0.4443 |
query_active_dims | 21.3733 |
query_sparsity_ratio | 0.9993 |
corpus_active_dims | 206.6305 |
corpus_sparsity_ratio | 0.9932 |
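These aggregate numbers can be reproduced by running the listed evaluator against the published checkpoint. A minimal sketch, assuming the SparseNanoBEIREvaluator import path of the sentence-transformers sparse-encoder module (exact metric key names may vary by version):

from sentence_transformers import SparseEncoder
from sentence_transformers.sparse_encoder.evaluation import SparseNanoBEIREvaluator

model = SparseEncoder("tomaarsen/splade-cocondenser-msmarco-margin-mse-minilm")
evaluator = SparseNanoBEIREvaluator(dataset_names=["msmarco", "nfcorpus", "nq"])
results = evaluator(model)
print(results)  # contains the per-dataset and NanoBEIR_mean metrics reported above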
Training Details
Training Dataset
Unnamed Dataset
- Size: 90,000 training samples
- Columns: query, positive, negative, and score
- Approximate statistics based on the first 1000 samples:

Column | Type | Min | Mean | Max |
---|---|---|---|---|
query | string | 4 tokens | 9.22 tokens | 36 tokens |
positive | string | 15 tokens | 79.27 tokens | 247 tokens |
negative | string | 16 tokens | 81.15 tokens | 201 tokens |
score | float | -14.32 | 4.62 | 21.72 |
- Samples:
  - query: most powerful army in the world
    positive: U.S. Army Reserve Command You may be asking yourself, “What is the Army Reserve?” The Army is the most powerful and sophisticated military force in the world.
    negative: The British Royal Navy was the most powerful sea-going force by the time of World War 1 (1914-1918) and this was well-underst...
    score: 2.919867515563965
  - query: define vasomotor
    positive: Define peripheral neuropathy: a disease or degenerative state of the peripheral nerves in which motor, sensory, or vasomotor nerve fibers may be… a disease or degenerative state of the peripheral nerves in which motor, sensory, or vasomotor nerve fibers may be affected and which is marked…
    negative: Vairāgya (Devanagari: वैराग्य, also spelt Vairagya) is a Sanskrit term used in Hindu philosophy that roughly translates as dispassion, detachment, or renunciation, in particular renunciation from the pains and pleasures in the material world (Maya).
    score: 3.0037026405334473
  - query: nitrates definition biology
    positive: In Botany or Plant Biology. By Photosynthesis, the palisade cells make glucose which has many uses including: storage as starch, to make fat, to make cellulose and to make protein. Glucose is converted w…ith mineral slat nitrates to make the protein. Nitrates provide the essential nitrogen to make protein. The Ribosome, an organelle of the plant cell, manufactures most of the cell's protein.
    negative: Almost all inorganic nitrate salts are soluble in water at standard temperature and pressure. A common example of an inorganic nitrate salt is potassium nitrate (saltpeter). A rich source of inorganic nitrate in the human body comes from diets rich in leafy green foods, such as spinach and arugula.It is now believed that dietary nitrate in the form of plant-based foods is converted in the body to nitrite.itrate is a polyatomic ion with the molecular formula NO 3 − and a molecular mass of 62.0049 g/mol.
    score: -1.6804794073104858
- Loss: SpladeLoss with these parameters:
  { "loss": "SparseMarginMSELoss", "lambda_corpus": 0.08, "lambda_query": 0.1 }
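This loss configuration can be written out in code. The sketch below assumes the MLMTransformer, SpladePooling, SpladeLoss and SparseMarginMSELoss import paths of the sentence-transformers sparse-encoder API and mirrors the architecture and regularization weights reported above:

from sentence_transformers import SparseEncoder
from sentence_transformers.sparse_encoder.models import MLMTransformer, SpladePooling
from sentence_transformers.sparse_encoder.losses import SpladeLoss, SparseMarginMSELoss

# Rebuild the architecture from the base MLM model (see "Full Model Architecture")
mlm = MLMTransformer("Luyu/co-condenser-marco")
pooling = SpladePooling(pooling_strategy="max")
model = SparseEncoder(modules=[mlm, pooling])

# Margin-MSE distillation wrapped with SPLADE's sparsity regularization,
# using the lambda weights listed above
loss = SpladeLoss(
    model,
    loss=SparseMarginMSELoss(model),
    lambda_query=0.1,
    lambda_corpus=0.08,
)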
Evaluation Dataset
Unnamed Dataset
- Size: 10,000 evaluation samples
- Columns: query, positive, negative, and score
- Approximate statistics based on the first 1000 samples:

Column | Type | Min | Mean | Max |
---|---|---|---|---|
query | string | 4 tokens | 9.01 tokens | 35 tokens |
positive | string | 17 tokens | 79.8 tokens | 336 tokens |
negative | string | 18 tokens | 81.3 tokens | 273 tokens |
score | float | -15.9 | 4.91 | 21.67 |
- Samples:
  - query: femoral artery definition
    positive: medical Definition of circumflex artery : any of several paired curving arteries: as a: either of two arteries that branch from the deep femoral artery or from the femoral artery itself:
    negative: Femoral vein. The femoral vein is located in the upper thigh and pelvic region of the human body. It travels in close proximity to the femoral artery. This vein is one of the larger vessels in the venous system. Instead of draining deoxygenated blood from specific parts of the body, it receives blood from several significant branches. These include popliteal, the profunda femoris, and the great sapheneous veins.
    score: -0.1968388557434082
  - query: what causes mastitis and how do you treat it
    positive: Mastitis is an infection of the tissue of the breast that occurs most frequently during the time of breastfeeding. This infection causes pain, swelling, redness, and increased temperature of the breast. It can occur when bacteria, often from the infant's mouth, enter a milk duct through a crack in the nipple. This causes an infection and painful inflammation of the breast.
    negative: Common causes of mastitis include bacteria from the baby’s mouth, bacteria entering via breast injuries (bruising, fissures, cracks in the nipple), milk stasis (milk pooling in the breast), and bacteria from the hands of the mother or health care provider.
    score: -0.8143405914306641
  - query: what is a buck moth
    positive: Buck moth caterpillars that have a light background color can be confused with both the Nevada buck moth, Hemileuca nevadensis Stretch, and the New England buck moth, Hemileuca lucina Henry Edwards. The larvae of these three species can best be distinguished based on the preferred host plants (Wagner 2005).hey rely on resources that are acquired by the caterpillars (larvae). The caterpillars are robust and can exceed four inches (10 cm) in North America. Figure 4. Adult cecropia moth, Hyalophora cecropia (Linnaeus). Photograph by Pennsylvania Department of Conservation and Natural Resources-Forestry Archive, Bugwood.org.
    negative: bucktail that gets talked about quietly in the . privacy of remote cabins. The “Musky-Teer” is a big fish bait that anglers treasure in their collection. You won’t find these at your local bait shop but we’ve been stocking these highly prized baits in all colors for years.
    score: 11.004357814788818
- Loss: SpladeLoss with these parameters:
  { "loss": "SparseMarginMSELoss", "lambda_corpus": 0.08, "lambda_query": 0.1 }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- learning_rate: 2e-05
- num_train_epochs: 1
- warmup_ratio: 0.1
- fp16: True
- batch_sampler: no_duplicates
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
- router_mapping: {}
- learning_rate_mapping: {}
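For reference, the sketch below shows how the non-default hyperparameters above could map onto a training run. It assumes the SparseEncoderTrainer and SparseEncoderTrainingArguments classes of the sentence-transformers sparse-encoder API; model and loss are those constructed earlier, train_dataset and eval_dataset stand in for the (query, positive, negative, score) datasets described under Training Details, and the output directory name is hypothetical:

from sentence_transformers.sparse_encoder import SparseEncoderTrainer, SparseEncoderTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SparseEncoderTrainingArguments(
    output_dir="splade-cocondenser-msmarco-margin-mse",  # hypothetical output directory
    num_train_epochs=1,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # corresponds to batch_sampler: no_duplicates
)
trainer = SparseEncoderTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()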
Training Logs
Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_dot_ndcg@10 | NanoNFCorpus_dot_ndcg@10 | NanoNQ_dot_ndcg@10 | NanoBEIR_mean_dot_ndcg@10 |
---|---|---|---|---|---|---|---|
0.0178 | 100 | 501776.8 | - | - | - | - | - |
0.0356 | 200 | 9740.8356 | - | - | - | - | - |
0.0533 | 300 | 61.9771 | - | - | - | - | - |
0.0711 | 400 | 37.6145 | - | - | - | - | - |
0.0889 | 500 | 28.8887 | 24.4953 | 0.4878 | 0.3047 | 0.5425 | 0.4450 |
0.1067 | 600 | 24.7991 | - | - | - | - | - |
0.1244 | 700 | 22.1517 | - | - | - | - | - |
0.1422 | 800 | 22.0889 | - | - | - | - | - |
0.16 | 900 | 20.7825 | - | - | - | - | - |
0.1778 | 1000 | 20.0856 | 18.6383 | 0.5751 | 0.3303 | 0.6100 | 0.5051 |
0.1956 | 1100 | 18.6968 | - | - | - | - | - |
0.2133 | 1200 | 20.5069 | - | - | - | - | - |
0.2311 | 1300 | 19.8162 | - | - | - | - | - |
0.2489 | 1400 | 19.1892 | - | - | - | - | - |
0.2667 | 1500 | 17.5024 | 18.0698 | 0.5750 | 0.3281 | 0.6222 | 0.5084 |
0.2844 | 1600 | 17.7801 | - | - | - | - | - |
0.3022 | 1700 | 17.9045 | - | - | - | - | - |
0.32 | 1800 | 16.3731 | - | - | - | - | - |
0.3378 | 1900 | 16.293 | - | - | - | - | - |
0.3556 | 2000 | 16.1167 | 14.5428 | 0.5696 | 0.3422 | 0.6232 | 0.5116 |
0.3733 | 2100 | 16.561 | - | - | - | - | - |
0.3911 | 2200 | 16.5533 | - | - | - | - | - |
0.4089 | 2300 | 14.9371 | - | - | - | - | - |
0.4267 | 2400 | 15.565 | - | - | - | - | - |
0.4444 | 2500 | 14.2143 | 15.2027 | 0.6071 | 0.3376 | 0.6600 | 0.5349 |
0.4622 | 2600 | 13.7188 | - | - | - | - | - |
0.48 | 2700 | 14.8554 | - | - | - | - | - |
0.4978 | 2800 | 15.1021 | - | - | - | - | - |
0.5156 | 2900 | 13.3032 | - | - | - | - | - |
0.5333 | 3000 | 13.8999 | 12.9609 | 0.5874 | 0.3423 | 0.6562 | 0.5286 |
0.5511 | 3100 | 12.7418 | - | - | - | - | - |
0.5689 | 3200 | 12.9422 | - | - | - | - | - |
0.5867 | 3300 | 13.6937 | - | - | - | - | - |
0.6044 | 3400 | 13.1183 | - | - | - | - | - |
0.6222 | 3500 | 12.7998 | 12.2024 | 0.6262 | 0.3424 | 0.6771 | 0.5486 |
0.64 | 3600 | 12.7799 | - | - | - | - | - |
0.6578 | 3700 | 12.2294 | - | - | - | - | - |
0.6756 | 3800 | 13.6836 | - | - | - | - | - |
0.6933 | 3900 | 13.579 | - | - | - | - | - |
0.7111 | 4000 | 12.6337 | 13.9878 | 0.6156 | 0.3435 | 0.6526 | 0.5372 |
0.7289 | 4100 | 12.682 | - | - | - | - | - |
0.7467 | 4200 | 12.2157 | - | - | - | - | - |
0.7644 | 4300 | 12.3127 | - | - | - | - | - |
0.7822 | 4400 | 11.7435 | - | - | - | - | - |
0.8 | 4500 | 12.086 | 12.3685 | 0.6262 | 0.3386 | 0.6782 | 0.5477 |
0.8178 | 4600 | 12.5455 | - | - | - | - | - |
0.8356 | 4700 | 11.7477 | - | - | - | - | - |
0.8533 | 4800 | 11.9948 | - | - | - | - | - |
0.8711 | 4900 | 11.8997 | - | - | - | - | - |
0.8889 | 5000 | 12.1624 | 12.8277 | 0.6241 | 0.3515 | 0.6740 | 0.5499 |
0.9067 | 5100 | 11.4352 | - | - | - | - | - |
0.9244 | 5200 | 10.9171 | - | - | - | - | - |
0.9422 | 5300 | 11.3242 | - | - | - | - | - |
0.96 | 5400 | 11.437 | - | - | - | - | - |
0.9778 | 5500 | 11.3141 | 11.6410 | 0.6366 | 0.3441 | 0.6605 | 0.5471 |
0.9956 | 5600 | 11.8683 | - | - | - | - | - |
-1 | -1 | - | - | 0.6312 | 0.3445 | 0.6640 | 0.5466 |
Environmental Impact
Carbon emissions were measured using CodeCarbon.
- Energy Consumed: 0.225 kWh
- Carbon Emitted: 0.088 kg of CO2
- Hours Used: 0.653 hours
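These figures are mutually consistent; a quick check of the implied average power draw and grid carbon intensity:

energy_kwh = 0.225
hours = 0.653
co2_kg = 0.088
print(energy_kwh / hours)   # ~0.34 kW average draw, plausible for a single-RTX-3090 system
print(co2_kg / energy_kwh)  # ~0.39 kg CO2 per kWh implied grid carbon intensity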
Training Hardware
- On Cloud: No
- GPU Model: 1 x NVIDIA GeForce RTX 3090
- CPU Model: 13th Gen Intel(R) Core(TM) i7-13700K
- RAM Size: 31.78 GB
Framework Versions
- Python: 3.11.6
- Sentence Transformers: 4.2.0.dev0
- Transformers: 4.52.4
- PyTorch: 2.6.0+cu124
- Accelerate: 1.5.1
- Datasets: 2.21.0
- Tokenizers: 0.21.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
SpladeLoss
@misc{formal2022distillationhardnegativesampling,
title={From Distillation to Hard Negative Sampling: Making Sparse Neural IR Models More Effective},
author={Thibault Formal and Carlos Lassance and Benjamin Piwowarski and Stéphane Clinchant},
year={2022},
eprint={2205.04733},
archivePrefix={arXiv},
primaryClass={cs.IR},
url={https://arxiv.org/abs/2205.04733},
}
SparseMarginMSELoss
@misc{hofstätter2021improving,
title={Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation},
author={Sebastian Hofstätter and Sophia Althammer and Michael Schröder and Mete Sertkan and Allan Hanbury},
year={2021},
eprint={2010.02666},
archivePrefix={arXiv},
primaryClass={cs.IR}
}
FlopsLoss
@article{paria2020minimizing,
title={Minimizing flops to learn efficient sparse representations},
author={Paria, Biswajit and Yeh, Chih-Kuan and Yen, Ian EH and Xu, Ning and Ravikumar, Pradeep and P{\'o}czos, Barnab{\'a}s},
journal={arXiv preprint arXiv:2004.05665},
year={2020}
}