SentenceTransformer based on google/embeddinggemma-300m
This is a sentence-transformers model finetuned from google/embeddinggemma-300m. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: google/embeddinggemma-300m
- Maximum Sequence Length: 2048 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 2048, 'do_lower_case': False, 'architecture': 'Gemma3TextModel'})
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Dense({'in_features': 768, 'out_features': 3072, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
(3): Dense({'in_features': 3072, 'out_features': 768, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
(4): Normalize()
)
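To confirm the module stack and limits described above on your own machine, a minimal sketch (using the repo id from the Usage section below) is:

from sentence_transformers import SentenceTransformer

# Load the published checkpoint and inspect the module stack listed above.
model = SentenceTransformer("kevinadityai/gemma-ai-faq-embeddings-full")
print(model)                                     # Transformer -> Pooling -> Dense -> Dense -> Normalize
print(model.max_seq_length)                      # 2048
print(model.get_sentence_embedding_dimension())  # 768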
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("kevinadityai/gemma-ai-faq-embeddings-full")
# Run inference
queries = [
"Fasilitas apa yang dimiliki pusat keunggulan Rumah Sakit?",
]
documents = [
'Kardiologi: Di Cardiac Center kami terdapat Unit Perawatan Jantung (CCU) yang berfokus pada perawatan pasien setelah serangan jantung atau operasi jantung, CT-Scan multi-irisan, Ekokardiografi, Elektrokardiogram (ECG), Pengobatan nuklir, Cath Lab, dan lainnya. Ilmu Saraf: Pusat Ilmu Saraf Siloam menawarkan perawatan lanjutan untuk operasi stereotaktik radiasi menggunakan pisau Gamma dan perawatan untuk Hydrocephalus. Onkologi: Pusat onkologi kami menawarkan perawatan seperti operasi radio pisau gamma, kedokteran nuklir dengan PET-CT dan SPECT-CT Scan, terapi radiasi dengan Rapid Arc Linear Accelerator (LINAC), terapi radionuklida, dan lainnya. Ortopedi: Pusat Keunggulan Siloam dalam Ortopedi menyediakan diagnosis, perawatan, dan rehabilitasi ahli untuk gangguan tulang, sendi, atau jaringan ikat. Meliputi pencegahan patah tulang osteoporosis, Bone Mass Densitometry dan Frax, diagnosa cedera atau penyakit kompleks, CT Scan 2D/3D, 1,5 Tesla dan 3 Tesla MRI, artroplasti revisi kompleks di pinggul dan lutut, replacement surgery, bedah invasif minimal, pusat rehabilitasi lengkap, dan lainnya. Urologi: Layanan medis di Pusat Urologi kami meliputi operasi laser lampu hijau, Extra Corporeal Shock Wave Lithotripsy (ESWL), rekonstruksi urologi, vasektomi, dan lainnya. Keadaan Darurat: Pusat Darurat dan Trauma Siloam menyediakan perawatan medis darurat berkualitas tinggi, termasuk spesialis yang bisa dipanggil 24 jam, layanan diagnostik (X-ray, CT Scan, MRI) berdekatan dengan Unit Gawat Darurat, layanan darurat khusus seperti perawatan kardiovaskular akut, manajemen stroke akut, bedah saraf, bedah ortopedi, layanan darurat anak, perawatan trauma, serta ambulans darat dan udara lengkap.',
'Siloam menyediakan paket pemeriksaan MCU / Medical Check Up yang beragam. Anda dapat menggunakan fitur cek berdasarkan gejala pada website kami untuk mengetahui rekomendasi paket yang paling sesuai untuk Anda. Untuk informasi lebih lanjut, silahkan kunjungi website kami siloamhospitals.com/mcu.',
'Fasilitas yang disediakan berbeda-beda sesuai dengan jenis kamar dan rumah sakit. Umumnya, setiap kamar mempunyai kulkas mini, televisi, telepon, piyama untuk pasien, selimut, perlengkapan mandi, dan tisu. Mohon menghubungi perwakilan perawatan kami untuk mengetahui fasilitas khusus yang Anda butuhkan di kamar Anda.',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 768] [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[0.4240, 0.4052, 0.5881]])
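The similarity matrix can be used directly for retrieval. A minimal sketch that continues the snippet above and ranks the documents for the query by cosine similarity:

import torch

# `similarities` has shape [num_queries, num_documents]; sort each row, highest score first.
ranking = torch.argsort(similarities, dim=1, descending=True)
best_idx = ranking[0, 0].item()
print(best_idx, documents[best_idx][:80])
# expected: index 2, the answer about room facilities (score 0.5881 above)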
Evaluation
Metrics
Information Retrieval
- Dataset: faq-gemma
- Evaluated with: InformationRetrievalEvaluator
Metric | Value |
---|---|
cosine_accuracy@1 | 0.4507 |
cosine_accuracy@3 | 0.6053 |
cosine_accuracy@5 | 0.6217 |
cosine_accuracy@10 | 0.6447 |
cosine_precision@1 | 0.4507 |
cosine_precision@3 | 0.2018 |
cosine_precision@5 | 0.1243 |
cosine_precision@10 | 0.0645 |
cosine_recall@1 | 0.4507 |
cosine_recall@3 | 0.6053 |
cosine_recall@5 | 0.6217 |
cosine_recall@10 | 0.6447 |
cosine_ndcg@10 | 0.5581 |
cosine_mrr@10 | 0.5293 |
cosine_map@100 | 0.5329 |
Information Retrieval
- Dataset: faq-gemma
- Evaluated with: InformationRetrievalEvaluator
Metric | Value |
---|---|
cosine_accuracy@1 | 0.5263 |
cosine_accuracy@3 | 0.6678 |
cosine_accuracy@5 | 0.6974 |
cosine_accuracy@10 | 0.7336 |
cosine_precision@1 | 0.5263 |
cosine_precision@3 | 0.2226 |
cosine_precision@5 | 0.1395 |
cosine_precision@10 | 0.0734 |
cosine_recall@1 | 0.5263 |
cosine_recall@3 | 0.6678 |
cosine_recall@5 | 0.6974 |
cosine_recall@10 | 0.7336 |
cosine_ndcg@10 | 0.6355 |
cosine_mrr@10 | 0.6036 |
cosine_map@100 | 0.6074 |
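These metrics come from Sentence Transformers' InformationRetrievalEvaluator. A minimal sketch of how such an evaluation is set up; the ids and texts below are placeholders, not the actual evaluation split:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("kevinadityai/gemma-ai-faq-embeddings-full")

# Placeholder data: id -> text for queries and corpus, plus the relevant corpus ids per query.
queries = {"q1": "Fasilitas apa yang dimiliki pusat keunggulan Rumah Sakit?"}
corpus = {
    "d1": "Fasilitas yang disediakan berbeda-beda sesuai dengan jenis kamar dan rumah sakit.",
    "d2": "Siloam menyediakan paket pemeriksaan MCU / Medical Check Up yang beragam.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="faq-gemma",
)
results = evaluator(model)
print(results["faq-gemma_cosine_ndcg@10"])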
Training Details
Training Dataset
Unnamed Dataset
- Size: 304 training samples
- Columns: query and answer_positive
- Approximate statistics based on the first 304 samples:
 | query | answer_positive |
---|---|---|
type | string | string |
details | min: 3 tokens, mean: 12.51 tokens, max: 44 tokens | min: 9 tokens, mean: 35.49 tokens, max: 409 tokens |
- Samples:
query | answer_positive |
---|---|
Di mana ada lokasi Rumah Sakit Siloam? | Ada 41 Rumah Sakit modern yang terdiri dari 14 Rumah Sakit di Jabodetabek dan 27 rumah sakit yang tersebar di Jawa, Sumatera, Kalimantan, Sulawesi, serta Bali dan Nusa Tenggara. |
Apa jenis kamar rawat inap yang tersedia? | Siloam Hospitals menawarkan banyak pilihan jenis kamar rawat inap. Silahkan pilih rumah sakit yang akan Anda kunjungi untuk mengetahui jenis kamar rawat inap yang ditawarkan di setiap unit. |
Apa standar keamanan suplai darah yang diambil di rumah sakit? | Untuk memastikan keamanan dan kualitas suplai darah, kami secara eksklusif menerima darah dari Palang Merah Indonesia. |
- Loss: CachedMultipleNegativesRankingLoss with these parameters:
{ "scale": 20.0, "similarity_fct": "cos_sim", "mini_batch_size": 8, "gather_across_devices": false }
Evaluation Dataset
Unnamed Dataset
- Size: 30 evaluation samples
- Columns: query and answer_positive
- Approximate statistics based on the first 30 samples:
 | query | answer_positive |
---|---|---|
type | string | string |
details | min: 10 tokens, mean: 15.77 tokens, max: 44 tokens | min: 22 tokens, mean: 57.0 tokens, max: 409 tokens |
- Samples:
query | answer_positive |
---|---|
Di mana ada lokasi Rumah Sakit Siloam? | Ada 41 Rumah Sakit modern yang terdiri dari 14 Rumah Sakit di Jabodetabek dan 27 rumah sakit yang tersebar di Jawa, Sumatera, Kalimantan, Sulawesi, serta Bali dan Nusa Tenggara. |
Apa jenis kamar rawat inap yang tersedia? | Siloam Hospitals menawarkan banyak pilihan jenis kamar rawat inap. Silahkan pilih rumah sakit yang akan Anda kunjungi untuk mengetahui jenis kamar rawat inap yang ditawarkan di setiap unit. |
Apa standar keamanan suplai darah yang diambil di rumah sakit? | Untuk memastikan keamanan dan kualitas suplai darah, kami secara eksklusif menerima darah dari Palang Merah Indonesia. |
- Loss: CachedMultipleNegativesRankingLoss with these parameters:
{ "scale": 20.0, "similarity_fct": "cos_sim", "mini_batch_size": 8, "gather_across_devices": false }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 128
- learning_rate: 2e-05
- num_train_epochs: 1
- warmup_ratio: 0.1
- prompts: {'question': 'task: search result | query: ', 'passage_text': 'title: none | text: '}
- batch_sampler: no_duplicates
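Taken together with the loss configuration above, the fine-tuning setup can be reproduced roughly as follows. This is a sketch, not the exact training script: the dataset rows and output directory are placeholders, and the real run used 304 training and 30 evaluation samples.

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("google/embeddinggemma-300m")

# Placeholder (query, answer_positive) pairs standing in for the real FAQ dataset.
train_dataset = Dataset.from_dict({
    "query": ["Di mana ada lokasi Rumah Sakit Siloam?"],
    "answer_positive": ["Ada 41 Rumah Sakit modern yang tersebar di Indonesia."],
})
eval_dataset = train_dataset

# Loss parameters as listed above.
loss = CachedMultipleNegativesRankingLoss(model, scale=20.0, mini_batch_size=8)

args = SentenceTransformerTrainingArguments(
    output_dir="gemma-ai-faq-embeddings-full",  # placeholder
    num_train_epochs=1,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
    # Prompt strings as listed above; the keys are expected to match the dataset
    # column names, so adjust them if your columns are named differently.
    prompts={"question": "task: search result | query: ", "passage_text": "title: none | text: "},
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()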
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 128
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- parallelism_config: None
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch_fused
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- hub_revision: None
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- liger_kernel_config: None
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: {'question': 'task: search result | query: ', 'passage_text': 'title: none | text: '}
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
- router_mapping: {}
- learning_rate_mapping: {}
Training Logs
Epoch | Step | faq-gemma_cosine_ndcg@10 |
---|---|---|
-1 | -1 | 0.6355 |
Framework Versions
- Python: 3.12.10
- Sentence Transformers: 5.1.1
- Transformers: 4.56.2
- PyTorch: 2.8.0+cpu
- Accelerate: 1.10.1
- Datasets: 4.1.1
- Tokenizers: 0.22.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
CachedMultipleNegativesRankingLoss
@misc{gao2021scaling,
title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
year={2021},
eprint={2101.06983},
archivePrefix={arXiv},
primaryClass={cs.LG}
}