SentenceTransformer based on embaas/sentence-transformers-e5-large-v2
This is a sentence-transformers model finetuned from embaas/sentence-transformers-e5-large-v2. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
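For a quick start, here is a minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under the repo id shown in the model tree below (duckduckpuck/sir-sbert-e5-large-v1). Note that E5-family models are often fed "query: "/"passage: " prefixes; whether this fine-tune expects them is not stated in this card, so treat the prefixes below as an assumption.

```python
from sentence_transformers import SentenceTransformer

# Load the fine-tuned model from the Hub
# (repo id taken from the model tree in this card; adjust if hosted elsewhere).
model = SentenceTransformer("duckduckpuck/sir-sbert-e5-large-v1")

sentences = [
    "query: how do transformers encode sentences?",   # E5-style prefixes: an assumption
    "passage: Sentence Transformers map text to dense vectors.",
    "passage: The stadium was packed on Saturday.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)  # (3, 1024): one 1024-dimensional vector per input

# Pairwise cosine similarity; the embeddings are already L2-normalized,
# matching the card's stated similarity function.
similarities = model.similarity(embeddings, embeddings)
print(similarities)
```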
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: embaas/sentence-transformers-e5-large-v2
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation (https://www.sbert.net)
- Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
- Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)
Full Model Architecture
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: PeftModelForFeatureExtraction
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
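To make the three modules concrete: the pipeline is equivalent to mean-pooling a transformer's token embeddings and L2-normalizing the result. A rough sketch using only the plain base encoder (the actual checkpoint wraps it in a PEFT adapter, per PeftModelForFeatureExtraction above, so outputs will differ):

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

# Base encoder only, for illustration; the fine-tuned checkpoint adds a
# PEFT adapter on top of this model.
name = "embaas/sentence-transformers-e5-large-v2"
tokenizer = AutoTokenizer.from_pretrained(name)
encoder = AutoModel.from_pretrained(name)

batch = tokenizer(["query: an example sentence"], padding=True,
                  truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state  # (B, T, 1024)

# Module (1), Pooling with pooling_mode_mean_tokens=True:
# average the token vectors, ignoring padding positions.
mask = batch["attention_mask"].unsqueeze(-1).float()
embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# Module (2), Normalize(): unit length, so dot product equals cosine similarity.
embedding = F.normalize(embedding, p=2, dim=1)
print(embedding.shape)  # (1, 1024)
```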
Model tree for duckduckpuck/sir-sbert-e5-large-v1
- Base model: embaas/sentence-transformers-e5-large-v2

Evaluation results
All metrics are self-reported on the eval split:
- Cosine Accuracy@1: 0.923
- Cosine Accuracy@3: 0.987
- Cosine Accuracy@5: 0.991
- Cosine Accuracy@10: 0.995
- Cosine Precision@1: 0.923
- Cosine Precision@3: 0.329
- Cosine Precision@5: 0.198
- Cosine Precision@10: 0.099
- Cosine Recall@1: 0.923
- Cosine Recall@3: 0.987
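
These figures have the shape of the output of sentence-transformers' InformationRetrievalEvaluator (cosine accuracy/precision/recall at k). A hypothetical sketch of how such numbers are produced; the queries, corpus, and relevance judgments below are placeholders, since the eval split itself is not published in this card:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

# Placeholder data: substitute your own query/corpus ids and texts.
queries = {"q1": "query: example question"}
corpus = {"d1": "passage: a relevant passage", "d2": "passage: an unrelated passage"}
relevant_docs = {"q1": {"d1"}}  # query id -> set of relevant corpus ids

model = SentenceTransformer("duckduckpuck/sir-sbert-e5-large-v1")
evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    accuracy_at_k=[1, 3, 5, 10],
    precision_recall_at_k=[1, 3, 5, 10],
    name="eval",
)
# In recent sentence-transformers versions this returns a dict of metrics,
# including cosine accuracy@k, precision@k, and recall@k as listed above.
results = evaluator(model)
print(results)
```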