bobox/DeBERTaV3-small-GeneralSentenceTransformer-v2-AllSoft
Sentence Similarity
sentence-transformers
PyTorch
15 datasets
English
deberta-v2
feature-extraction
Generated from Trainer
dataset_size:78183
loss:AdaptiveLayerLoss
loss:CoSENTLoss
loss:GISTEmbedLoss
loss:OnlineContrastiveLoss
loss:MultipleNegativesSymmetricRankingLoss
Eval Results
Inference Endpoints
arxiv:1908.10084
arxiv:2402.14776
arxiv:2402.16829
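
Since the repository is tagged sentence-transformers / feature-extraction, the model can presumably be loaded with the standard sentence-transformers API. A minimal loading sketch, assuming that library; the example sentences are placeholders and not taken from the repository:

```python
# Minimal loading sketch, assuming the standard sentence-transformers API.
# The example sentences are placeholders.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("bobox/DeBERTaV3-small-GeneralSentenceTransformer-v2-AllSoft")
embeddings = model.encode([
    "The weather is lovely today.",
    "It's so sunny outside!",
])
print(embeddings.shape)  # (2, embedding_dim)
```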
Files and versions (branch: main)
1 contributor · History: 5 commits
Latest commit: bobox, "Update README.md" (110436b, verified), 5 months ago
File | Size | Last commit | Updated
1_Pooling/ | | all layer trained for every step.AdaptiveLayerLoss(model=model, | 5 months ago
.gitattributes | 1.52 kB | initial commit | 5 months ago
README.md | 135 kB | Update README.md | 5 months ago
added_tokens.json | 23 Bytes | all layer trained for every step.AdaptiveLayerLoss(model=model, | 5 months ago
config.json | 860 Bytes | all layer trained for every step.AdaptiveLayerLoss(model=model, | 5 months ago
config_sentence_transformers.json | 195 Bytes | all layer trained for every step.AdaptiveLayerLoss(model=model, | 5 months ago
modules.json | 229 Bytes | all layer trained for every step.AdaptiveLayerLoss(model=model, | 5 months ago
pytorch_model.bin | 480 MB (LFS, pickle) | all layer trained for every step.AdaptiveLayerLoss(model=model, | 5 months ago
sentence_bert_config.json | 53 Bytes | all layer trained for every step.AdaptiveLayerLoss(model=model, | 5 months ago
special_tokens_map.json | 286 Bytes | all layer trained for every step.AdaptiveLayerLoss(model=model, | 5 months ago
spm.model | 2.46 MB (LFS) | all layer trained for every step.AdaptiveLayerLoss(model=model, | 5 months ago
tokenizer.json | 8.66 MB | all layer trained for every step.AdaptiveLayerLoss(model=model, | 5 months ago
tokenizer_config.json | 1.28 kB | all layer trained for every step.AdaptiveLayerLoss(model=model, | 5 months ago

Detected pickle imports in pytorch_model.bin: collections.OrderedDict, torch.FloatStorage, torch._utils._rebuild_tensor_v2.
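
The repeated commit message and the loss tags suggest the model was trained with AdaptiveLayerLoss wrapped around the other listed losses, with all layers trained at every step. The actual training script is not part of this listing; the following is only a sketch of how such a wrapper is typically set up in sentence-transformers, where the base checkpoint, the choice of inner loss, and the n_layers_per_step value are assumptions:

```python
# Hedged sketch of an AdaptiveLayerLoss setup, as hinted by the commit message
# "all layer trained for every step.AdaptiveLayerLoss(model=model,".
# The base checkpoint, inner loss, and n_layers_per_step=-1 are assumptions.
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import (
    AdaptiveLayerLoss,
    MultipleNegativesSymmetricRankingLoss,
)

model = SentenceTransformer("microsoft/deberta-v3-small")  # assumed base model

inner_loss = MultipleNegativesSymmetricRankingLoss(model)
# n_layers_per_step=-1 trains every transformer layer at each step,
# which matches "all layer trained for every step".
loss = AdaptiveLayerLoss(model=model, loss=inner_loss, n_layers_per_step=-1)
```

In an actual run (the repo is tagged "Generated from Trainer"), a loss like this would be passed to the sentence-transformers trainer together with the training datasets listed in the tags.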