Per-task scores for two models. `value` is a float in [0.01, 0.93]; `⌀` (null) marks tasks without a score.

model | dataset | metric | value |
---|---|---|---|
bert-base-uncased | AmazonCounterfactualClassification | accuracy | 0.742537 |
bert-base-uncased | AmazonPolarityClassification | accuracy | 0.713295 |
bert-base-uncased | AmazonReviewsClassification | accuracy | 0.33564 |
bert-base-uncased | Banking77Classification | accuracy | 0.634058 |
bert-base-uncased | EmotionClassification | accuracy | 0.3528 |
bert-base-uncased | ImdbClassification | accuracy | 0.653456 |
bert-base-uncased | MassiveIntentClassification | accuracy | 0.598823 |
bert-base-uncased | MassiveScenarioClassification | accuracy | 0.642771 |
bert-base-uncased | MTOPDomainClassification | accuracy | 0.826265 |
bert-base-uncased | MTOPIntentClassification | accuracy | 0.681373 |
bert-base-uncased | ToxicConversationsClassification | accuracy | 0.699968 |
bert-base-uncased | TweetSentimentExtractionClassification | accuracy | 0.518081 |
bert-base-uncased | ArxivClusteringP2P | v_measure | 0.351893 |
bert-base-uncased | ArxivClusteringS2S | v_measure | 0.275082 |
bert-base-uncased | BiorxivClusteringP2P | v_measure | 0.301228 |
bert-base-uncased | BiorxivClusteringS2S | v_measure | 0.24766 |
bert-base-uncased | MedrxivClusteringP2P | v_measure | 0.260877 |
bert-base-uncased | MedrxivClusteringS2S | v_measure | 0.236049 |
bert-base-uncased | RedditClustering | v_measure | 0.272417 |
bert-base-uncased | RedditClusteringP2P | v_measure | 0.433239 |
bert-base-uncased | StackExchangeClustering | v_measure | 0.435826 |
bert-base-uncased | StackExchangeClusteringP2P | v_measure | 0.265485 |
bert-base-uncased | TwentyNewsgroupsClustering | v_measure | 0.233543 |
bert-base-uncased | SprintDuplicateQuestions | ap | 0.368087 |
bert-base-uncased | TwitterSemEval2015 | ap | 0.558979 |
bert-base-uncased | TwitterURLCorpus | ap | 0.762873 |
bert-base-uncased | AskUbuntuDupQuestions | map | 0.458409 |
bert-base-uncased | MindSmallReranking | map | 0.283666 |
bert-base-uncased | SciDocsRR | map | 0.649373 |
bert-base-uncased | StackOverflowDupQuestions | map | 0.346155 |
bert-base-uncased | ArguAna | ndcg_at_10 | 0.28294 |
bert-base-uncased | ClimateFEVER | ndcg_at_10 | 0.0541 |
bert-base-uncased | CQADupstackRetrieval | ndcg_at_10 | 0.055066 |
bert-base-uncased | DBPedia | ndcg_at_10 | 0.04132 |
bert-base-uncased | FEVER | ndcg_at_10 | 0.033 |
bert-base-uncased | FiQA2018 | ndcg_at_10 | 0.02191 |
bert-base-uncased | HotpotQA | ndcg_at_10 | 0.0826 |
bert-base-uncased | MSMARCO | ndcg_at_10 | 0.06176 |
bert-base-uncased | NFCorpus | ndcg_at_10 | 0.04304 |
bert-base-uncased | NQ | ndcg_at_10 | 0.02615 |
bert-base-uncased | QuoraRetrieval | ndcg_at_10 | 0.61029 |
bert-base-uncased | SCIDOCS | ndcg_at_10 | 0.02815 |
bert-base-uncased | SciFact | ndcg_at_10 | 0.13339 |
bert-base-uncased | Touche2020 | ndcg_at_10 | 0.00967 |
bert-base-uncased | TRECCOVID | ndcg_at_10 | 0.14745 |
bert-base-uncased | BIOSSES | cosine_spearman | 0.546982 |
bert-base-uncased | SICK-R | cosine_spearman | 0.586451 |
bert-base-uncased | STS12 | cosine_spearman | 0.308718 |
bert-base-uncased | STS13 | cosine_spearman | 0.598949 |
bert-base-uncased | STS14 | cosine_spearman | 0.477279 |
bert-base-uncased | STS15 | cosine_spearman | 0.602857 |
bert-base-uncased | STS16 | cosine_spearman | 0.637327 |
bert-base-uncased | STS17 | cosine_spearman | 0.641002 |
bert-base-uncased | STS22 | cosine_spearman | 0.563668 |
bert-base-uncased | STSBenchmark | cosine_spearman | 0.472911 |
bert-base-uncased | SummEval | cosine_spearman | 0.298172 |
sentence-t5-xxl | AmazonCounterfactualClassification | accuracy | 0.770746 |
sentence-t5-xxl | AmazonPolarityClassification | accuracy | 0.927859 |
sentence-t5-xxl | AmazonReviewsClassification | accuracy | 0.48926 |
sentence-t5-xxl | Banking77Classification | accuracy | 0.823084 |
sentence-t5-xxl | EmotionClassification | accuracy | 0.4857 |
sentence-t5-xxl | ImdbClassification | accuracy | 0.902268 |
sentence-t5-xxl | MassiveIntentClassification | accuracy | 0.734432 |
sentence-t5-xxl | MassiveScenarioClassification | accuracy | 0.748184 |
sentence-t5-xxl | MTOPDomainClassification | accuracy | 0.924943 |
sentence-t5-xxl | MTOPIntentClassification | accuracy | 0.683265 |
sentence-t5-xxl | ToxicConversationsClassification | accuracy | 0.700366 |
sentence-t5-xxl | TweetSentimentExtractionClassification | accuracy | 0.620091 |
sentence-t5-xxl | ArxivClusteringP2P | v_measure | 0.428912 |
sentence-t5-xxl | ArxivClusteringS2S | v_measure | 0.334692 |
sentence-t5-xxl | BiorxivClusteringP2P | v_measure | 0.365276 |
sentence-t5-xxl | BiorxivClusteringS2S | v_measure | 0.286631 |
sentence-t5-xxl | MedrxivClusteringP2P | v_measure | 0.320858 |
sentence-t5-xxl | MedrxivClusteringS2S | v_measure | 0.26816 |
sentence-t5-xxl | RedditClustering | v_measure | 0.589854 |
sentence-t5-xxl | RedditClusteringP2P | v_measure | 0.644554 |
sentence-t5-xxl | StackExchangeClustering | v_measure | 0.70778 |
sentence-t5-xxl | StackExchangeClusteringP2P | v_measure | 0.352525 |
sentence-t5-xxl | TwentyNewsgroupsClustering | v_measure | 0.509302 |
sentence-t5-xxl | SprintDuplicateQuestions | ap | 0.888868 |
sentence-t5-xxl | TwitterSemEval2015 | ap | 0.802843 |
sentence-t5-xxl | TwitterURLCorpus | ap | 0.86012 |
sentence-t5-xxl | AskUbuntuDupQuestions | map | 0.661553 |
sentence-t5-xxl | MindSmallReranking | map | 0.305965 |
sentence-t5-xxl | SciDocsRR | map | 0.760941 |
sentence-t5-xxl | StackOverflowDupQuestions | map | 0.528548 |
sentence-t5-xxl | ArguAna | ndcg_at_10 | 0.39847 |
sentence-t5-xxl | ClimateFEVER | ndcg_at_10 | null |
sentence-t5-xxl | CQADupstackRetrieval | ndcg_at_10 | 0.446548 |
sentence-t5-xxl | DBPedia | ndcg_at_10 | null |
sentence-t5-xxl | FEVER | ndcg_at_10 | null |
sentence-t5-xxl | FiQA2018 | ndcg_at_10 | 0.46677 |
sentence-t5-xxl | HotpotQA | ndcg_at_10 | null |
sentence-t5-xxl | MSMARCO | ndcg_at_10 | null |
sentence-t5-xxl | NFCorpus | ndcg_at_10 | 0.35077 |
sentence-t5-xxl | NQ | ndcg_at_10 | 0.5287 |
sentence-t5-xxl | QuoraRetrieval | ndcg_at_10 | 0.85959 |
sentence-t5-xxl | SCIDOCS | ndcg_at_10 | 0.17173 |
sentence-t5-xxl | SciFact | ndcg_at_10 | 0.5538 |
sentence-t5-xxl | Touche2020 | ndcg_at_10 | 0.21647 |
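The rows above can be aggregated per model and metric with a short, self-contained sketch. The handful of rows below are copied from the table for illustration only; null scores (shown as `⌀`/`null` above, e.g. ClimateFEVER for sentence-t5-xxl) are represented as `None` and skipped when averaging.

```python
from collections import defaultdict
from statistics import mean

# A small sample of (model, dataset, metric, value) rows from the table above.
rows = [
    ("bert-base-uncased", "BIOSSES", "cosine_spearman", 0.546982),
    ("bert-base-uncased", "SICK-R", "cosine_spearman", 0.586451),
    ("sentence-t5-xxl", "ArguAna", "ndcg_at_10", 0.39847),
    ("sentence-t5-xxl", "ClimateFEVER", "ndcg_at_10", None),  # null score
]

# Group scores by (model, metric), dropping null values.
groups = defaultdict(list)
for model, dataset, metric, value in rows:
    if value is not None:
        groups[(model, metric)].append(value)

# Mean score per (model, metric) pair over the non-null values.
summary = {key: mean(values) for key, values in groups.items()}
for (model, metric), score in sorted(summary.items()):
    print(f"{model} {metric}: {score:.4f}")
```

With the full 112-row table loaded the same way, this yields the per-category averages (accuracy, v_measure, map, ndcg_at_10, and so on) for each model.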
- Downloads last month: 26
- Size of downloaded dataset files: 6.7 kB
- Size of the auto-converted Parquet files: 6.04 kB
- Number of rows: 112