Column schema (name, dtype, and observed range or number of distinct values):

model_type (stringclasses): 5 values
model (stringlengths): 12 – 62
AVG (float64): 0.03 – 0.7
CG (float64): 0 – 0.68
EL (float64): 0 – 0.77
FA (float64): 0 – 0.62
HE (float64): 0 – 0.83
MC (float64): 0 – 0.95
MR (float64): 0 – 0.95
MT (float64): 0.19 – 0.86
NLI (float64): 0 – 0.97
QA (float64): 0 – 0.77
RC (float64): 0 – 0.94
SUM (float64): 0 – 0.29
aio_char_f1 (float64): 0 – 0.9
alt-e-to-j_bert_score_ja_f1 (float64): 0 – 0.88
alt-e-to-j_bleu_ja (float64): 0 – 16
alt-e-to-j_comet_wmt22 (float64): 0.2 – 0.92
alt-j-to-e_bert_score_en_f1 (float64): 0 – 0.96
alt-j-to-e_bleu_en (float64): 0 – 20.1
alt-j-to-e_comet_wmt22 (float64): 0.17 – 0.89
chabsa_set_f1 (float64): 0 – 0.77
commonsensemoralja_exact_match (float64): 0 – 0.94
jamp_exact_match (float64): 0 – 1
janli_exact_match (float64): 0 – 1
jcommonsenseqa_exact_match (float64): 0 – 0.98
jemhopqa_char_f1 (float64): 0 – 0.71
jmmlu_exact_match (float64): 0 – 0.81
jnli_exact_match (float64): 0 – 0.94
jsem_exact_match (float64): 0 – 0.96
jsick_exact_match (float64): 0 – 0.93
jsquad_char_f1 (float64): 0 – 0.94
jsts_pearson (float64): -0.35 – 0.94
jsts_spearman (float64): -0.6 – 0.91
kuci_exact_match (float64): 0 – 0.93
mawps_exact_match (float64): 0 – 0.95
mbpp_code_exec (float64): 0 – 0.68
mbpp_pylint_check (float64): 0 – 0.99
mmlu_en_exact_match (float64): 0 – 0.86
niilc_char_f1 (float64): 0 – 0.7
wiki_coreference_set_f1 (float64): 0 – 0.4
wiki_dependency_set_f1 (float64): 0 – 0.88
wiki_ner_set_f1 (float64): 0 – 0.33
wiki_pas_set_f1 (float64): 0 – 0.57
wiki_reading_char_f1 (float64): 0 – 0.94
wikicorpus-e-to-j_bert_score_ja_f1 (float64): 0 – 0.88
wikicorpus-e-to-j_bleu_ja (float64): 0 – 24
wikicorpus-e-to-j_comet_wmt22 (float64): 0.18 – 0.87
wikicorpus-j-to-e_bert_score_en_f1 (float64): 0 – 0.93
wikicorpus-j-to-e_bleu_en (float64): 0 – 15.9
wikicorpus-j-to-e_comet_wmt22 (float64): 0.17 – 0.79
xlsum_ja_bert_score_ja_f1 (float64): 0 – 0.79
xlsum_ja_bleu_ja (float64): 0 – 10.2
xlsum_ja_rouge1 (float64): 0 – 52.8
xlsum_ja_rouge2 (float64): 0 – 29.2
xlsum_ja_rouge2_scaling (float64): 0 – 0.29
xlsum_ja_rougeLsum (float64): 0 – 44.9
architecture (stringclasses): 12 values
precision (stringclasses): 3 values
license (stringclasses): 14 values
params (float64): 0 – 70.6
likes (int64): 0 – 6.19k
revision (stringclasses): 1 value
num_few_shot (int64): 0 – 4
add_special_tokens (stringclasses): 2 values
llm_jp_eval_version (stringclasses): 1 value
vllm_version (stringclasses): 1 value
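Each model appears in the data below twice, once per `num_few_shot` setting (0 and 4). A minimal pandas sketch of working with rows shaped like this table — using the first two models' real `AVG` scores, and a best-of-few-shot selection that is my illustration rather than the leaderboard's own aggregation:

```python
import pandas as pd

# Four rows mirroring the leaderboard layout: each model has a 0-shot and a
# 4-shot run. AVG values are taken from the first records of the table.
df = pd.DataFrame(
    [
        {"model": "Steelskull/L3.3-MS-Nevoria-70b", "num_few_shot": 0, "AVG": 0.376},
        {"model": "Steelskull/L3.3-MS-Nevoria-70b", "num_few_shot": 4, "AVG": 0.6097},
        {"model": "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B", "num_few_shot": 0, "AVG": 0.0586},
        {"model": "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B", "num_few_shot": 4, "AVG": 0.2584},
    ]
)

# For each model, keep the run with the higher AVG (here, the 4-shot run).
best = df.loc[df.groupby("model")["AVG"].idxmax()].reset_index(drop=True)
print(best)
```

The same `groupby`/`idxmax` pattern extends to any of the per-task columns above once the full table is loaded into a DataFrame.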
🀝 : base merges and moerges
Steelskull/L3.3-MS-Nevoria-70b
0.376
0.004
0.2198
0.128
0.2761
0.7965
0.004
0.8297
0.7127
0.3552
0.7166
0.093
0.4227
0.8575
12.3766
0.9013
0.9526
17.3647
0.8821
0.2198
0.893
0.6063
0.7819
0.8508
0.2848
0.1926
0.5892
0.791
0.795
0.7166
0.873
0.8481
0.6457
0.004
0.004
0.008
0.3596
0.3581
0
0.0322
0.0177
0
0.59
0.8102
11.1841
0.8086
0.8912
10.7909
0.7266
0.686
3.0961
21.6755
9.2982
0.093
19.3764
LlamaForCausalLM
bfloat16
llama3.3
70.554
26
main
0
False
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
Steelskull/L3.3-MS-Nevoria-70b
0.6097
0.004
0.5668
0.286
0.7679
0.8721
0.932
0.8501
0.7713
0.6448
0.9191
0.093
0.7115
0.8643
13.241
0.908
0.9564
17.6476
0.8859
0.5668
0.9063
0.6293
0.8819
0.9383
0.6594
0.7235
0.7794
0.798
0.7678
0.9191
0.8859
0.8482
0.7715
0.932
0.004
0.008
0.8122
0.5633
0.0681
0.3587
0.115
0.0443
0.8441
0.8491
15.5805
0.8471
0.9093
12.2107
0.7592
0.686
3.0961
21.6755
9.2982
0.093
19.3764
LlamaForCausalLM
bfloat16
llama3.3
70.554
26
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B
0.0586
0
0
0.0146
0.0576
0.0284
0.004
0.3537
0.0189
0.0504
0.1079
0.0094
0.0352
0.5882
0.7584
0.3567
0.7837
3.2584
0.4069
0
0.0183
0.0086
0.0056
0.0527
0.0669
0.0031
0
0.0745
0.0059
0.1079
-0.0302
-0.0385
0.0141
0.004
0
0.002
0.1122
0.0491
0
0
0
0
0.0729
0.5585
0.6304
0.324
0.7589
2.2652
0.3274
0.5325
0.43
4.8434
0.9445
0.0094
4.0182
Qwen2ForCausalLM
bfloat16
mit
1.777
317
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B
0.2584
0
0.2205
0.0341
0.2912
0.3551
0.45
0.4364
0.4516
0.153
0.4413
0.0094
0.0634
0.6069
1.6655
0.3812
0.8571
7.6125
0.5894
0.2205
0.4649
0.3333
0.4722
0.3253
0.2969
0.2663
0.629
0.6604
0.163
0.4413
0.2962
0.3143
0.2752
0.45
0
0.002
0.3161
0.0986
0
0.0147
0.0177
0.0031
0.1351
0.5712
1.5419
0.3478
0.7834
4.6722
0.4274
0.5325
0.43
4.8434
0.9445
0.0094
4.0182
Qwen2ForCausalLM
bfloat16
mit
1.777
317
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Qwen-7B
0.1383
0.008
0.0021
0.0335
0.0593
0.3159
0.002
0.5614
0.0158
0.0841
0.4148
0.0243
0.0401
0.6783
2.357
0.4935
0.8875
7.4002
0.7125
0.0021
0.4259
0
0
0.3405
0.1414
0.0065
0.0058
0.0682
0.0049
0.4148
0.0132
0.0367
0.1813
0.002
0.008
0.008
0.112
0.0707
0
0
0
0
0.1673
0.6377
2.58
0.4739
0.8321
4.8825
0.5656
0.5836
0.7515
10.3424
2.4406
0.0243
8.2865
Qwen2ForCausalLM
bfloat16
mit
7.616
165
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Qwen-7B
0.3919
0.008
0.3165
0.0519
0.4668
0.5649
0.716
0.6355
0.5975
0.1942
0.7346
0.0243
0.1072
0.73
3.7524
0.5949
0.9134
10.8855
0.7904
0.3165
0.6804
0.4483
0.6069
0.5853
0.3207
0.4219
0.5707
0.6932
0.6686
0.7346
0.7036
0.6312
0.4291
0.716
0.008
0.008
0.5118
0.1548
0.0075
0.0677
0.0354
0.0093
0.1396
0.6651
3.8939
0.5152
0.8616
7.3599
0.6413
0.5836
0.7515
10.3424
2.4406
0.0243
8.2865
Qwen2ForCausalLM
bfloat16
mit
7.616
165
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Llama-8B
0.414
0
0.3968
0.1202
0.4458
0.5695
0.528
0.7955
0.6011
0.2284
0.8021
0.0661
0.2366
0.8215
8.3346
0.8585
0.9369
13.6565
0.8526
0.3968
0.7427
0.5057
0.5208
0.6238
0.2062
0.3784
0.6935
0.6989
0.5868
0.8021
0.7226
0.7263
0.342
0.528
0
0
0.5132
0.2425
0.0153
0.204
0.0619
0.0415
0.2782
0.7611
7.083
0.752
0.8874
8.888
0.7189
0.6556
1.6419
19.3155
6.6108
0.0661
15.8332
LlamaForCausalLM
bfloat16
mit
8.03
180
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Llama-8B
0.1254
0
0.0109
0.0295
0
0.1344
0
0.6277
0.0756
0.0889
0.346
0.0661
0.0503
0.6832
4.3784
0.6067
0.8849
8.6006
0.7143
0.0109
0
0.1006
0
0.168
0.1712
0
0.0781
0
0.1995
0.346
0.0968
0.0972
0.2353
0
0
0
0
0.0454
0
0
0
0
0.1475
0.657
3.2113
0.5775
0.8438
5.2241
0.6121
0.6556
1.6419
19.3155
6.6108
0.0661
15.8332
LlamaForCausalLM
bfloat16
mit
8.03
180
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Qwen-14B
0.2159
0.0602
0.2253
0.052
0.0076
0.3259
0
0.7427
0.2138
0.1498
0.4906
0.107
0.1055
0.7854
7.071
0.7958
0.913
10.8002
0.8082
0.2253
0.0135
0.4425
0
0.4549
0.1598
0
0.3139
0
0.3126
0.4906
0.805
0.8172
0.5093
0
0.0602
0.1606
0.0152
0.1842
0
0.0075
0.0177
0
0.2349
0.7329
6.2705
0.7001
0.861
7.1496
0.6668
0.6981
2.3878
30.0099
10.6979
0.107
21.9762
Qwen2ForCausalLM
bfloat16
mit
14.77
165
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Qwen-14B
0.5683
0.0602
0.5824
0.2182
0.6743
0.8426
0.82
0.829
0.7597
0.4593
0.8984
0.107
0.4395
0.8466
10.7019
0.8921
0.9486
15.2447
0.8762
0.5824
0.86
0.6063
0.7736
0.9374
0.5082
0.634
0.8237
0.7683
0.8263
0.8984
0.8729
0.8417
0.7304
0.82
0.0602
0.1606
0.7145
0.4304
0.0218
0.2991
0.0973
0.0419
0.6307
0.7977
8.903
0.805
0.8961
9.9181
0.7426
0.6981
2.3878
30.0099
10.6979
0.107
21.9762
Qwen2ForCausalLM
bfloat16
mit
14.77
165
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Qwen-32B
0.5934
0.01
0.5687
0.2596
0.7506
0.884
0.924
0.8438
0.7692
0.522
0.8917
0.1035
0.5201
0.8596
11.7312
0.9057
0.9531
16.7936
0.8839
0.5687
0.8963
0.6667
0.7542
0.9526
0.5339
0.7224
0.8558
0.7942
0.7751
0.8917
0.9015
0.8757
0.803
0.924
0.01
0.0422
0.7789
0.512
0.0147
0.3592
0.1416
0.0281
0.7545
0.8199
10.3248
0.8273
0.9042
10.8877
0.7582
0.693
2.3423
28.412
10.3477
0.1035
21.279
Qwen2ForCausalLM
bfloat16
mit
32.764
457
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Qwen-32B
0.2811
0.01
0.2257
0.1174
0.0001
0.7451
0
0.7844
0.4076
0.1622
0.5362
0.1035
0.1502
0.8153
10.4264
0.8509
0.9328
14.1907
0.8275
0.2257
0.5812
0.592
0.0069
0.9088
0.1515
0
0.6836
0.1351
0.6203
0.5362
0.88
0.8605
0.7454
0
0.01
0.0422
0.0002
0.1851
0.0101
0.0079
0.0295
0.0047
0.5347
0.7639
7.8897
0.7627
0.8832
8.7772
0.6965
0.693
2.3423
28.412
10.3477
0.1035
21.279
Qwen2ForCausalLM
bfloat16
mit
32.764
457
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Llama-70B
0.228
0
0.1595
0.0893
0.0995
0.4451
0
0.7472
0.244
0.2798
0.3557
0.0884
0.3481
0.7544
10.7609
0.7405
0.922
15.5448
0.8018
0.1595
0.3196
0.4828
0.0431
0.63
0.2117
0.0104
0.2403
0.108
0.3458
0.3557
0.8837
0.8617
0.3857
0
0
0
0.1886
0.2796
0
0.0214
0.0192
0.002
0.4037
0.7393
9.4647
0.7131
0.8935
10.0266
0.7332
0.6779
2.6657
23.3446
8.8398
0.0884
18.5641
LlamaForCausalLM
bfloat16
mit
70.554
229
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
deepseek-ai/DeepSeek-R1-Distill-Llama-70B
0.5827
0
0.5344
0.2325
0.7235
0.8675
0.892
0.8453
0.7467
0.5666
0.9124
0.0884
0.6559
0.8568
11.9492
0.9049
0.9557
17.2704
0.8857
0.5344
0.8945
0.658
0.7569
0.9294
0.5414
0.6769
0.7013
0.7797
0.8374
0.9124
0.8865
0.8589
0.7786
0.892
0
0
0.77
0.5025
0.0173
0.2587
0.0796
0.0501
0.7569
0.8294
13.022
0.8264
0.9098
11.7099
0.7642
0.6779
2.6657
23.3446
8.8398
0.0884
18.5641
LlamaForCausalLM
bfloat16
mit
70.554
229
main
4
False
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
hotmailuser/QwenSlerp2-14B
0.5637
0.6426
0.1958
0.1486
0.6758
0.857
0.812
0.8365
0.7442
0.3349
0.8493
0.1046
0.3984
0.844
10.0102
0.8975
0.9526
16.0273
0.8835
0.1958
0.8953
0.6466
0.8181
0.9348
0.3148
0.6704
0.7523
0.6774
0.8269
0.8493
0.8873
0.8544
0.7409
0.812
0.6426
0.9739
0.6812
0.2914
0.0188
0.013
0.0177
0.0017
0.6917
0.7953
7.8795
0.8159
0.8962
9.2042
0.749
0.6972
2.9102
27.6057
10.4645
0.1046
24.1234
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
2
main
0
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
hotmailuser/QwenSlerp2-14B
0.6465
0.6426
0.5667
0.2667
0.7391
0.8815
0.894
0.8453
0.7661
0.5055
0.8991
0.1046
0.5115
0.8611
12.673
0.906
0.9528
16.4635
0.882
0.5667
0.9006
0.6063
0.8097
0.9526
0.5759
0.71
0.8513
0.7696
0.7934
0.8991
0.8869
0.8619
0.7914
0.894
0.6426
0.9739
0.7683
0.429
0.0821
0.377
0.0177
0.0707
0.7862
0.8239
10.4218
0.8353
0.9039
10.7796
0.7577
0.6972
2.9102
27.6057
10.4645
0.1046
24.1234
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
2
main
4
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
sometimesanotion/Qwen2.5-14B-Vimarckoso-v3
0.6404
0.6225
0.5708
0.2674
0.7343
0.8771
0.89
0.8416
0.7573
0.4963
0.8856
0.1018
0.4873
0.8571
12.2369
0.9025
0.9517
16.3534
0.8806
0.5708
0.8945
0.5776
0.7986
0.95
0.5645
0.7057
0.8513
0.7563
0.8027
0.8856
0.8756
0.8574
0.7869
0.89
0.6225
0.9578
0.7629
0.4371
0.084
0.3804
0.0265
0.0729
0.7732
0.8176
9.7327
0.8274
0.9022
10.4444
0.7558
0.6958
2.6661
26.5851
10.1962
0.1018
23.3302
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
11
main
4
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
sometimesanotion/Qwen2.5-14B-Vimarckoso-v3
0.5593
0.6225
0.1956
0.1489
0.6996
0.8547
0.824
0.8306
0.7402
0.3182
0.8167
0.1018
0.3472
0.8382
8.9138
0.8897
0.951
15.4776
0.8808
0.1956
0.8955
0.6466
0.8139
0.933
0.2903
0.667
0.7465
0.6622
0.8317
0.8167
0.8881
0.8567
0.7357
0.824
0.6225
0.9578
0.7322
0.317
0.0215
0.0138
0.0177
0.0032
0.6881
0.7882
7.5397
0.8048
0.8953
9.2972
0.7471
0.6958
2.6661
26.5851
10.1962
0.1018
23.3302
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
11
main
0
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
sometimesanotion/Lamarck-14B-v0.6
0.5635
0.6325
0.2087
0.1495
0.6961
0.8548
0.816
0.8361
0.7384
0.3267
0.8372
0.1024
0.3895
0.8448
9.9323
0.8972
0.9522
15.8286
0.8826
0.2087
0.8945
0.6322
0.8153
0.9321
0.2797
0.6687
0.7473
0.6648
0.8326
0.8372
0.8874
0.8561
0.7376
0.816
0.6325
0.9639
0.7235
0.3111
0.0234
0.0121
0.0088
0.001
0.702
0.7955
7.8447
0.8157
0.8964
9.3314
0.749
0.6958
2.6458
26.9372
10.2555
0.1024
23.5995
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
13
main
0
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
sometimesanotion/Lamarck-14B-v0.6
0.6466
0.6325
0.5781
0.2665
0.7344
0.8795
0.892
0.8456
0.7643
0.5215
0.8953
0.1024
0.5032
0.8619
12.5569
0.9069
0.9529
16.5154
0.8825
0.5781
0.898
0.5977
0.7986
0.9508
0.5904
0.704
0.8583
0.7689
0.7981
0.8953
0.8809
0.8587
0.7897
0.892
0.6325
0.9639
0.7648
0.4709
0.0791
0.376
0.0177
0.0775
0.7821
0.8235
10.4555
0.8351
0.9041
10.7219
0.7578
0.6958
2.6458
26.9372
10.2555
0.1024
23.5995
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
13
main
4
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
maldv/Qwentile2.5-32B-Instruct
0.5715
0.3233
0.234
0.1439
0.7444
0.8716
0.866
0.8454
0.7511
0.4806
0.9113
0.1146
0.4784
0.8528
12.0338
0.9053
0.9549
16.6195
0.8865
0.234
0.9033
0.6322
0.7597
0.9357
0.5139
0.7182
0.8426
0.803
0.7179
0.9113
0.8892
0.8675
0.7757
0.866
0.3233
0.4116
0.7705
0.4496
0.0025
0.0069
0.0088
0.0082
0.6929
0.8076
9.3966
0.8356
0.8995
9.9257
0.7543
0.706
3.1737
28.67
11.4479
0.1146
24.92
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
31
main
0
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
maldv/Qwentile2.5-32B-Instruct
0.6457
0.3233
0.5805
0.2917
0.7864
0.8992
0.946
0.8506
0.8059
0.5826
0.9213
0.1146
0.5838
0.8662
13.2438
0.91
0.9557
17.5366
0.8856
0.5805
0.9058
0.6897
0.8236
0.9607
0.6368
0.7639
0.8948
0.8093
0.8121
0.9213
0.903
0.8802
0.8312
0.946
0.3233
0.4116
0.8089
0.5272
0.0321
0.3916
0.1239
0.0975
0.8134
0.8399
12.9473
0.8439
0.9085
11.632
0.7631
0.706
3.1737
28.67
11.4479
0.1146
24.92
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
31
main
4
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
Saxo/Linkbricks-Horizon-AI-Avengers-V4-32B
0.4838
0.008
0.1455
0.12
0.6601
0.8714
0.82
0.6421
0.7427
0.3368
0.8703
0.1045
0.298
0.7524
10.7871
0.7002
0.8516
15.6005
0.5858
0.1455
0.9023
0.6207
0.7083
0.9339
0.3785
0.6007
0.8394
0.798
0.7471
0.8703
0.8907
0.8715
0.778
0.82
0.008
0.0161
0.7196
0.334
0.0058
0.0024
0
0.0064
0.5853
0.7242
8.8307
0.6758
0.8424
8.9859
0.6066
0.6872
3.1771
26.7645
10.4431
0.1045
23.3063
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
1
main
0
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
Saxo/Linkbricks-Horizon-AI-Avengers-V4-32B
0.6106
0.008
0.5759
0.2869
0.7833
0.8965
0.936
0.821
0.8244
0.5672
0.9135
0.1045
0.578
0.8614
13.1877
0.9021
0.9527
17.7238
0.877
0.5759
0.9026
0.7356
0.8542
0.9571
0.591
0.7628
0.8981
0.8005
0.8334
0.9135
0.888
0.8779
0.8298
0.936
0.008
0.0161
0.8037
0.5324
0.0455
0.3418
0.1504
0.105
0.7916
0.7888
11.769
0.7714
0.8964
11.1378
0.7332
0.6872
3.1771
26.7645
10.4431
0.1045
23.3063
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
1
main
4
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
rombodawg/Rombos-LLM-V2.5-Qwen-32b
0.6558
0.5321
0.588
0.2738
0.7769
0.8969
0.944
0.8476
0.8105
0.541
0.9054
0.0973
0.5542
0.8643
13.2226
0.9077
0.9553
17.6746
0.8859
0.588
0.8985
0.6753
0.8417
0.958
0.5672
0.7535
0.8977
0.7797
0.8579
0.9054
0.8895
0.8772
0.8341
0.944
0.5321
0.761
0.8003
0.5015
0.054
0.3845
0
0.1132
0.8175
0.8291
11.0412
0.8387
0.9044
11.1234
0.7581
0.6931
2.7803
25.9887
9.7254
0.0973
22.6735
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
52
main
4
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
rombodawg/Rombos-LLM-V2.5-Qwen-32b
0.5456
0.5321
0.104
0.1468
0.5748
0.874
0.794
0.8386
0.7642
0.3892
0.8869
0.0973
0.441
0.8486
11.2433
0.901
0.9512
15.6734
0.8802
0.104
0.9043
0.6494
0.7944
0.9303
0.2681
0.5645
0.8188
0.7961
0.7623
0.8869
0.8954
0.8763
0.7875
0.794
0.5321
0.761
0.5851
0.4585
0.0281
0.0068
0.0354
0.0048
0.6589
0.8002
8.7465
0.8266
0.8967
9.6053
0.7466
0.6931
2.7803
25.9887
9.7254
0.0973
22.6735
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
52
main
0
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
huihui-ai/QwQ-32B-Coder-Fusion-9010
0.5861
0.6787
0.2663
0.1482
0.726
0.851
0.818
0.8413
0.7173
0.3712
0.9083
0.1205
0.2883
0.8508
11.8046
0.9031
0.9537
17.0002
0.8846
0.2663
0.8875
0.5891
0.7181
0.9205
0.4199
0.7055
0.7847
0.7898
0.7047
0.9083
0.8834
0.8608
0.745
0.818
0.6787
0.9598
0.7465
0.4055
0
0.0058
0.0265
0.0117
0.6968
0.7999
8.7856
0.8252
0.8967
9.8045
0.7525
0.7107
3.3568
30.069
12.0633
0.1205
25.87
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
6
main
0
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
huihui-ai/QwQ-32B-Coder-Fusion-9010
0.6687
0.6787
0.5718
0.2842
0.78
0.8863
0.928
0.8459
0.7777
0.5614
0.9211
0.1205
0.5616
0.8603
12.6168
0.9071
0.9546
17.0898
0.8845
0.5718
0.8873
0.6552
0.7681
0.9508
0.6308
0.7616
0.8755
0.8037
0.7859
0.9211
0.8806
0.8675
0.8209
0.928
0.6787
0.9598
0.7983
0.4917
0.0216
0.4054
0.1416
0.083
0.7692
0.8296
11.9736
0.8307
0.9055
11.4388
0.7614
0.7107
3.3568
30.069
12.0633
0.1205
25.87
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
6
main
4
True
v1.4.1
v0.6.3.post1
🟦 : RL-tuned (Preference optimization)
OpenBuddy/openbuddy-qwq-32b-v24.1-200k
0.5283
0.0863
0.238
0.1531
0.6368
0.8645
0.864
0.8462
0.7195
0.415
0.903
0.0851
0.387
0.8641
13.207
0.9071
0.9555
17.3732
0.8864
0.238
0.8955
0.6006
0.7278
0.9374
0.4488
0.6668
0.744
0.7803
0.7449
0.903
0.8863
0.8652
0.7607
0.864
0.0863
0.2048
0.6069
0.4092
0
0.001
0
0.0004
0.764
0.8155
10.073
0.8312
0.9045
10.8566
0.76
0.6852
2.963
20.4271
8.5192
0.0851
17.6866
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
2
main
0
True
v1.4.1
v0.6.3.post1
🟦 : RL-tuned (Preference optimization)
OpenBuddy/openbuddy-qwq-32b-v24.1-200k
0.6198
0.0863
0.5945
0.3002
0.7728
0.8924
0.934
0.8548
0.797
0.5719
0.9285
0.0851
0.5986
0.8715
13.5172
0.9124
0.9577
18.6591
0.8899
0.5945
0.8958
0.6609
0.8333
0.9562
0.5515
0.7436
0.8961
0.8043
0.7905
0.9285
0.8991
0.8794
0.8251
0.934
0.0863
0.2048
0.8021
0.5658
0.0351
0.4103
0.1593
0.084
0.8123
0.8473
13.5696
0.8484
0.9105
11.441
0.7686
0.6852
2.963
20.4271
8.5192
0.0851
17.6866
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
2
main
4
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
hotmailuser/RombosBeagle-v2beta-MGS-32B
0.5441
0.5301
0.1023
0.1469
0.5726
0.8745
0.782
0.8383
0.7643
0.3906
0.8882
0.0957
0.4415
0.8487
11.2853
0.8999
0.9511
15.5862
0.8798
0.1023
0.9048
0.6494
0.7944
0.9312
0.2681
0.5626
0.8176
0.7986
0.7615
0.8882
0.895
0.8761
0.7874
0.782
0.5301
0.755
0.5826
0.4623
0.0281
0.0065
0.0354
0.0071
0.6575
0.8002
8.752
0.8266
0.8968
9.5491
0.7469
0.6918
2.7757
25.4437
9.5626
0.0957
22.244
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
1
main
0
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
hotmailuser/RombosBeagle-v2beta-MGS-32B
0.6548
0.5301
0.5846
0.2732
0.7756
0.8965
0.944
0.8476
0.8103
0.5395
0.9056
0.0957
0.5552
0.8642
13.2052
0.9078
0.9552
17.689
0.8858
0.5846
0.8985
0.6753
0.8417
0.9571
0.5657
0.7515
0.8965
0.7809
0.8571
0.9056
0.8902
0.8786
0.8338
0.944
0.5301
0.755
0.7998
0.4977
0.0511
0.3807
0
0.115
0.8194
0.8292
11.0168
0.8389
0.9043
11.1169
0.758
0.6918
2.7757
25.4437
9.5626
0.0957
22.244
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
1
main
4
True
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Sakalti/SJT-2.4B-Alpha
0.3788
0.0321
0.3924
0.0674
0.3594
0.493
0.498
0.7307
0.5155
0.236
0.7913
0.0508
0.1912
0.8008
7.1291
0.8195
0.9237
11.0633
0.816
0.3924
0.7087
0.3276
0.5069
0.4602
0.3008
0.3474
0.5616
0.5852
0.5961
0.7913
0.5187
0.4502
0.3101
0.498
0.0321
0.1245
0.3714
0.2159
0.0014
0.0689
0.0177
0.0032
0.2455
0.7357
5.934
0.68
0.8566
7.3766
0.6074
0.6382
1.3203
15.799
5.0597
0.0508
13.0353
Qwen2ForCausalLM
float16
apache-2.0
2.199
0
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Sakalti/SJT-2.4B-Alpha
0.2425
0.0321
0
0.0293
0.3214
0.4097
0.03
0.6012
0.5011
0.0471
0.6442
0.0508
0.0582
0.7032
2.0223
0.6183
0.8669
6.7091
0.7416
0
0.5341
0.3391
0.4875
0.4155
0
0.3188
0.5534
0.5082
0.6174
0.6442
0.218
0.2095
0.2795
0.03
0.0321
0.1245
0.324
0.0832
0
0.0026
0
0
0.1439
0.6642
1.6266
0.5125
0.7867
2.7398
0.5325
0.6382
1.3203
15.799
5.0597
0.0508
13.0353
Qwen2ForCausalLM
float16
apache-2.0
2.199
0
main
0
False
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
sometimesanotion/Lamarck-14B-v0.7
0.6498
0.6285
0.5702
0.2694
0.7388
0.883
0.898
0.8465
0.7667
0.5354
0.9054
0.1059
0.5178
0.8622
12.3796
0.907
0.9535
16.7361
0.8834
0.5702
0.9021
0.6063
0.8
0.9508
0.5888
0.7111
0.8607
0.7765
0.7897
0.9054
0.8882
0.8633
0.796
0.898
0.6285
0.9819
0.7665
0.4996
0.0787
0.3735
0.0442
0.0775
0.773
0.826
10.7207
0.8375
0.9042
10.8731
0.7581
0.7013
2.7974
28.0711
10.6098
0.1059
24.5474
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
15
main
4
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
sometimesanotion/Lamarck-14B-v0.7
0.5725
0.6285
0.2397
0.1543
0.699
0.8587
0.816
0.838
0.7363
0.3618
0.8595
0.1059
0.4012
0.8453
10.4001
0.8985
0.9533
15.7901
0.8838
0.2397
0.8983
0.6264
0.8097
0.9374
0.3522
0.6653
0.7777
0.6477
0.82
0.8595
0.8918
0.8599
0.7403
0.816
0.6285
0.9819
0.7327
0.3321
0.0257
0.0139
0.0354
0.0017
0.6949
0.7961
8.0936
0.8187
0.8967
9.2398
0.7508
0.7013
2.7974
28.0711
10.6098
0.1059
24.5474
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
15
main
0
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
sometimesanotion/Qwenvergence-14B-v10
0.6413
0.6546
0.5543
0.2753
0.7218
0.8712
0.888
0.8431
0.7593
0.5098
0.8743
0.1032
0.4825
0.8588
12.1229
0.9031
0.9519
16.065
0.8818
0.5543
0.8868
0.5805
0.775
0.9464
0.5523
0.6888
0.8624
0.762
0.8165
0.8743
0.8827
0.8572
0.7805
0.888
0.6546
0.9639
0.7547
0.4945
0.0581
0.3931
0.0708
0.0881
0.7662
0.8193
10.2449
0.8312
0.9032
10.763
0.7562
0.6988
2.6335
27.1809
10.3292
0.1032
23.7442
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
4
main
4
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
sometimesanotion/Qwenvergence-14B-v10
0.5565
0.6546
0.2264
0.1569
0.6867
0.8399
0.822
0.832
0.7163
0.3094
0.7738
0.1032
0.3037
0.8397
9.7849
0.8921
0.9501
15.0714
0.8794
0.2264
0.8858
0.6178
0.7736
0.9169
0.277
0.6459
0.7621
0.6042
0.824
0.7738
0.8837
0.8561
0.7171
0.822
0.6546
0.9639
0.7275
0.3475
0.0242
0.0077
0.0619
0.0027
0.6879
0.7906
7.6428
0.8115
0.8951
9.3056
0.7448
0.6988
2.6335
27.1809
10.3292
0.1032
23.7442
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
4
main
0
True
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Sakalti/ultiima-14B-v0.2
0.6467
0.6365
0.575
0.2668
0.7355
0.8799
0.892
0.8458
0.7649
0.5192
0.8954
0.1031
0.5032
0.8615
12.4979
0.907
0.9529
16.4395
0.8828
0.575
0.8988
0.6006
0.8014
0.9508
0.5877
0.7066
0.8574
0.7658
0.7993
0.8954
0.8828
0.8597
0.7901
0.892
0.6365
0.9598
0.7645
0.4667
0.0803
0.3744
0.0177
0.0795
0.7823
0.8238
10.421
0.8349
0.9041
10.7092
0.7585
0.6957
2.7022
26.9496
10.3204
0.1031
23.6729
Qwen2ForCausalLM
float16
14.766
2
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Sakalti/ultiima-14B-v0.2
0.5638
0.6365
0.2048
0.1483
0.6968
0.8557
0.814
0.8363
0.7383
0.3298
0.8377
0.1031
0.3881
0.8449
9.9801
0.8975
0.9519
15.8518
0.8821
0.2048
0.895
0.6322
0.8139
0.933
0.2797
0.6679
0.7457
0.6654
0.8344
0.8377
0.8872
0.8551
0.7391
0.814
0.6365
0.9598
0.7257
0.3216
0.0234
0.0131
0.0088
0.001
0.695
0.7954
7.9523
0.8162
0.8965
9.2722
0.7492
0.6957
2.7022
26.9496
10.3204
0.1031
23.6729
Qwen2ForCausalLM
float16
14.766
2
main
0
False
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
FuseAI/FuseO1-DeepSeekR1-QwQ-SkyT1-32B-Preview
0.2608
0.002
0.2205
0.126
0
0.6027
0
0.7872
0.313
0.1733
0.5396
0.1041
0.1515
0.8177
10.7503
0.8543
0.9332
14.4756
0.8284
0.2205
0.1428
0.592
0.0014
0.9178
0.1773
0
0.5357
0.036
0.3998
0.5396
0.8933
0.8706
0.7476
0
0.002
0.0422
0
0.191
0.0151
0.0122
0.0295
0.0043
0.5692
0.7653
7.9661
0.766
0.8843
8.9062
0.7
0.6928
2.4424
28.4702
10.4272
0.1041
21.1987
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
75
main
0
False
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
FuseAI/FuseO1-DeepSeekR1-QwQ-SkyT1-32B-Preview
0.5968
0.002
0.5819
0.2656
0.7465
0.8836
0.928
0.845
0.7746
0.5402
0.893
0.1041
0.5345
0.8605
12.2451
0.9061
0.9536
17.1126
0.885
0.5819
0.8988
0.6638
0.7611
0.9482
0.5566
0.7148
0.8726
0.7929
0.7826
0.893
0.9045
0.8795
0.8039
0.928
0.002
0.0422
0.7782
0.5295
0.0148
0.366
0.1504
0.0417
0.755
0.8214
10.3568
0.8304
0.9042
10.9109
0.7585
0.6928
2.4424
28.4702
10.4272
0.1041
21.1987
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
75
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
cyberagent/DeepSeek-R1-Distill-Qwen-32B-Japanese
0.2749
0.0402
0.25
0.1361
0
0.6748
0
0.796
0.2664
0.193
0.5597
0.1074
0.248
0.8261
11.0019
0.8638
0.9344
14.2124
0.8309
0.25
0.3545
0.592
0.0014
0.9196
0.1089
0
0.3595
0.0069
0.3722
0.5597
0.8778
0.8498
0.7505
0
0.0402
0.1285
0
0.222
0.009
0.0257
0.0248
0
0.6211
0.7751
8.1994
0.7824
0.8841
8.6625
0.7068
0.6982
2.4046
29.7102
10.7517
0.1074
21.3666
Qwen2ForCausalLM
bfloat16
mit
32.764
178
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
cyberagent/DeepSeek-R1-Distill-Qwen-32B-Japanese
0.601
0.0402
0.586
0.2803
0.7474
0.8854
0.918
0.8457
0.767
0.5242
0.9097
0.1074
0.5472
0.8636
12.8788
0.9101
0.9534
16.845
0.8841
0.586
0.895
0.6494
0.7694
0.9562
0.5163
0.7238
0.8595
0.8024
0.7544
0.9097
0.9031
0.8755
0.8051
0.918
0.0402
0.1285
0.7709
0.5091
0.0127
0.374
0.1593
0.0712
0.7844
0.8247
10.5173
0.8351
0.9014
10.6611
0.7535
0.6982
2.4046
29.7102
10.7517
0.1074
21.3666
Qwen2ForCausalLM
bfloat16
mit
32.764
178
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
cyberagent/DeepSeek-R1-Distill-Qwen-14B-Japanese
0.253
0
0.2317
0.0406
0.3131
0.393
0.018
0.6805
0.3714
0.1185
0.5089
0.1072
0.0708
0.7744
6.0921
0.7583
0.8754
7.701
0.7673
0.2317
0.0797
0.4655
0.3458
0.6122
0.1289
0.1694
0.463
0.0884
0.4942
0.5089
0.4599
0.4776
0.4872
0.018
0
0
0.4567
0.1558
0.005
0.0031
0.0177
0
0.1774
0.7078
4.5242
0.6137
0.8084
3.87
0.5826
0.702
1.683
30.9219
10.7181
0.1072
22.9303
Qwen2ForCausalLM
bfloat16
mit
14.77
49
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
cyberagent/DeepSeek-R1-Distill-Qwen-14B-Japanese
0.5644
0
0.5718
0.2271
0.6729
0.8498
0.804
0.833
0.7556
0.4902
0.8962
0.1072
0.4801
0.8557
11.6901
0.902
0.9477
14.7898
0.8744
0.5718
0.8722
0.5833
0.7694
0.9428
0.5136
0.6365
0.8381
0.7797
0.8076
0.8962
0.8637
0.8333
0.7342
0.804
0
0
0.7092
0.477
0.0161
0.3131
0.0796
0.0522
0.6746
0.8067
8.927
0.818
0.8948
9.5409
0.7376
0.702
1.683
30.9219
10.7181
0.1072
22.9303
Qwen2ForCausalLM
bfloat16
mit
14.77
49
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Elizezen/Kudryavka-8B-alpha
0.2976
0.3233
0.136
0.065
0.3728
0.4982
0.054
0.7729
0.248
0.1097
0.6013
0.0925
0.0764
0.7908
7.3396
0.8267
0.9354
12.3221
0.8488
0.136
0.5701
0.4856
0.0208
0.5961
0.1707
0.2954
0.3324
0.0827
0.3184
0.6013
0.7943
0.7428
0.3284
0.054
0.3233
0.8916
0.4501
0.0819
0.0093
0
0
0
0.3155
0.7291
5.884
0.7278
0.8763
8.2281
0.6882
0.6843
3.0903
24.0948
9.2543
0.0925
21.0558
MistralForCausalLM
bfloat16
other
8.02
0
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Elizezen/Kudryavka-8B-alpha
0.5175
0.3233
0.4948
0.1617
0.5315
0.7407
0.674
0.8088
0.6041
0.4318
0.829
0.0925
0.3817
0.8389
10.1666
0.887
0.9401
13.7312
0.8597
0.4948
0.8605
0.4569
0.6361
0.8624
0.5246
0.4759
0.7679
0.4836
0.6761
0.829
0.7905
0.7543
0.4993
0.674
0.3233
0.8916
0.5872
0.389
0.004
0.1837
0
0.0678
0.5533
0.7747
8.0708
0.7717
0.8861
8.9736
0.7166
0.6843
3.0903
24.0948
9.2543
0.0925
21.0558
MistralForCausalLM
bfloat16
other
8.02
0
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Elizezen/Phos-7B
0.2288
0.01
0.0067
0.1108
0.2709
0.162
0.022
0.7891
0.3277
0.2212
0.4896
0.1067
0.2746
0.8012
8.1018
0.8553
0.9314
10.3044
0.8471
0.0067
0.0253
0.3305
0.4653
0.1895
0.2412
0.1841
0.2297
0.303
0.3101
0.4896
-0.0595
-0.0401
0.2713
0.022
0.01
0.0663
0.3576
0.1477
0.002
0.0025
0
0
0.5495
0.7333
5.5543
0.7603
0.8745
7.1771
0.6936
0.6977
2.5312
29.0285
10.6739
0.1067
24.9716
MistralForCausalLM
bfloat16
7.242
0
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Elizezen/Phos-7B
0.4552
0.01
0.4652
0.1529
0.4275
0.7865
0.392
0.8138
0.5382
0.4482
0.8665
0.1067
0.5121
0.8425
10.196
0.8944
0.9366
12.4632
0.8553
0.4652
0.8464
0.5517
0.6264
0.8749
0.4294
0.4092
0.4721
0.6869
0.3542
0.8665
0.7695
0.7349
0.638
0.392
0.01
0.0663
0.4459
0.403
0
0.0851
0.0177
0.0186
0.6433
0.7705
6.9237
0.7917
0.8832
8.0637
0.714
0.6977
2.5312
29.0285
10.6739
0.1067
24.9716
MistralForCausalLM
bfloat16
7.242
0
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Elizezen/SniffyOtter-7B
0.4003
0.0241
0.4194
0.1087
0.3652
0.629
0.328
0.7835
0.4014
0.4118
0.8284
0.1036
0.4157
0.8283
9.0917
0.8782
0.9269
11.5588
0.8306
0.4194
0.7392
0.4138
0.5
0.7319
0.4514
0.3592
0.2436
0.6231
0.2263
0.8284
0.5355
0.5133
0.4159
0.328
0.0241
0.2129
0.3712
0.3684
0.002
0.0753
0
0.0036
0.4628
0.7367
5.0849
0.7425
0.873
7.021
0.6829
0.6939
2.7184
30.1101
10.363
0.1036
25.5767
MistralForCausalLM
bfloat16
cc-by-nc-4.0
7.242
6
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Elizezen/SniffyOtter-7B
0.247
0.0241
0.0218
0.0688
0.1879
0.2313
0.058
0.7679
0.3407
0.2327
0.6803
0.1036
0.2588
0.8046
7.2539
0.8668
0.9125
8.6854
0.8067
0.0218
0.2482
0.3305
0.4958
0.1939
0.2706
0.1372
0.1865
0.4691
0.2214
0.6803
-0.2319
-0.216
0.2518
0.058
0.0241
0.2129
0.2385
0.1687
0
0
0
0
0.3438
0.7295
4.6138
0.7468
0.8632
6.4925
0.6512
0.6939
2.7184
30.1101
10.363
0.1036
25.5767
MistralForCausalLM
bfloat16
cc-by-nc-4.0
7.242
6
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Elizezen/Hameln-japanese-mistral-7B
0.1626
0.002
0
0.0448
0.1032
0.218
0
0.6561
0.1834
0.1757
0.3251
0.0807
0.1442
0.7671
4.5195
0.7888
0.7989
2.6227
0.6528
0
0.2182
0.2586
0.2861
0.185
0.2612
0.1056
0.145
0.0063
0.2208
0.3251
-0.0158
0.0009
0.2507
0
0.002
0.012
0.1007
0.1217
0
0
0
0
0.2241
0.6755
2.847
0.6499
0.8071
3.5303
0.5327
0.6573
2.0445
24.6807
8.0736
0.0807
20.3576
MistralForCausalLM
float16
7.242
5
main
0
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
Elizezen/Hameln-japanese-mistral-7B
0.3426
0.002
0.3175
0.1002
0.3326
0.5397
0.232
0.7455
0.3006
0.369
0.7483
0.0807
0.374
0.8145
8.4954
0.8488
0.9206
9.8658
0.8155
0.3175
0.7325
0.3764
0.5
0.5496
0.4051
0.3321
0.1832
0.2222
0.221
0.7483
0.2431
0.254
0.3371
0.232
0.002
0.012
0.3331
0.3279
0.0017
0.0162
0.0177
0.0146
0.4508
0.7145
4.5466
0.6919
0.8589
6.9049
0.6258
0.6573
2.0445
24.6807
8.0736
0.0807
20.3576
MistralForCausalLM
float16
7.242
5
main
4
False
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
nitky/RoguePlanet-DeepSeek-R1-Qwen-32B-RP
0.5917
0.6526
0.2062
0.1826
0.7437
0.873
0.842
0.8447
0.7202
0.4066
0.9046
0.1326
0.4187
0.853
12.2024
0.9071
0.9526
15.6972
0.8835
0.2062
0.9041
0.6063
0.7569
0.9348
0.3677
0.7151
0.7905
0.803
0.6442
0.9046
0.8996
0.8743
0.7803
0.842
0.6526
0.9859
0.7724
0.4335
0.0313
0.0481
0.0619
0.0099
0.762
0.8119
9.3453
0.8352
0.8951
9.6149
0.753
0.716
3.2994
35.5807
13.2673
0.1326
29.8536
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
2
main
0
False
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
nitky/RoguePlanet-DeepSeek-R1-Qwen-32B-RP
0.6737
0.6526
0.5913
0.3004
0.7787
0.8993
0.932
0.8491
0.7884
0.5702
0.9162
0.1326
0.5826
0.8624
13.0074
0.9093
0.9543
16.9781
0.8855
0.5913
0.9136
0.6868
0.7972
0.9634
0.5885
0.7566
0.8919
0.8087
0.7573
0.9162
0.9019
0.8771
0.821
0.932
0.6526
0.9859
0.8008
0.5394
0.0165
0.4073
0.1858
0.077
0.8155
0.8306
11.1728
0.8398
0.9049
11.0904
0.7619
0.716
3.2994
35.5807
13.2673
0.1326
29.8536
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
2
main
4
False
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
abeja/ABEJA-Qwen2.5-32b-Japanese-v0.1
0.5998
0.1305
0.5635
0.1742
0.7885
0.9167
0.94
0.6359
0.8156
0.6702
0.9067
0.056
0.742
0.7028
13.8308
0.5861
0.876
16.8251
0.6773
0.5635
0.9271
0.6983
0.8681
0.9705
0.6613
0.7772
0.8961
0.7835
0.8324
0.9067
0.8906
0.8776
0.8526
0.94
0.1305
0.2871
0.7998
0.6074
0.0826
0.175
0.0885
0.12
0.4047
0.6877
12.4455
0.6165
0.8641
10.9284
0.6637
0.6425
2.7917
15.5777
5.6085
0.056
12.8721
Qwen2ForCausalLM
float16
apache-2.0
32.764
8
main
4
True
v1.4.1
v0.6.3.post1
πŸ”Ά : fine-tuned
abeja/ABEJA-Qwen2.5-32b-Japanese-v0.1
0.1897
0.1305
0.026
0.0473
0.0181
0.6165
0
0.5309
0.2485
0.0946
0.318
0.056
0.1246
0.6664
12.917
0.5555
0.8017
15.4337
0.4539
0.026
0.1027
0.5833
0
0.9491
0.112
0.0068
0.297
0
0.3619
0.318
0.9052
0.8769
0.7977
0
0.1305
0.2871
0.0294
0.0471
0
0.0092
0
0.0008
0.2267
0.6321
9.3672
0.5738
0.8118
8.8144
0.5403
0.6425
2.7917
15.5777
5.6085
0.056
12.8721
Qwen2ForCausalLM
float16
apache-2.0
32.764
8
main
0
True
v1.4.1
v0.6.3.post1
🀝 : base merges and moerges
TeamDelta/Re-ultiima-14B
0.5512
0.4739
0.2263
0.1325
0.6875
0.8535
0.796
0.8036
0.7461
0.3916
0.8541
0.0977
0.4037
0.8416
9.1854
0.8953
0.9179
15.5119
0.8021
0.2263
0.8933
0.6322
0.8278
0.9276
0.3398
0.6631
0.7691
0.69
0.8114
0.8541
0.8835
0.8549
0.7396
0.796
0.4739
0.7249
0.712
0.4313
0.0252
0.0093
0.0177
0.0008
0.6097
0.792
7.5911
0.8119
0.875
8.837
0.7049
0.6937
2.8272
25.4006
9.7768
0.0977
22.3533
Qwen2ForCausalLM
float16
apache-2.0
14.77
2
main
0
False
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
TeamDelta/Re-ultiima-14B
0.6304
0.4739
0.5614
0.2571
0.7349
0.8812
0.89
0.8402
0.7584
0.5385
0.9016
0.0977
0.5095
0.8625
12.5037
0.9079
0.9438
16.7299
0.8622
0.5614
0.8988
0.6236
0.8264
0.9535
0.575
0.7052
0.8246
0.7588
0.7585
0.9016
0.8871
0.8576
0.7914
0.89
0.4739
0.7249
0.7646
0.531
0.0956
0.329
0.0147
0.0817
0.7645
0.8214
10.0001
0.8345
0.9021
10.5536
0.7564
0.6937
2.8272
25.4006
9.7768
0.0977
22.3533
Qwen2ForCausalLM
float16
apache-2.0
14.77
2
main
4
False
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
DataPilot/Arrival-32B-Instruct-v0.5
0.6193
0.0361
0.5733
0.3034
0.7817
0.8992
0.936
0.8504
0.7925
0.5955
0.9284
0.1156
0.6286
0.8686
13.2278
0.9114
0.9564
17.7209
0.8866
0.5733
0.9086
0.6724
0.8167
0.9544
0.6014
0.7591
0.9035
0.8018
0.7682
0.9284
0.9017
0.878
0.8345
0.936
0.0361
0.1365
0.8042
0.5566
0.0325
0.4082
0.1416
0.0925
0.8421
0.8444
14.559
0.8441
0.9084
11.5848
0.7595
0.6974
3.8413
26.8296
11.5529
0.1156
23.6805
Qwen2ForCausalLM
float16
apache-2.0
32.764
0
main
4
False
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
DataPilot/Arrival-32B-Instruct-v0.5
0.3592
0.0361
0.2289
0.1223
0.2268
0.8689
0.52
0.6106
0.5571
0.1586
0.5066
0.1156
0.1627
0.7539
12.428
0.6975
0.8402
16.8662
0.5441
0.2289
0.8973
0.5977
0.1792
0.9366
0.2108
0.3996
0.7387
0.7456
0.5245
0.5066
0.9001
0.8687
0.7727
0.52
0.0361
0.1365
0.054
0.1024
0
0.0021
0
0.0018
0.6076
0.7146
9.1587
0.6569
0.8209
9.2494
0.5439
0.6974
3.8413
26.8296
11.5529
0.1156
23.6805
Qwen2ForCausalLM
float16
apache-2.0
32.764
0
main
0
False
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
DataPilot/SKYDRIVE-32B-v0.1
0.4598
0.0221
0.1933
0.1417
0.654
0.8819
0.644
0.6865
0.7414
0.4062
0.579
0.108
0.3528
0.7238
12.1181
0.712
0.8842
16.1165
0.6844
0.1933
0.9073
0.6466
0.7097
0.9428
0.4273
0.5414
0.7913
0.8093
0.7502
0.579
0.8966
0.8795
0.7955
0.644
0.0221
0.0924
0.7667
0.4386
0.005
0.0087
0
0.0092
0.6855
0.7199
9.3944
0.7002
0.8621
9.3798
0.6494
0.6895
3.2954
27.2212
10.8122
0.108
22.3703
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
0
main
0
False
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
DataPilot/SKYDRIVE-32B-v0.1
0.6244
0.0221
0.5968
0.3221
0.7903
0.9054
0.936
0.8533
0.8083
0.6025
0.924
0.108
0.6337
0.869
14.2445
0.9128
0.9564
17.6795
0.8872
0.5968
0.9098
0.7126
0.825
0.9643
0.5977
0.7727
0.9055
0.8106
0.7879
0.924
0.8993
0.8806
0.8422
0.936
0.0221
0.0924
0.8079
0.5761
0.0371
0.4492
0.1681
0.0966
0.8593
0.8425
12.7077
0.8464
0.91
11.837
0.7667
0.6895
3.2954
27.2212
10.8122
0.108
22.3703
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
0
main
4
False
v1.4.1
v0.6.3.post1
🟦 : RL-tuned (Preference optimization)
Qwen/Qwen2.5-14B-Instruct-1M
0.4999
0.6386
0.2511
0.1405
0.54
0.8499
0.272
0.8385
0.749
0.3152
0.8062
0.0979
0.3605
0.8412
10.7022
0.9005
0.9522
15.2455
0.8835
0.2511
0.8775
0.6466
0.8028
0.9312
0.2687
0.5713
0.7169
0.7304
0.8484
0.8062
0.8816
0.8653
0.7411
0.272
0.6386
0.9578
0.5088
0.3166
0.0115
0.0076
0.0177
0
0.6656
0.7909
8.2461
0.8204
0.8976
9.6223
0.7496
0.6929
2.6984
26.447
9.7939
0.0979
22.9868
Qwen2ForCausalLM
bfloat16
apache-2.0
14.77
205
main
0
True
v1.4.1
v0.6.3.post1
🟦 : RL-tuned (Preference optimization)
Qwen/Qwen2.5-14B-Instruct-1M
0.6453
0.6386
0.5626
0.2779
0.726
0.8812
0.87
0.8442
0.7774
0.5219
0.901
0.0979
0.5246
0.86
12.6214
0.9075
0.9532
16.9305
0.8825
0.5626
0.893
0.6351
0.825
0.9571
0.5238
0.6978
0.8398
0.779
0.8082
0.901
0.8966
0.8781
0.7933
0.87
0.6386
0.9578
0.7541
0.5174
0.0437
0.4123
0.0619
0.0859
0.7857
0.813
9.4542
0.8332
0.904
10.8009
0.7536
0.6929
2.6984
26.447
9.7939
0.0979
22.9868
Qwen2ForCausalLM
bfloat16
apache-2.0
14.77
205
main
4
True
v1.4.1
v0.6.3.post1
🟦 : RL-tuned (Preference optimization)
Qwen/Qwen2.5-7B-Instruct-1M
0.5426
0.0402
0.4972
0.2155
0.6507
0.8411
0.766
0.8303
0.7503
0.3813
0.9074
0.0888
0.3718
0.8425
11.1148
0.8945
0.949
15.3523
0.876
0.4972
0.8575
0.6523
0.725
0.9214
0.4152
0.6162
0.8254
0.7513
0.7974
0.9074
0.8777
0.8546
0.7444
0.766
0.0402
0.0582
0.6852
0.3571
0.013
0.3384
0.0088
0.0844
0.633
0.787
7.8647
0.8103
0.8933
9.6075
0.7404
0.6859
2.3934
25.7085
8.8851
0.0888
22.0158
Qwen2ForCausalLM
bfloat16
apache-2.0
7.616
168
main
4
True
v1.4.1
v0.6.3.post1
🟦 : RL-tuned (Preference optimization)
Qwen/Qwen2.5-7B-Instruct-1M
0.4379
0.0402
0.2039
0.1036
0.5945
0.749
0.432
0.8136
0.7125
0.2019
0.8774
0.0888
0.2819
0.8238
7.9991
0.8782
0.9436
12.4437
0.8693
0.2039
0.856
0.5833
0.6556
0.7587
0.187
0.551
0.7362
0.7443
0.8429
0.8774
0.8447
0.8571
0.6322
0.432
0.0402
0.0582
0.6381
0.1369
0.0011
0.0132
0
0
0.5038
0.7626
5.9982
0.7863
0.8838
8.1142
0.7207
0.6859
2.3934
25.7085
8.8851
0.0888
22.0158
Qwen2ForCausalLM
bfloat16
apache-2.0
7.616
168
main
0
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
huihui-ai/DeepSeek-R1-Distill-Qwen-32B-abliterated
0.5777
0.1466
0.5233
0.2394
0.6932
0.812
0.88
0.8191
0.7593
0.4599
0.9044
0.1174
0.4274
0.8503
11.3477
0.8926
0.9462
14.9584
0.8683
0.5233
0.7986
0.6207
0.7306
0.9142
0.5454
0.6637
0.8394
0.7847
0.8214
0.9044
0.8969
0.8641
0.7231
0.88
0.1466
0.3614
0.7227
0.4068
0.024
0.3463
0.0973
0.0229
0.7062
0.792
8.55
0.7828
0.8959
9.9202
0.7328
0.7076
2.5167
33.219
11.7441
0.1174
26.0023
Qwen2ForCausalLM
bfloat16
32.764
58
main
4
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
huihui-ai/DeepSeek-R1-Distill-Qwen-32B-abliterated
0.2418
0.1466
0.1071
0.0997
0
0.4931
0
0.7644
0.2493
0.1573
0.5249
0.1174
0.1315
0.7989
8.3376
0.8133
0.9352
13.6895
0.836
0.1071
0
0.5402
0
0.8597
0.1751
0
0.3541
0
0.3519
0.5249
0.807
0.7868
0.6197
0
0.1466
0.3614
0
0.1654
0.0008
0.0144
0.0059
0
0.4773
0.7372
6.321
0.7112
0.8836
8.0274
0.6971
0.7076
2.5167
33.219
11.7441
0.1174
26.0023
Qwen2ForCausalLM
bfloat16
32.764
58
main
0
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
nicoboss/DeepSeek-R1-Distill-Qwen-32B-Uncensored
0.4494
0.4398
0.3053
0.1406
0.3356
0.8385
0.422
0.8285
0.6795
0.2341
0.6177
0.1017
0.2537
0.8408
10.5351
0.8935
0.9492
15.6976
0.8748
0.3053
0.8903
0.592
0.6625
0.8999
0.177
0.0435
0.7284
0.7828
0.632
0.6177
0.897
0.8738
0.7252
0.422
0.4398
0.7189
0.6276
0.2715
0.0198
0.0047
0.0708
0.0017
0.6058
0.7946
8.9477
0.8068
0.8928
9.9894
0.739
0.6965
3.0622
25.281
10.187
0.1017
21.9319
Qwen2ForCausalLM
bfloat16
mit
32.76
6
main
0
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
nicoboss/DeepSeek-R1-Distill-Qwen-32B-Uncensored
0.6324
0.4398
0.5535
0.2552
0.7614
0.8845
0.91
0.8338
0.7698
0.5287
0.9186
0.1017
0.5167
0.8512
10.6427
0.9001
0.9492
16.194
0.8732
0.5535
0.8945
0.6753
0.7625
0.9562
0.5869
0.7326
0.8944
0.791
0.7258
0.9186
0.8938
0.8723
0.8028
0.91
0.4398
0.7189
0.7902
0.4825
0.006
0.3569
0.1681
0.0275
0.7173
0.8119
11.0104
0.8082
0.9027
11.0451
0.7537
0.6965
3.0622
25.281
10.187
0.1017
21.9319
Qwen2ForCausalLM
bfloat16
mit
32.76
6
main
4
True
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
prithivMLmods/Qwen2.5-32B-DeepSeek-R1-Instruct
0.2498
0.0201
0.116
0.1564
0.0006
0.5663
0.01
0.7894
0.3957
0.1579
0.4402
0.0952
0.1644
0.7932
10.7724
0.8441
0.9385
15.5524
0.8501
0.116
0.005
0.5891
0.4875
0.9276
0.1728
0
0.4314
0.0878
0.3826
0.4402
0.656
0.8656
0.7664
0.01
0.0201
0.1185
0.0012
0.1364
0.0318
0.0126
0.0512
0.0059
0.6802
0.7381
8.408
0.7559
0.8808
8.9501
0.7075
0.6899
2.1685
28.0892
9.5306
0.0952
23.9055
Qwen2ForCausalLM
bfloat16
32.764
9
main
0
True
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
prithivMLmods/Qwen2.5-32B-DeepSeek-R1-Instruct
0.3303
0.0201
0.5271
0.2612
0.0186
0.7707
0.098
0.7953
0.4723
0.2049
0.3702
0.0952
0.1504
0.8407
12.1501
0.8656
0.9424
17.0151
0.8571
0.5271
0.5453
0.5891
0.1069
0.9535
0.2633
0.0175
0.4244
0.5947
0.6464
0.3702
0.8983
0.8764
0.8131
0.098
0.0201
0.1185
0.0197
0.201
0.0475
0.335
0.1363
0.0444
0.7428
0.7768
9.6497
0.7411
0.8875
10.5323
0.7173
0.6899
2.1685
28.0892
9.5306
0.0952
23.9055
Qwen2ForCausalLM
bfloat16
32.764
9
main
4
True
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
prithivMLmods/Qwen2.5-14B-DeepSeek-R1-1M
0.3891
0.2149
0.2664
0.0667
0.399
0.5616
0.146
0.793
0.7052
0.221
0.8025
0.1035
0.191
0.7985
7.6163
0.8178
0.9411
12.5682
0.8673
0.2664
0.7099
0.5546
0.6653
0.454
0.2038
0.5747
0.6931
0.7551
0.8579
0.8025
0.8923
0.8719
0.521
0.146
0.2149
0.2952
0.2233
0.2684
0.0037
0.0031
0.042
0.002
0.2824
0.7625
7.1759
0.7563
0.8873
8.8426
0.7305
0.6982
2.1385
31.1958
10.3353
0.1035
26.1257
Qwen2ForCausalLM
bfloat16
14.77
7
main
0
True
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
prithivMLmods/Qwen2.5-14B-DeepSeek-R1-1M
0.5878
0.2149
0.5589
0.2375
0.6927
0.8616
0.852
0.8312
0.7713
0.4579
0.8841
0.1035
0.4464
0.8462
10.795
0.8912
0.9485
14.7693
0.8748
0.5589
0.8823
0.6264
0.7833
0.9473
0.4731
0.6625
0.8389
0.7727
0.835
0.8841
0.8791
0.857
0.7553
0.852
0.2149
0.2952
0.723
0.4543
0.041
0.2869
0.1504
0.0246
0.6848
0.8098
9.6612
0.8159
0.8987
10.3411
0.743
0.6982
2.1385
31.1958
10.3353
0.1035
26.1257
Qwen2ForCausalLM
bfloat16
14.77
7
main
4
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
karakuri-ai/karakuri-lm-32b-thinking-2501-exp
0.6781
0.6727
0.5745
0.3144
0.7827
0.9011
0.932
0.8515
0.7932
0.6046
0.9289
0.1037
0.6254
0.8686
13.1319
0.9119
0.9566
17.5277
0.8874
0.5745
0.9103
0.6724
0.825
0.9562
0.624
0.7628
0.9026
0.8087
0.7575
0.9289
0.895
0.8782
0.8368
0.932
0.6727
0.9779
0.8026
0.5644
0.0265
0.4178
0.1593
0.1046
0.8639
0.8423
13.6707
0.844
0.9099
11.7112
0.7627
0.6974
2.9653
25.6352
10.3669
0.1037
22.3904
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
6
main
4
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
karakuri-ai/karakuri-lm-32b-thinking-2501-exp
0.5942
0.6727
0.2467
0.1709
0.7368
0.8674
0.848
0.8399
0.7286
0.4097
0.9121
0.1037
0.3342
0.8461
11.996
0.9
0.9549
16.4889
0.8861
0.2467
0.9013
0.6034
0.7264
0.9339
0.4766
0.7077
0.8077
0.7854
0.7199
0.9121
0.9029
0.872
0.767
0.848
0.6727
0.9779
0.766
0.4183
0
0.0052
0.0531
0.0066
0.7897
0.7916
9.308
0.8237
0.893
10.028
0.7499
0.6974
2.9653
25.6352
10.3669
0.1037
22.3904
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
6
main
0
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
arcee-ai/Virtuoso-Small-v2
0.6377
0.5241
0.5644
0.2532
0.7374
0.8862
0.886
0.8487
0.7784
0.5297
0.9107
0.0954
0.5445
0.8648
12.9239
0.9076
0.9552
16.9907
0.8867
0.5644
0.9013
0.6178
0.825
0.9571
0.5485
0.7094
0.8636
0.7771
0.8084
0.9107
0.9009
0.8768
0.8002
0.886
0.5241
0.8112
0.7654
0.4963
0.0664
0.3393
0.0088
0.0613
0.7902
0.8273
11.0671
0.841
0.9047
10.862
0.7596
0.6929
2.6328
24.7744
9.5455
0.0954
21.7536
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
21
main
4
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
arcee-ai/Virtuoso-Small-v2
0.5154
0.5261
0.2472
0.15
0.2882
0.8589
0.76
0.8287
0.7416
0.3038
0.8699
0.0954
0.4187
0.8496
10.6801
0.9003
0.9397
16.0981
0.8496
0.2472
0.8943
0.6264
0.8014
0.9392
0.2959
0.5337
0.7469
0.7001
0.833
0.8699
0.8986
0.8705
0.7433
0.76
0.5261
0.8112
0.0427
0.1969
0.0122
0.0124
0
0.0024
0.7228
0.7963
8.2703
0.8205
0.8947
9.4321
0.7444
0.6929
2.6328
24.7744
9.5455
0.0954
21.7536
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
21
main
0
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
arcee-ai/Virtuoso-Medium-v2
0.656
0.4578
0.58
0.3035
0.7863
0.9028
0.924
0.8468
0.8162
0.5849
0.9165
0.0972
0.5935
0.8616
13.3568
0.9015
0.9565
18.3024
0.8873
0.58
0.9073
0.7069
0.8375
0.9643
0.6129
0.7665
0.8944
0.7942
0.8478
0.9165
0.9034
0.8798
0.8368
0.924
0.4578
0.7028
0.8062
0.5482
0.0378
0.3708
0.1593
0.1178
0.8317
0.8313
12.2656
0.835
0.9087
11.5884
0.7634
0.6855
2.9249
24.9206
9.7211
0.0972
21.9868
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
35
main
4
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
arcee-ai/Virtuoso-Medium-v2
0.5379
0.4578
0.111
0.1493
0.6142
0.8769
0.846
0.7328
0.7402
0.3886
0.9027
0.0972
0.4362
0.7556
11.494
0.7076
0.9252
16.5557
0.7979
0.111
0.9036
0.6236
0.7583
0.9374
0.3001
0.5922
0.7983
0.8018
0.7193
0.9027
0.8984
0.8707
0.7898
0.846
0.4578
0.7028
0.6362
0.4295
0.0108
0.0028
0
0.0017
0.7314
0.7318
9.0666
0.6901
0.8929
9.5233
0.7356
0.6855
2.9249
24.9206
9.7211
0.0972
21.9868
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
35
main
0
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Sakalti/ultiima-14B-v0.4
0.4543
0.6044
0.2373
0.1361
0.0819
0.5405
0.746
0.8263
0.6351
0.3207
0.771
0.0985
0.3759
0.8352
10.8153
0.8769
0.9539
16.5155
0.884
0.2373
0.0005
0.5833
0.7222
0.9294
0.2807
0.1579
0.4191
0.7241
0.727
0.771
0.8878
0.8563
0.6915
0.746
0.6044
0.9317
0.0059
0.3056
0.0027
0.0073
0.0088
0.0004
0.6612
0.7844
8.4463
0.804
0.8921
9.4759
0.7404
0.694
2.6925
25.1396
9.8714
0.0985
21.9871
Qwen2ForCausalLM
float16
14.766
1
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Sakalti/ultiima-14B-v0.4
0.6473
0.6044
0.5617
0.2422
0.7339
0.8885
0.902
0.8443
0.7809
0.5443
0.9193
0.0985
0.5517
0.8641
12.5425
0.9064
0.9549
17.5272
0.885
0.5617
0.9106
0.658
0.8028
0.9598
0.5382
0.7035
0.8509
0.7771
0.8159
0.9193
0.9052
0.8771
0.7951
0.902
0.6044
0.9317
0.7644
0.5431
0.037
0.3308
0.0088
0.0618
0.7727
0.8255
11.4685
0.8362
0.9018
10.8782
0.7497
0.694
2.6925
25.1396
9.8714
0.0985
21.9871
Qwen2ForCausalLM
float16
14.766
1
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Sakalti/ultiima-52B
0.5348
0.5643
0.0042
0.1136
0.6794
0.7943
0.764
0.8356
0.7708
0.367
0.9012
0.0882
0.3662
0.8531
12.1426
0.8984
0.9523
16.7983
0.8816
0.0042
0.9063
0.6897
0.8292
0.7006
0.4177
0.6509
0.7896
0.7494
0.796
0.9012
-0.0417
-0.1063
0.7759
0.764
0.5643
0.8434
0.7078
0.3172
0.0158
0.0064
0.0177
0.0058
0.5223
0.8024
8.7688
0.82
0.8961
10.0962
0.7423
0.6847
2.7253
22.3669
8.8124
0.0882
19.8083
Qwen2ForCausalLM
float16
52.268
0
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Sakalti/ultiima-52B
0.655
0.5643
0.541
0.3003
0.782
0.8912
0.922
0.8449
0.8192
0.5488
0.9032
0.0882
0.5259
0.8613
13.059
0.9
0.9547
17.1149
0.8851
0.541
0.9031
0.75
0.9236
0.9544
0.6232
0.7619
0.8874
0.7273
0.8078
0.9032
0.8948
0.8705
0.8162
0.922
0.5643
0.8434
0.802
0.4972
0.1089
0.4594
0.0177
0.1293
0.7864
0.8332
11.7573
0.8376
0.9062
11.5743
0.7572
0.6847
2.7253
22.3669
8.8124
0.0882
19.8083
Qwen2ForCausalLM
float16
52.268
0
main
4
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Sakalti/ultiima-14B
0.5511
0.4739
0.2263
0.1325
0.6875
0.8535
0.796
0.8034
0.7461
0.3916
0.8541
0.0977
0.4037
0.8416
9.1854
0.8953
0.9176
15.5024
0.8015
0.2263
0.8933
0.6322
0.8278
0.9276
0.3398
0.6631
0.7691
0.69
0.8114
0.8541
0.8835
0.8549
0.7396
0.796
0.4739
0.7249
0.712
0.4313
0.0252
0.0093
0.0177
0.0008
0.6097
0.7919
7.5855
0.8118
0.8751
8.8554
0.7052
0.6937
2.8272
25.4006
9.7768
0.0977
22.3533
Qwen2ForCausalLM
float16
apache-2.0
14.77
1
main
0
False
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
Sakalti/ultiima-14B
0.6279
0.4739
0.5614
0.247
0.7349
0.8812
0.89
0.8247
0.7584
0.5385
0.8988
0.0977
0.5095
0.861
12.4783
0.9057
0.9153
16.5399
0.8025
0.5614
0.8988
0.6236
0.8264
0.9535
0.575
0.7052
0.8246
0.7588
0.7585
0.8988
0.8871
0.8576
0.7914
0.89
0.4739
0.7249
0.7646
0.531
0.0956
0.329
0.0147
0.0817
0.7138
0.8215
9.9945
0.8347
0.9019
10.5392
0.7559
0.6937
2.8272
25.4006
9.7768
0.0977
22.3533
Qwen2ForCausalLM
float16
apache-2.0
14.77
1
main
4
False
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
sometimesanotion/Qwenvergence-14B-v11
0.5595
0.6486
0.2143
0.1554
0.6992
0.8533
0.826
0.8333
0.7315
0.3195
0.7664
0.1066
0.3574
0.8403
9.4399
0.893
0.9515
15.2957
0.8812
0.2143
0.9001
0.6063
0.8
0.9303
0.2603
0.6637
0.7695
0.6723
0.8092
0.7664
0.8847
0.8589
0.7297
0.826
0.6486
0.9558
0.7347
0.3406
0.032
0.0096
0.0354
0.003
0.6969
0.7954
7.8429
0.8144
0.8953
9.1788
0.7446
0.7016
2.6131
27.828
10.6598
0.1066
24.3235
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
3
main
0
True
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
sometimesanotion/Qwenvergence-14B-v11
0.6465
0.6486
0.5768
0.2724
0.7302
0.8753
0.898
0.8442
0.7571
0.5182
0.8846
0.1066
0.4992
0.8593
12.0589
0.9048
0.9521
16.1279
0.8816
0.5768
0.8925
0.5862
0.7847
0.9473
0.5776
0.6967
0.8492
0.767
0.7983
0.8846
0.8817
0.8624
0.7861
0.898
0.6486
0.9558
0.7636
0.4779
0.0697
0.366
0.0796
0.0744
0.7724
0.822
10.2406
0.8329
0.9037
10.7197
0.7574
0.7016
2.6131
27.828
10.6598
0.1066
24.3235
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
3
main
4
True
v1.4.1
v0.6.3.post1
🟦 : RL-tuned (Preference optimization)
FINGU-AI/FINGU-2.5-instruct-32B
0.5987
0.01
0.5797
0.2726
0.7522
0.8863
0.92
0.847
0.7714
0.5285
0.9117
0.1068
0.5442
0.8637
12.9007
0.9096
0.9535
17.1287
0.8847
0.5797
0.8978
0.6667
0.7625
0.9535
0.5294
0.7283
0.8628
0.8011
0.7638
0.9117
0.9026
0.8746
0.8077
0.92
0.01
0.0462
0.7762
0.5118
0.0056
0.362
0.1504
0.0662
0.7789
0.8256
10.5227
0.8371
0.903
10.8504
0.7565
0.6963
2.4058
29.1618
10.6646
0.1068
21.2655
Qwen2ForCausalLM
bfloat16
mit
32.764
1
main
4
True
v1.4.1
v0.6.3.post1
🟦 : RL-tuned (Preference optimization)
FINGU-AI/FINGU-2.5-instruct-32B
0.2688
0.01
0.2323
0.1316
0
0.6705
0
0.7894
0.2742
0.1868
0.5551
0.1068
0.2031
0.8218
11.0378
0.8566
0.9341
14.3631
0.8308
0.2323
0.3332
0.5862
0.0014
0.9231
0.137
0
0.394
0.0101
0.3791
0.5551
0.889
0.863
0.7553
0
0.01
0.0462
0
0.2203
0.0099
0.0186
0.0354
0.0037
0.5904
0.7682
8.1401
0.7702
0.8827
8.569
0.7
0.6963
2.4058
29.1618
10.6646
0.1068
21.2655
Qwen2ForCausalLM
bfloat16
mit
32.764
1
main
0
True
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
sometimesanotion/Qwenvergence-14B-v12-Prose-DS
0.6495
0.6145
0.5662
0.2586
0.7452
0.8883
0.9
0.8465
0.7729
0.5289
0.9125
0.1109
0.5403
0.8619
12.5295
0.907
0.9529
16.4442
0.8825
0.5662
0.9033
0.6379
0.8028
0.958
0.5724
0.7176
0.8665
0.7835
0.7737
0.9125
0.9023
0.8719
0.8036
0.9
0.6145
0.9859
0.7728
0.474
0.0655
0.3733
0.0088
0.0598
0.7856
0.8293
11.4436
0.8383
0.9049
11.1836
0.7581
0.7022
2.9257
29.1169
11.0881
0.1109
25.3769
Qwen2ForCausalLM
bfloat16
14.766
5
main
4
True
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
sometimesanotion/Qwenvergence-14B-v12-Prose-DS
0.5356
0.6145
0.1722
0.1472
0.391
0.8585
0.808
0.8408
0.7421
0.3312
0.8758
0.1109
0.4125
0.8496
10.5574
0.9016
0.9536
15.9425
0.8844
0.1722
0.8955
0.6063
0.8056
0.941
0.3788
0.6131
0.7629
0.7292
0.8066
0.8758
0.8912
0.857
0.7389
0.808
0.6145
0.9859
0.1689
0.2021
0.0199
0.005
0
0.004
0.7071
0.8031
8.4726
0.8259
0.8968
9.6732
0.7514
0.7022
2.9257
29.1169
11.0881
0.1109
25.3769
Qwen2ForCausalLM
bfloat16
14.766
5
main
0
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
NaniDAO/deepseek-r1-qwen-2.5-32B-ablated
0.5889
0.01
0.5683
0.263
0.7432
0.8759
0.924
0.8431
0.7581
0.5187
0.8727
0.1004
0.5147
0.86
11.9322
0.9056
0.9526
16.6474
0.8828
0.5683
0.88
0.6552
0.7278
0.9473
0.5343
0.7148
0.8492
0.7879
0.7702
0.8727
0.9073
0.8768
0.8005
0.924
0.01
0.0522
0.7716
0.5072
0.014
0.3687
0.1416
0.036
0.7546
0.8188
10.2842
0.8273
0.9037
10.9257
0.7567
0.6898
2.3485
27.255
10.05
0.1004
20.2093
Qwen2ForCausalLM
bfloat16
mit
32.764
37
main
4
True
v1.4.1
v0.6.3.post1
🔶 : fine-tuned
NaniDAO/deepseek-r1-qwen-2.5-32B-ablated
0.2896
0.01
0.2342
0.1155
0
0.8147
0
0.7812
0.4286
0.164
0.5368
0.1004
0.1503
0.8135
10.3849
0.8446
0.9304
14.1299
0.8253
0.2342
0.8014
0.5632
0.0653
0.9071
0.1517
0
0.636
0.2134
0.6649
0.5368
0.8788
0.8558
0.7357
0
0.01
0.0522
0
0.1899
0.0116
0.008
0.0295
0.0068
0.5215
0.7599
7.6814
0.7582
0.8827
8.7336
0.6967
0.6898
2.3485
27.255
10.05
0.1004
20.2093
Qwen2ForCausalLM
bfloat16
mit
32.764
37
main
0
True
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
sometimesanotion/Qwenvergence-14B-v13-Prose-DS
0.6484
0.6225
0.5747
0.2761
0.7387
0.8845
0.896
0.8437
0.7635
0.5274
0.8943
0.1109
0.5107
0.8582
11.9407
0.9037
0.9518
16.6065
0.881
0.5747
0.9008
0.5977
0.8042
0.9553
0.5637
0.7097
0.8521
0.779
0.7847
0.8943
0.8902
0.8682
0.7973
0.896
0.6225
0.992
0.7676
0.5079
0.0817
0.3696
0.0885
0.0632
0.7774
0.8239
10.6053
0.833
0.9036
10.8802
0.7571
0.7057
2.7809
28.684
11.0875
0.1109
25.0232
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
5
main
4
True
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
sometimesanotion/Qwenvergence-14B-v13-Prose-DS
0.5702
0.6225
0.2118
0.1587
0.7056
0.8583
0.836
0.8344
0.7472
0.3481
0.8385
0.1109
0.3785
0.8432
9.945
0.8934
0.9523
16.0119
0.8829
0.2118
0.892
0.6293
0.8056
0.9374
0.3479
0.6701
0.7629
0.7165
0.8216
0.8385
0.891
0.8561
0.7453
0.836
0.6225
0.992
0.7411
0.3179
0.0222
0.0104
0.0619
0
0.699
0.7963
7.8284
0.8125
0.8961
9.5003
0.7488
0.7057
2.7809
28.684
11.0875
0.1109
25.0232
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
5
main
0
True
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
CultriX/Qwen2.5-14B-Hyperionv4
0.6497
0.6165
0.5639
0.2558
0.7393
0.8882
0.906
0.8475
0.7781
0.5405
0.9085
0.1027
0.5485
0.8653
13.1627
0.9095
0.9544
17.1089
0.8842
0.5639
0.9063
0.6351
0.8194
0.958
0.557
0.7114
0.8652
0.7778
0.7928
0.9085
0.8987
0.8721
0.8003
0.906
0.6165
0.9739
0.7672
0.5159
0.0494
0.357
0.0088
0.0662
0.7975
0.8295
11.3193
0.8403
0.9047
11.1805
0.756
0.6936
2.6947
27.4441
10.2671
0.1027
23.8588
Qwen2ForCausalLM
bfloat16
14.766
2
main
4
True
v1.4.1
v0.6.3.post1
🤝 : base merges and moerges
CultriX/Qwen2.5-14B-Hyperionv4
0.5506
0.6165
0.2202
0.1518
0.4392
0.8534
0.82
0.8408
0.7415
0.39
0.8805
0.1027
0.458
0.8514
10.4939
0.9041
0.9538
16.0908
0.8842
0.2202
0.8888
0.6264
0.8014
0.9339
0.3643
0.6507
0.7428
0.7102
0.8267
0.8805
0.8906
0.8577
0.7376
0.82
0.6165
0.9739
0.2277
0.3477
0.0183
0.0166
0
0
0.724
0.8021
8.2297
0.8257
0.8961
9.2811
0.7491
0.6936
2.6947
27.4441
10.2671
0.1027
23.8588
Qwen2ForCausalLM
bfloat16
14.766
2
main
0
True
v1.4.1
v0.6.3.post1