Dataset schema (one row per column; for numeric columns the range is the observed minimum and maximum across all rows, for string columns the length range or number of distinct values):

| Column | dtype | Observed values / range |
|---|---|---|
| model_type | string | 5 distinct values |
| model | string | length 12–62 |
| AVG | float64 | 0.03 – 0.7 |
| CG | float64 | 0 – 0.68 |
| EL | float64 | 0 – 0.77 |
| FA | float64 | 0 – 0.62 |
| HE | float64 | 0 – 0.83 |
| MC | float64 | 0 – 0.95 |
| MR | float64 | 0 – 0.95 |
| MT | float64 | 0.19 – 0.86 |
| NLI | float64 | 0 – 0.97 |
| QA | float64 | 0 – 0.77 |
| RC | float64 | 0 – 0.94 |
| SUM | float64 | 0 – 0.29 |
| aio_char_f1 | float64 | 0 – 0.9 |
| alt-e-to-j_bert_score_ja_f1 | float64 | 0 – 0.88 |
| alt-e-to-j_bleu_ja | float64 | 0 – 16 |
| alt-e-to-j_comet_wmt22 | float64 | 0.2 – 0.92 |
| alt-j-to-e_bert_score_en_f1 | float64 | 0 – 0.96 |
| alt-j-to-e_bleu_en | float64 | 0 – 20.1 |
| alt-j-to-e_comet_wmt22 | float64 | 0.17 – 0.89 |
| chabsa_set_f1 | float64 | 0 – 0.77 |
| commonsensemoralja_exact_match | float64 | 0 – 0.94 |
| jamp_exact_match | float64 | 0 – 1 |
| janli_exact_match | float64 | 0 – 1 |
| jcommonsenseqa_exact_match | float64 | 0 – 0.98 |
| jemhopqa_char_f1 | float64 | 0 – 0.71 |
| jmmlu_exact_match | float64 | 0 – 0.81 |
| jnli_exact_match | float64 | 0 – 0.94 |
| jsem_exact_match | float64 | 0 – 0.96 |
| jsick_exact_match | float64 | 0 – 0.93 |
| jsquad_char_f1 | float64 | 0 – 0.94 |
| jsts_pearson | float64 | -0.35 – 0.94 |
| jsts_spearman | float64 | -0.6 – 0.91 |
| kuci_exact_match | float64 | 0 – 0.93 |
| mawps_exact_match | float64 | 0 – 0.95 |
| mbpp_code_exec | float64 | 0 – 0.68 |
| mbpp_pylint_check | float64 | 0 – 0.99 |
| mmlu_en_exact_match | float64 | 0 – 0.86 |
| niilc_char_f1 | float64 | 0 – 0.7 |
| wiki_coreference_set_f1 | float64 | 0 – 0.4 |
| wiki_dependency_set_f1 | float64 | 0 – 0.88 |
| wiki_ner_set_f1 | float64 | 0 – 0.33 |
| wiki_pas_set_f1 | float64 | 0 – 0.57 |
| wiki_reading_char_f1 | float64 | 0 – 0.94 |
| wikicorpus-e-to-j_bert_score_ja_f1 | float64 | 0 – 0.88 |
| wikicorpus-e-to-j_bleu_ja | float64 | 0 – 24 |
| wikicorpus-e-to-j_comet_wmt22 | float64 | 0.18 – 0.87 |
| wikicorpus-j-to-e_bert_score_en_f1 | float64 | 0 – 0.93 |
| wikicorpus-j-to-e_bleu_en | float64 | 0 – 15.9 |
| wikicorpus-j-to-e_comet_wmt22 | float64 | 0.17 – 0.79 |
| xlsum_ja_bert_score_ja_f1 | float64 | 0 – 0.79 |
| xlsum_ja_bleu_ja | float64 | 0 – 10.2 |
| xlsum_ja_rouge1 | float64 | 0 – 52.8 |
| xlsum_ja_rouge2 | float64 | 0 – 29.2 |
| xlsum_ja_rouge2_scaling | float64 | 0 – 0.29 |
| xlsum_ja_rougeLsum | float64 | 0 – 44.9 |
| architecture | string | 12 distinct values |
| precision | string | 3 distinct values |
| license | string | 14 distinct values |
| params | float64 | 0 – 70.6 |
| likes | int64 | 0 – 6.19k |
| revision | string | 1 distinct value |
| num_few_shot | int64 | 0 – 4 |
| add_special_tokens | string | 2 distinct values |
| llm_jp_eval_version | string | 1 distinct value |
| vllm_version | string | 1 distinct value |
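With the schema above, the table can be explored directly in pandas. A minimal sketch, assuming the leaderboard has been exported to a local `leaderboard.parquet` file (the file name is hypothetical; point it at whatever export of this dataset you have):

```python
import pandas as pd

# Hypothetical local export of this leaderboard table; substitute the real path.
df = pd.read_parquet("leaderboard.parquet")

# Each model appears once per few-shot setting (num_few_shot is 0 or 4),
# so fix one setting before ranking models against each other.
four_shot = df[df["num_few_shot"] == 4]

# Rank by the overall average score across task categories.
top = four_shot.sort_values("AVG", ascending=False)
print(top[["model", "model_type", "params", "AVG"]].to_string(index=False))
```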
Evaluation results, one row per model and few-shot setting:

| model_type | model | AVG | CG | EL | FA | HE | MC | MR | MT | NLI | QA | RC | SUM | aio_char_f1 | alt-e-to-j_bert_score_ja_f1 | alt-e-to-j_bleu_ja | alt-e-to-j_comet_wmt22 | alt-j-to-e_bert_score_en_f1 | alt-j-to-e_bleu_en | alt-j-to-e_comet_wmt22 | chabsa_set_f1 | commonsensemoralja_exact_match | jamp_exact_match | janli_exact_match | jcommonsenseqa_exact_match | jemhopqa_char_f1 | jmmlu_exact_match | jnli_exact_match | jsem_exact_match | jsick_exact_match | jsquad_char_f1 | jsts_pearson | jsts_spearman | kuci_exact_match | mawps_exact_match | mbpp_code_exec | mbpp_pylint_check | mmlu_en_exact_match | niilc_char_f1 | wiki_coreference_set_f1 | wiki_dependency_set_f1 | wiki_ner_set_f1 | wiki_pas_set_f1 | wiki_reading_char_f1 | wikicorpus-e-to-j_bert_score_ja_f1 | wikicorpus-e-to-j_bleu_ja | wikicorpus-e-to-j_comet_wmt22 | wikicorpus-j-to-e_bert_score_en_f1 | wikicorpus-j-to-e_bleu_en | wikicorpus-j-to-e_comet_wmt22 | xlsum_ja_bert_score_ja_f1 | xlsum_ja_bleu_ja | xlsum_ja_rouge1 | xlsum_ja_rouge2 | xlsum_ja_rouge2_scaling | xlsum_ja_rougeLsum | architecture | precision | license | params | likes | revision | num_few_shot | add_special_tokens | llm_jp_eval_version | vllm_version |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| ⭕ : instruction-tuned | meta-llama/Meta-Llama-3-8B-Instruct | 0.2164 | 0 | 0.0052 | 0.061 | 0.0016 | 0.3121 | 0.022 | 0.764 | 0.3398 | 0.1971 | 0.664 | 0.0138 | 0.1527 | 0.8169 | 8.0639 | 0.8448 | 0.9364 | 11.7792 | 0.8494 | 0.0052 | 0 | 0.4368 | 0.5681 | 0.5898 | 0.2475 | 0 | 0.2999 | 0.0038 | 0.3905 | 0.664 | 0.7159 | 0.7167 | 0.3465 | 0.022 | 0 | 0.012 | 0.0033 | 0.191 | 0 | 0.0038 | 0 | 0 | 0.301 | 0.7386 | 5.917 | 0.7027 | 0.874 | 8.2024 | 0.659 | 0.6158 | 0.341 | 5.7664 | 1.371 | 0.0138 | 5.064 | LlamaForCausalLM | bfloat16 | llama3 | 8.03 | 3,974 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Meta-Llama-3-8B-Instruct | 0.4962 | 0 | 0.4758 | 0.2329 | 0.5383 | 0.753 | 0.732 | 0.8226 | 0.6059 | 0.3892 | 0.8946 | 0.0138 | 0.3947 | 0.8444 | 10.3245 | 0.8826 | 0.9437 | 14.5237 | 0.8623 | 0.4758 | 0.7801 | 0.5144 | 0.6431 | 0.8767 | 0.3812 | 0.4668 | 0.6109 | 0.6439 | 0.6172 | 0.8946 | 0.8216 | 0.8015 | 0.6024 | 0.732 | 0 | 0.012 | 0.6098 | 0.3918 | 0.0189 | 0.3115 | 0.0796 | 0.0805 | 0.674 | 0.8115 | 8.9124 | 0.8116 | 0.8951 | 9.6677 | 0.7337 | 0.6158 | 0.341 | 5.7664 | 1.371 | 0.0138 | 5.064 | LlamaForCausalLM | bfloat16 | llama3 | 8.03 | 3,974 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | meta-llama/Llama-3.2-3B | 0.2124 | 0.1707 | 0.0098 | 0.0394 | 0.0288 | 0.3178 | 0.016 | 0.6915 | 0.4239 | 0.1329 | 0.4803 | 0.0256 | 0.1056 | 0.7762 | 6.4825 | 0.7696 | 0.9114 | 9.3192 | 0.803 | 0.0098 | 0.5303 | 0.3793 | 0.5 | 0.1716 | 0.1396 | 0.0364 | 0.3016 | 0.6414 | 0.2971 | 0.4803 | 0 | 0 | 0.2515 | 0.016 | 0.1707 | 0.5141 | 0.0211 | 0.1536 | 0 | 0.002 | 0 | 0 | 0.1951 | 0.7016 | 4.4551 | 0.6147 | 0.8282 | 4.6005 | 0.5786 | 0.5844 | 1.066 | 12.6667 | 2.5479 | 0.0256 | 10.9324 | LlamaForCausalLM | bfloat16 | llama3.2 | 3.213 | 568 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | meta-llama/Llama-3.2-3B | 0.4225 | 0.1707 | 0.3862 | 0.1803 | 0.409 | 0.5501 | 0.616 | 0.7875 | 0.3375 | 0.3568 | 0.828 | 0.0256 | 0.3136 | 0.8282 | 9.0363 | 0.8587 | 0.9381 | 13.7033 | 0.8508 | 0.3862 | 0.7347 | 0.3391 | 0.5264 | 0.5898 | 0.4377 | 0.3423 | 0.3053 | 0.3327 | 0.1841 | 0.828 | 0.0481 | 0.0889 | 0.3259 | 0.616 | 0.1707 | 0.5141 | 0.4758 | 0.3191 | 0.0106 | 0.217 | 0.0177 | 0.0543 | 0.6019 | 0.7744 | 8.4631 | 0.7492 | 0.8819 | 9.0517 | 0.6912 | 0.5844 | 1.066 | 12.6667 | 2.5479 | 0.0256 | 10.9324 | LlamaForCausalLM | bfloat16 | llama3.2 | 3.213 | 568 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | meta-llama/Llama-3.2-1B | 0.2788 | 0.1044 | 0.2863 | 0.0576 | 0.254 | 0.3175 | 0.158 | 0.6971 | 0.3619 | 0.2338 | 0.5554 | 0.0405 | 0.1684 | 0.7838 | 6.7944 | 0.7733 | 0.9181 | 11.059 | 0.7996 | 0.2863 | 0.493 | 0.3563 | 0.4986 | 0.2082 | 0.3274 | 0.2448 | 0.5559 | 0.1667 | 0.2322 | 0.5554 | -0.1145 | -0.1219 | 0.2514 | 0.158 | 0.1044 | 0.5964 | 0.2631 | 0.2056 | 0.0039 | 0.0802 | 0 | 0.0196 | 0.1842 | 0.7014 | 6.145 | 0.6129 | 0.8536 | 6.4619 | 0.6024 | 0.5952 | 1.1235 | 12.2139 | 4.0405 | 0.0405 | 10.3837 | LlamaForCausalLM | bfloat16 | llama3.2 | 1.236 | 1,910 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | meta-llama/Llama-3.2-1B | 0.1055 | 0.1044 | 0 | 0.0343 | 0.0489 | 0.115 | 0.006 | 0.4856 | 0 | 0.1045 | 0.2211 | 0.0405 | 0.0655 | 0.6909 | 0.4139 | 0.5445 | 0.7509 | 0.2066 | 0.5013 | 0 | 0.1906 | 0 | 0 | 0.0009 | 0.1438 | 0.0124 | 0 | 0 | 0 | 0.2211 | -0.0273 | -0.0221 | 0.1533 | 0.006 | 0.1044 | 0.5964 | 0.0853 | 0.1042 | 0 | 0 | 0 | 0 | 0.1714 | 0.6585 | 0.6969 | 0.4661 | 0.7447 | 0.2973 | 0.4303 | 0.5952 | 1.1235 | 12.2139 | 4.0405 | 0.0405 | 10.3837 | LlamaForCausalLM | bfloat16 | llama3.2 | 1.236 | 1,910 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | meta-llama/Llama-3.1-70B | 0.5941 | 0.0141 | 0.5255 | 0.2959 | 0.6845 | 0.8791 | 0.88 | 0.8556 | 0.7397 | 0.6487 | 0.9229 | 0.089 | 0.7296 | 0.8715 | 13.7469 | 0.911 | 0.9583 | 18.699 | 0.8902 | 0.5255 | 0.9063 | 0.5977 | 0.7889 | 0.9419 | 0.6145 | 0.6351 | 0.7506 | 0.7961 | 0.765 | 0.9229 | 0.9018 | 0.8696 | 0.7891 | 0.88 | 0.0141 | 0.0281 | 0.7339 | 0.602 | 0.0565 | 0.3599 | 0.1239 | 0.0671 | 0.872 | 0.8556 | 15.4783 | 0.8542 | 0.9143 | 12.6925 | 0.767 | 0.6861 | 2.1497 | 22.176 | 8.9001 | 0.089 | 19.3898 | LlamaForCausalLM | bfloat16 | llama3.1 | 70.554 | 366 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | meta-llama/Llama-3.1-70B | 0.4186 | 0.0141 | 0.3199 | 0.1591 | 0.2133 | 0.5708 | 0.494 | 0.838 | 0.6159 | 0.3977 | 0.8933 | 0.089 | 0.4266 | 0.8508 | 11.455 | 0.9003 | 0.9533 | 16.5452 | 0.8831 | 0.3199 | 0.7375 | 0.4856 | 0.6778 | 0.5541 | 0.3359 | 0.1065 | 0.4281 | 0.7544 | 0.7335 | 0.8933 | 0.69 | 0.7523 | 0.421 | 0.494 | 0.0141 | 0.0281 | 0.3202 | 0.4306 | 0 | 0.0179 | 0.0177 | 0.0024 | 0.7572 | 0.8049 | 8.9761 | 0.8185 | 0.9031 | 10.4305 | 0.75 | 0.6861 | 2.1497 | 22.176 | 8.9001 | 0.089 | 19.3898 | LlamaForCausalLM | bfloat16 | llama3.1 | 70.554 | 366 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-1.1-7b-it | 0.149 | 0 | 0.0082 | 0.0191 | 0.2669 | 0.3215 | 0.056 | 0.2866 | 0.4414 | 0.0915 | 0.1224 | 0.0257 | 0.0225 | 0.6226 | 0.0924 | 0.3582 | 0.8167 | 3.2642 | 0.3855 | 0.0082 | 0.4679 | 0.3391 | 0.5 | 0.2502 | 0.1948 | 0.2776 | 0.5534 | 0.5934 | 0.2212 | 0.1224 | 0.0158 | -0.0669 | 0.2464 | 0.056 | 0 | 0 | 0.2562 | 0.0573 | 0 | 0 | 0.0177 | 0 | 0.0779 | 0.5129 | 0.0698 | 0.1785 | 0.765 | 2.6153 | 0.2243 | 0.5724 | 1.4248 | 14.2917 | 2.5693 | 0.0257 | 11.9555 | GemmaForCausalLM | bfloat16 | gemma | 8.538 | 273 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-1.1-7b-it | 0.0655 | 0 | 0 | 0.0167 | 0.0089 | 0.0909 | 0 | 0.3642 | 0 | 0.0603 | 0.1541 | 0.0257 | 0.0331 | 0.6638 | 0.2425 | 0.4453 | 0.7408 | 0.3664 | 0.3466 | 0 | 0.0075 | 0 | 0 | 0.0715 | 0.0847 | 0 | 0 | 0 | 0 | 0.1541 | -0.0452 | -0.041 | 0.1937 | 0 | 0 | 0 | 0.0179 | 0.063 | 0 | 0 | 0 | 0 | 0.0835 | 0.5913 | 0.6693 | 0.3503 | 0.724 | 0.5251 | 0.3145 | 0.5724 | 1.4248 | 14.2917 | 2.5693 | 0.0257 | 11.9555 | GemmaForCausalLM | bfloat16 | gemma | 8.538 | 273 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-1.1-2b-it | 0.0952 | 0 | 0 | 0.0247 | 0.0468 | 0.078 | 0.006 | 0.4931 | 0.0017 | 0.061 | 0.2914 | 0.0446 | 0.0499 | 0.6982 | 0.2783 | 0.548 | 0.7591 | 0.3118 | 0.5581 | 0 | 0 | 0 | 0.0083 | 0.0009 | 0.0443 | 0.0799 | 0 | 0 | 0 | 0.2914 | 0 | 0 | 0.2332 | 0.006 | 0 | 0 | 0.0136 | 0.0887 | 0 | 0 | 0 | 0 | 0.1235 | 0.6541 | 0.597 | 0.4451 | 0.7534 | 0.5065 | 0.421 | 0.6555 | 1.0033 | 24.5415 | 4.4743 | 0.0446 | 19.6346 | GemmaForCausalLM | bfloat16 | gemma | 2.506 | 159 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-1.1-2b-it | 0.2876 | 0 | 0.2493 | 0.0624 | 0.3238 | 0.3532 | 0.328 | 0.5771 | 0.3758 | 0.2095 | 0.6401 | 0.0446 | 0.1351 | 0.7513 | 4.6594 | 0.6695 | 0.8988 | 9.3482 | 0.7202 | 0.2493 | 0.4647 | 0.3333 | 0.5 | 0.3432 | 0.3782 | 0.3067 | 0.3164 | 0.5676 | 0.1618 | 0.6401 | 0.0602 | 0.074 | 0.2517 | 0.328 | 0 | 0 | 0.341 | 0.1153 | 0.0063 | 0.0289 | 0.0531 | 0.0131 | 0.2107 | 0.5938 | 0.1743 | 0.3774 | 0.8342 | 4.955 | 0.5412 | 0.6555 | 1.0033 | 24.5415 | 4.4743 | 0.0446 | 19.6346 | GemmaForCausalLM | bfloat16 | gemma | 2.506 | 159 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-2b-it | 0.274 | 0 | 0.2404 | 0.0464 | 0.3186 | 0.3678 | 0.238 | 0.552 | 0.4062 | 0.1644 | 0.631 | 0.0493 | 0.1191 | 0.7334 | 3.6557 | 0.6267 | 0.8865 | 7.8681 | 0.6728 | 0.2404 | 0.4749 | 0.3333 | 0.5 | 0.378 | 0.3014 | 0.2999 | 0.4671 | 0.5688 | 0.1618 | 0.631 | 0 | 0 | 0.2505 | 0.238 | 0 | 0 | 0.3373 | 0.0728 | 0.0033 | 0.0162 | 0.0265 | 0.0083 | 0.1778 | 0.6264 | 0.1199 | 0.4039 | 0.8268 | 4.9351 | 0.5046 | 0.6583 | 1.183 | 25.2621 | 4.9277 | 0.0493 | 19.8895 | GemmaForCausalLM | bfloat16 | gemma | 2.506 | 744 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-2b-it | 0.0954 | 0 | 0 | 0.0228 | 0.0002 | 0.1768 | 0 | 0.4571 | 0.0721 | 0.0572 | 0.2137 | 0.0493 | 0.0453 | 0.6949 | 0.2568 | 0.5367 | 0.7531 | 0.1883 | 0.4722 | 0 | 0.384 | 0 | 0.3458 | 0 | 0.0324 | 0.0003 | 0.0148 | 0 | 0 | 0.2137 | -0.0554 | -0.0492 | 0.1464 | 0 | 0 | 0 | 0.0001 | 0.0939 | 0 | 0 | 0 | 0 | 0.114 | 0.6539 | 0.4013 | 0.4424 | 0.7482 | 0.1519 | 0.377 | 0.6583 | 1.183 | 25.2621 | 4.9277 | 0.0493 | 19.8895 | GemmaForCausalLM | bfloat16 | gemma | 2.506 | 744 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | google/gemma-7b | 0.0383 | 0 | 0 | 0.0005 | 0.0004 | 0.1606 | 0 | 0.2273 | 0 | 0.0082 | 0.0241 | 0 | 0.0008 | 0.4752 | 0.2585 | 0.203 | 0.3384 | 0.4239 | 0.2719 | 0 | 0.0003 | 0 | 0 | 0.2386 | 0.0147 | 0.0006 | 0 | 0 | 0 | 0.0241 | 0.0017 | 0.0096 | 0.2429 | 0 | 0 | 0.8414 | 0.0002 | 0.0089 | 0 | 0 | 0 | 0 | 0.0027 | 0.4224 | 0.5033 | 0.188 | 0.4309 | 0.542 | 0.2463 | 0.2849 | 0.0329 | 0.0152 | 0 | 0 | 0.0149 | GemmaForCausalLM | bfloat16 | gemma | 8.538 | 3,168 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | google/gemma-7b | 0.036 | 0 | 0 | 0.0005 | 0 | 0.1547 | 0.002 | 0.186 | 0.009 | 0.0105 | 0.0327 | 0 | 0.0005 | 0.4809 | 0.131 | 0.2199 | 0.6864 | 2.4121 | 0.1721 | 0 | 0.0008 | 0.0057 | 0 | 0.2154 | 0.0279 | 0 | 0.0008 | 0.0328 | 0.0057 | 0.0327 | -0.0083 | -0.009 | 0.248 | 0.002 | 0 | 0.8414 | 0 | 0.003 | 0 | 0 | 0 | 0 | 0.0026 | 0.4446 | 0.1832 | 0.1822 | 0.6847 | 3.6753 | 0.17 | 0.2849 | 0.0329 | 0.0152 | 0 | 0 | 0.0149 | GemmaForCausalLM | bfloat16 | gemma | 8.538 | 3,168 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-7b-it | 0.045 | 0 | 0 | 0.0212 | 0.0001 | 0 | 0 | 0.3439 | 0 | 0.0387 | 0.0606 | 0.03 | 0.0244 | 0.6495 | 0.1817 | 0.4037 | 0.7356 | 0.0919 | 0.3215 | 0 | 0 | 0 | 0 | 0 | 0.0627 | 0 | 0 | 0 | 0 | 0.0606 | 0.0047 | 0.0112 | 0 | 0 | 0 | 0.0361 | 0.0002 | 0.0289 | 0 | 0 | 0 | 0 | 0.1062 | 0.6091 | 0.3699 | 0.3527 | 0.7188 | 0.1897 | 0.2978 | 0.6181 | 1.961 | 18.9243 | 3.009 | 0.03 | 15.5497 | GemmaForCausalLM | bfloat16 | gemma | 8.538 | 1,174 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-7b-it | 0.1498 | 0 | 0.004 | 0.0234 | 0.2555 | 0.3292 | 0.01 | 0.3307 | 0.5303 | 0.0721 | 0.0628 | 0.03 | 0.0463 | 0.6154 | 0.0374 | 0.3252 | 0.842 | 5.3864 | 0.4182 | 0.004 | 0.522 | 0.3276 | 0.4819 | 0.2154 | 0.125 | 0.2409 | 0.5534 | 0.6717 | 0.617 | 0.0628 | -0.0171 | -0.0193 | 0.2501 | 0.01 | 0 | 0.0361 | 0.2701 | 0.0451 | 0 | 0 | 0.0088 | 0 | 0.108 | 0.564 | 0.0654 | 0.2756 | 0.757 | 1.9916 | 0.3038 | 0.6181 | 1.961 | 18.9243 | 3.009 | 0.03 | 15.5497 | GemmaForCausalLM | bfloat16 | gemma | 8.538 | 1,174 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | google/gemma-2b | 0.2882 | 0 | 0.3342 | 0.0775 | 0.3135 | 0.3492 | 0.23 | 0.5792 | 0.3935 | 0.1294 | 0.7133 | 0.0506 | 0.0673 | 0.7181 | 6.7048 | 0.6014 | 0.9083 | 10.2007 | 0.7617 | 0.3342 | 0.5862 | 0.3333 | 0.5 | 0.2109 | 0.2467 | 0.2923 | 0.3279 | 0.6471 | 0.1593 | 0.7133 | 0.05 | 0.0564 | 0.2506 | 0.23 | 0 | 0.3273 | 0.3348 | 0.0741 | 0.0009 | 0.0781 | 0.0531 | 0.029 | 0.2265 | 0.6219 | 1.4302 | 0.4864 | 0.8032 | 4.9462 | 0.4672 | 0.6424 | 1.8859 | 19.7977 | 5.0582 | 0.0506 | 16.1766 | GemmaForCausalLM | bfloat16 | gemma | 2.506 | 998 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | google/gemma-2b | 0.0693 | 0 | 0 | 0.0349 | 0 | 0 | 0 | 0.4957 | 0 | 0.0359 | 0.1455 | 0.0506 | 0.0361 | 0.6923 | 0.235 | 0.5342 | 0.7523 | 0.3109 | 0.5812 | 0 | 0 | 0 | 0 | 0 | 0.0443 | 0 | 0 | 0 | 0 | 0.1455 | 0.0506 | 0.0499 | 0 | 0 | 0 | 0.3273 | 0 | 0.0272 | 0 | 0 | 0 | 0 | 0.1746 | 0.638 | 0.4192 | 0.4291 | 0.7161 | 0.3383 | 0.4384 | 0.6424 | 1.8859 | 19.7977 | 5.0582 | 0.0506 | 16.1766 | GemmaForCausalLM | bfloat16 | gemma | 2.506 | 998 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Llama-3.3-70B-Instruct | 0.3749 | 0.0281 | 0.1971 | 0.1527 | 0.2179 | 0.849 | 0.026 | 0.8351 | 0.6637 | 0.42 | 0.6432 | 0.0918 | 0.4565 | 0.8626 | 13.3468 | 0.9045 | 0.956 | 17.4829 | 0.886 | 0.1971 | 0.9003 | 0.5891 | 0.5181 | 0.9035 | 0.4258 | 0.1053 | 0.597 | 0.791 | 0.8232 | 0.6432 | 0.8824 | 0.853 | 0.7431 | 0.026 | 0.0281 | 0.0522 | 0.3304 | 0.3777 | 0.0032 | 0.0266 | 0.0442 | 0.0023 | 0.6871 | 0.8091 | 10.7789 | 0.8156 | 0.8966 | 11.1103 | 0.7344 | 0.6878 | 3.1368 | 22.6097 | 9.1828 | 0.0918 | 20.1245 | LlamaForCausalLM | bfloat16 | llama3.3 | 70.554 | 2,335 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Llama-3.3-70B-Instruct | 0.6145 | 0.0281 | 0.5723 | 0.2659 | 0.7714 | 0.8874 | 0.944 | 0.8519 | 0.8016 | 0.6412 | 0.9042 | 0.0918 | 0.7084 | 0.8671 | 13.5853 | 0.9095 | 0.9577 | 18.1284 | 0.888 | 0.5723 | 0.9121 | 0.6724 | 0.9125 | 0.9455 | 0.6528 | 0.7266 | 0.8114 | 0.7942 | 0.8175 | 0.9042 | 0.8833 | 0.8483 | 0.8046 | 0.944 | 0.0281 | 0.0522 | 0.8162 | 0.5625 | 0.0418 | 0.3363 | 0.0442 | 0.0568 | 0.8501 | 0.8502 | 15.8625 | 0.8517 | 0.9089 | 12.3046 | 0.7584 | 0.6878 | 3.1368 | 22.6097 | 9.1828 | 0.0918 | 20.1245 | LlamaForCausalLM | bfloat16 | llama3.3 | 70.554 | 2,335 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Meta-Llama-3-70B-Instruct | 0.6257 | 0.2892 | 0.5891 | 0.2922 | 0.7256 | 0.8576 | 0.922 | 0.8509 | 0.7307 | 0.6125 | 0.9167 | 0.0957 | 0.673 | 0.8644 | 12.538 | 0.9044 | 0.9544 | 16.8976 | 0.8836 | 0.5891 | 0.881 | 0.6839 | 0.7194 | 0.9419 | 0.6102 | 0.6721 | 0.6873 | 0.779 | 0.7836 | 0.9167 | 0.8834 | 0.8525 | 0.7499 | 0.922 | 0.2892 | 0.4177 | 0.7792 | 0.5544 | 0.0416 | 0.3764 | 0.1239 | 0.0798 | 0.8393 | 0.8346 | 11.3104 | 0.8471 | 0.9097 | 11.0488 | 0.7685 | 0.6943 | 2.0042 | 24.8641 | 9.5922 | 0.0957 | 21.5437 | LlamaForCausalLM | bfloat16 | llama3 | 70.554 | 1,472 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Meta-Llama-3-70B-Instruct | 0.5107 | 0.2892 | 0.242 | 0.1712 | 0.5743 | 0.7024 | 0.756 | 0.8417 | 0.6258 | 0.4434 | 0.8756 | 0.0957 | 0.484 | 0.8584 | 11.9451 | 0.9046 | 0.9475 | 15.5203 | 0.8733 | 0.242 | 0.7365 | 0.5345 | 0.6097 | 0.7909 | 0.381 | 0.5304 | 0.4737 | 0.7544 | 0.7566 | 0.8756 | 0.8678 | 0.8315 | 0.5799 | 0.756 | 0.2892 | 0.4177 | 0.6183 | 0.4652 | 0 | 0.0073 | 0.0531 | 0.0082 | 0.7874 | 0.8105 | 8.8033 | 0.8318 | 0.9034 | 10.031 | 0.7574 | 0.6943 | 2.0042 | 24.8641 | 9.5922 | 0.0957 | 21.5437 | LlamaForCausalLM | bfloat16 | llama3 | 70.554 | 1,472 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | meta-llama/Meta-Llama-3-70B | 0.6273 | 0.2992 | 0.5339 | 0.2942 | 0.718 | 0.8799 | 0.948 | 0.8531 | 0.7315 | 0.6465 | 0.9212 | 0.0745 | 0.7347 | 0.8708 | 13.9748 | 0.9088 | 0.9581 | 18.9863 | 0.8888 | 0.5339 | 0.9013 | 0.5891 | 0.7806 | 0.9473 | 0.6056 | 0.6919 | 0.7079 | 0.7942 | 0.7857 | 0.9212 | 0.8903 | 0.855 | 0.7911 | 0.948 | 0.2992 | 0.4659 | 0.7441 | 0.5991 | 0.0306 | 0.3763 | 0.1416 | 0.0545 | 0.8682 | 0.8584 | 16.1156 | 0.8554 | 0.9138 | 12.5835 | 0.7594 | 0.6697 | 2.0276 | 18.0392 | 7.4331 | 0.0745 | 15.9677 | LlamaForCausalLM | bfloat16 | llama3 | 70.554 | 859 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | meta-llama/Meta-Llama-3-70B | 0.5012 | 0.2992 | 0.3387 | 0.162 | 0.4276 | 0.6513 | 0.742 | 0.8352 | 0.603 | 0.4888 | 0.8908 | 0.0745 | 0.5411 | 0.8503 | 11.5686 | 0.898 | 0.9544 | 17.117 | 0.884 | 0.3387 | 0.7928 | 0.4626 | 0.6528 | 0.6765 | 0.4114 | 0.3849 | 0.3936 | 0.7475 | 0.7587 | 0.8908 | 0.7909 | 0.8021 | 0.4846 | 0.742 | 0.2992 | 0.4659 | 0.4702 | 0.5138 | 0 | 0.0193 | 0.0177 | 0.0064 | 0.7668 | 0.81 | 9.5953 | 0.8131 | 0.9041 | 10.688 | 0.7458 | 0.6697 | 2.0276 | 18.0392 | 7.4331 | 0.0745 | 15.9677 | LlamaForCausalLM | bfloat16 | llama3 | 70.554 | 859 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | meta-llama/Meta-Llama-3-8B | 0.4886 | 0.0181 | 0.4059 | 0.2451 | 0.5186 | 0.668 | 0.704 | 0.8244 | 0.6068 | 0.4325 | 0.8889 | 0.0625 | 0.4481 | 0.8511 | 11.7555 | 0.8829 | 0.9494 | 15.7335 | 0.8733 | 0.4059 | 0.6568 | 0.454 | 0.5986 | 0.8293 | 0.4521 | 0.4473 | 0.6085 | 0.7317 | 0.6414 | 0.8889 | 0.7472 | 0.7306 | 0.5178 | 0.704 | 0.0181 | 0.0502 | 0.5899 | 0.3972 | 0.0084 | 0.3696 | 0.0531 | 0.0536 | 0.7409 | 0.8146 | 11.0876 | 0.8053 | 0.9005 | 11.05 | 0.7361 | 0.6283 | 1.7042 | 15.2962 | 6.2471 | 0.0625 | 13.5321 | LlamaForCausalLM | bfloat16 | llama3 | 8.03 | 6,187 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | meta-llama/Meta-Llama-3-8B | 0.293 | 0.0181 | 0.0194 | 0.0514 | 0.2166 | 0.3855 | 0.274 | 0.7839 | 0.3773 | 0.2665 | 0.7677 | 0.0625 | 0.265 | 0.8269 | 9.5102 | 0.861 | 0.9423 | 13.0823 | 0.8607 | 0.0194 | 0.5529 | 0.3276 | 0.5 | 0.3342 | 0.2906 | 0.1401 | 0.1619 | 0.6566 | 0.2403 | 0.7677 | 0.346 | 0.3449 | 0.2695 | 0.274 | 0.0181 | 0.0502 | 0.2932 | 0.2438 | 0 | 0.0006 | 0 | 0 | 0.2562 | 0.7569 | 7.1002 | 0.7336 | 0.8746 | 8.1538 | 0.6805 | 0.6283 | 1.7042 | 15.2962 | 6.2471 | 0.0625 | 13.5321 | LlamaForCausalLM | bfloat16 | llama3 | 8.03 | 6,187 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | llm-jp/llm-jp-3.1-13b-instruct4 | 0.6489 | 0.0281 | 0.7513 | 0.6006 | 0.5413 | 0.9395 | 0.72 | 0.8517 | 0.9625 | 0.6931 | 0.9427 | 0.1075 | 0.8045 | 0.8717 | 14.9427 | 0.9028 | 0.947 | 17.8751 | 0.864 | 0.7513 | 0.9376 | 0.9971 | 1 | 0.9598 | 0.6372 | 0.5267 | 0.9433 | 0.9609 | 0.9111 | 0.9427 | 0.9352 | 0.9071 | 0.9212 | 0.72 | 0.0281 | 0.0643 | 0.556 | 0.6375 | 0.3544 | 0.879 | 0.3274 | 0.4999 | 0.9421 | 0.8683 | 21.9099 | 0.8555 | 0.9249 | 15.3226 | 0.7845 | 0.7001 | 2.6706 | 31.4294 | 10.749 | 0.1075 | 26.3431 | LlamaForCausalLM | bfloat16 | apache-2.0 | 13.708 | 3 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | llm-jp/llm-jp-3.1-13b-instruct4 | 0.6525 | 0.0281 | 0.7652 | 0.6207 | 0.53 | 0.945 | 0.728 | 0.8501 | 0.965 | 0.6929 | 0.9448 | 0.1075 | 0.8068 | 0.8624 | 15.6982 | 0.8911 | 0.9443 | 18.4501 | 0.8552 | 0.7652 | 0.9384 | 0.9971 | 1 | 0.9705 | 0.6352 | 0.5078 | 0.94 | 0.9621 | 0.9255 | 0.9448 | 0.9375 | 0.9092 | 0.9262 | 0.728 | 0.0281 | 0.0643 | 0.5523 | 0.6368 | 0.4014 | 0.882 | 0.3097 | 0.5677 | 0.9425 | 0.8767 | 24.0066 | 0.864 | 0.9276 | 15.8636 | 0.7902 | 0.7001 | 2.6706 | 31.4294 | 10.749 | 0.1075 | 26.3431 | LlamaForCausalLM | bfloat16 | apache-2.0 | 13.708 | 3 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | llm-jp/llm-jp-3.1-1.8b-instruct4 | 0.5788 | 0 | 0.6167 | 0.5125 | 0.4345 | 0.8995 | 0.56 | 0.8458 | 0.9207 | 0.5653 | 0.924 | 0.0876 | 0.6161 | 0.8691 | 14.7817 | 0.9002 | 0.9481 | 15.9402 | 0.8705 | 0.6167 | 0.8833 | 0.9368 | 1 | 0.9294 | 0.5416 | 0.4293 | 0.908 | 0.8737 | 0.8851 | 0.924 | 0.9244 | 0.8961 | 0.8858 | 0.56 | 0 | 0 | 0.4398 | 0.5382 | 0.2833 | 0.828 | 0.1416 | 0.4029 | 0.9066 | 0.8568 | 19.6487 | 0.8452 | 0.9167 | 13.916 | 0.7672 | 0.6856 | 2.2083 | 25.6171 | 8.7508 | 0.0876 | 21.5264 | LlamaForCausalLM | bfloat16 | apache-2.0 | 1.868 | 1 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | llm-jp/llm-jp-3.1-1.8b-instruct4 | 0.5788 | 0 | 0.6494 | 0.5384 | 0.4216 | 0.9041 | 0.506 | 0.8498 | 0.9244 | 0.5596 | 0.9257 | 0.0876 | 0.6106 | 0.8701 | 14.1502 | 0.9003 | 0.9503 | 17.1237 | 0.8734 | 0.6494 | 0.89 | 0.9339 | 1 | 0.9339 | 0.5337 | 0.414 | 0.9092 | 0.8782 | 0.9005 | 0.9257 | 0.9267 | 0.8975 | 0.8885 | 0.506 | 0 | 0 | 0.4293 | 0.5346 | 0.2777 | 0.8565 | 0.2035 | 0.4376 | 0.9166 | 0.8618 | 20.4743 | 0.8499 | 0.9215 | 14.6513 | 0.7756 | 0.6856 | 2.2083 | 25.6171 | 8.7508 | 0.0876 | 21.5264 | LlamaForCausalLM | bfloat16 | apache-2.0 | 1.868 | 1 | main | 0 | True | v1.4.1 | v0.6.3.post1 |
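Across these rows, each category score appears to be the plain mean of its member task metrics: for example, MC matches the mean of the commonsensemoralja, jcommonsenseqa, and kuci exact-match scores, HE the mean of jmmlu and mmlu_en, MT the mean of the four COMET columns, and AVG the mean of the eleven category scores. A sketch that re-derives the category scores from the task columns and checks them against the stored values; the mapping below is inferred from the rows in this table, not taken from the llm-jp-eval source, and `df` is assumed to be loaded as in the earlier sketch:

```python
# Category -> member task metrics, inferred from the rows above.
CATEGORY_TASKS = {
    "CG":  ["mbpp_code_exec"],
    "EL":  ["chabsa_set_f1"],
    "FA":  ["wiki_coreference_set_f1", "wiki_dependency_set_f1", "wiki_ner_set_f1",
            "wiki_pas_set_f1", "wiki_reading_char_f1"],
    "HE":  ["jmmlu_exact_match", "mmlu_en_exact_match"],
    "MC":  ["commonsensemoralja_exact_match", "jcommonsenseqa_exact_match", "kuci_exact_match"],
    "MR":  ["mawps_exact_match"],
    "MT":  ["alt-e-to-j_comet_wmt22", "alt-j-to-e_comet_wmt22",
            "wikicorpus-e-to-j_comet_wmt22", "wikicorpus-j-to-e_comet_wmt22"],
    "NLI": ["jamp_exact_match", "janli_exact_match", "jnli_exact_match",
            "jsem_exact_match", "jsick_exact_match"],
    "QA":  ["aio_char_f1", "jemhopqa_char_f1", "niilc_char_f1"],
    "RC":  ["jsquad_char_f1"],
    "SUM": ["xlsum_ja_rouge2_scaling"],
}

def check_row(row, tol=1e-3):
    """Recompute every category as the mean of its member metrics and
    compare against the stored column, allowing for 4-decimal rounding."""
    for cat, tasks in CATEGORY_TASKS.items():
        derived = sum(row[t] for t in tasks) / len(tasks)
        assert abs(derived - row[cat]) < tol, (cat, derived, row[cat])
    # AVG looks like the mean of the eleven category scores above.
    avg = sum(row[c] for c in CATEGORY_TASKS) / len(CATEGORY_TASKS)
    assert abs(avg - row["AVG"]) < tol, ("AVG", avg, row["AVG"])

for _, row in df.iterrows():
    check_row(row)
```

Note that jsts_pearson/jsts_spearman, the BLEU and BERTScore columns, and mbpp_pylint_check do not enter any category under this mapping; they are reported per task only.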