teruo6939 committed
Commit 504070b · verified · Parent: 76513cc

Update README.md

Files changed (1): README.md (+10, −10)
README.md CHANGED
@@ -61,16 +61,16 @@ and accuracy is calculated as the proportion of questions whose output exactly m
 
 | Model | Micro-average | culture | custom | regional identity | geography | history | government | law | healthcare |
 |:---|---:|---:|---:|---:|---:|---:|---:|---:|---:|
-| [sarashina2-8x70b](https://huggingface.co/sbintuitions/sarashina2-8x70b) | 0.7364 | 0.7220 | 0.8088 | 0.7855 | 0.6522 | 0.7839 | 0.7719 | 0.6436 | 0.8462 |
-| [sarashina2-70b](https://huggingface.co/sbintuitions/sarashina2-70b) | 0.7245 | 0.6988 | 0.7892 | 0.7556 | 0.6558 | 0.7781 | 0.7544 | 0.6733 | 0.7885 |
-| [Llama-3.3-Swallow-70B-v0.4](https://huggingface.co/tokyotech-llm/Llama-3.3-Swallow-70B-v0.4) | 0.6950 | 0.6894 | 0.7353 | 0.6185 | 0.5688 | 0.7781 | 0.7719 | 0.7459 | 0.8462 |
-| [RakutenAI-2.0-8x7B](https://huggingface.co/Rakuten/RakutenAI-2.0-8x7B) | 0.6160 | 0.6056 | 0.6814 | 0.6160 | 0.4855 | 0.6888 | 0.6754 | 0.5941 | 0.6923 |
-| [Mixtral-8x7B-v0.1-japanese](https://huggingface.co/abeja/Mixtral-8x7B-v0.1-japanese) | 0.5950 | 0.5885 | 0.7500 | 0.5985 | 0.4601 | 0.6052 | 0.6404 | 0.5710 | 0.7308 |
-| [plamo-100b](https://huggingface.co/pfnet/plamo-100b) | 0.5908 | 0.6102 | 0.6422 | 0.6384 | 0.4565 | 0.6398 | 0.5526 | 0.5182 | 0.6731 |
-| [llm-jp-3.1-8x13b](https://huggingface.co/llm-jp/llm-jp-3-8x13b) | 0.5737 | 0.5839 | 0.6275 | 0.6060 | 0.4674 | 0.6110 | 0.6404 | 0.4884 | 0.6538 |
-| [Meta-Llama-3.1-405B](https://huggingface.co/meta-llama/Llama-3.1-405B) | 0.5724 | 0.5699 | 0.5245 | 0.4688 | 0.5435 | 0.6571 | 0.6579 | 0.6403 | 0.5962 |
-| [Nemotron-4-340B-Base](https://huggingface.co/mgoin/Nemotron-4-340B-Base-hf) | 0.5600 | 0.5761 | 0.6176 | 0.5062 | 0.4601 | 0.5821 | 0.6491 | 0.5776 | 0.6346 |
-| [Qwen2.5-72B](https://huggingface.co/Qwen/Qwen2.5-72B) | 0.5421 | 0.5419 | 0.6324 | 0.4763 | 0.4746 | 0.5677 | 0.6053 | 0.5644 | 0.6154 |
+| [sarashina2-8x70b](https://huggingface.co/sbintuitions/sarashina2-8x70b) | **0.7254** | 0.7141 | **0.7750** | **0.7607** | 0.6544 | 0.7843 | 0.7364 | 0.6321 | 0.9167 |
+| [sarashina2-70b](https://huggingface.co/sbintuitions/sarashina2-70b) | 0.7246 | **0.7188** | 0.7450 | 0.7355 | **0.6728** | **0.7638** | 0.7636 | 0.6656 | 0.9167 |
+| [Llama-3.3-Swallow-70B-v0.4](https://huggingface.co/tokyotech-llm/Llama-3.3-Swallow-70B-v0.4) | 0.6973 | 0.6891 | 0.7750 | 0.5894 | 0.5662 | 0.7755 | 0.7727 | 0.7826 | 0.8542 |
+| [RakutenAI-2.0-8x7B](https://huggingface.co/Rakuten/RakutenAI-2.0-8x7B) | 0.6327 | 0.6219 | 0.7250 | 0.6171 | 0.5110 | 0.7143 | 0.7091 | 0.5753 | 0.8125 |
+| [plamo-100b](https://huggingface.co/pfnet/plamo-100b) | 0.6033 | 0.6016 | 0.6500 | 0.6373 | 0.5037 | 0.6822 | 0.6091 | 0.5151 | 0.6875 |
+| [Mixtral-8x7B-v0.1-japanese](https://huggingface.co/abeja/Mixtral-8x7B-v0.1-japanese) | 0.5929 | 0.6016 | 0.6700 | 0.5793 | 0.4926 | 0.6122 | 0.7364 | 0.5452 | 0.6667 |
+| [Meta-Llama-3.1-405B](https://huggingface.co/meta-llama/Llama-3.1-405B) | 0.5712 | 0.5578 | 0.5450 | 0.4836 | 0.5000 | 0.6793 | 0.6455 | 0.6288 | 0.6875 |
+| [llm-jp-3.1-8x13b](https://huggingface.co/llm-jp/llm-jp-3-8x13b) | 0.5682 | 0.5953 | 0.6350 | 0.5819 | 0.4485 | 0.5889 | 0.6273 | 0.5017 | 0.6250 |
+| [Nemotron-4-340B-Base](https://huggingface.co/mgoin/Nemotron-4-340B-Base-hf) | 0.5673 | 0.5734 | 0.6150 | 0.5113 | 0.4669 | 0.5948 | 0.7273 | 0.5819 | 0.6667 |
+| [Qwen2.5-72B](https://huggingface.co/Qwen/Qwen2.5-72B) | 0.5271 | 0.5219 | 0.5950 | 0.4257 | 0.4375 | 0.6064 | 0.6091 | 0.5619 | 0.6875 |
 
 ## Languages
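The hunk header notes that accuracy is the proportion of questions whose output exactly matches the reference. A minimal sketch of that metric, plus one plausible reading of the table's "Micro-average" column (pool all questions across categories, then score once): function names and data shapes here are illustrative assumptions, not code from this repository.

```python
def exact_match_accuracy(predictions, references):
    """Fraction of predictions that exactly match their reference answer."""
    assert len(predictions) == len(references)
    hits = sum(pred == ref for pred, ref in zip(predictions, references))
    return hits / len(predictions)


def micro_average(per_category):
    """Pooled exact-match accuracy over all categories.

    per_category maps a category name (e.g. "culture", "geography")
    to a (predictions, references) pair. Pooling before scoring means
    larger categories carry more weight -- an assumption about how
    the "Micro-average" column is computed.
    """
    all_preds, all_refs = [], []
    for preds, refs in per_category.values():
        all_preds.extend(preds)
        all_refs.extend(refs)
    return exact_match_accuracy(all_preds, all_refs)
```

Under this reading, a micro-average differs from a macro-average (mean of the per-category columns) whenever category sizes are unequal, which is consistent with the micro-average not being the arithmetic mean of the eight category scores in each row.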