Update README.md
README.md
CHANGED
@@ -47,18 +47,18 @@ To achieve high accuracy on this test, the model must possess extensive knowledg

## Supported Tasks and Leaderboards

-| Model |
-|:---|
-| [sarashina2-8x70b](https://huggingface.co/sbintuitions/sarashina2-8x70b) |
-| [sarashina2-70b](https://huggingface.co/sbintuitions/sarashina2-70b) |
-| [Llama-3.3-Swallow-70B-v0.4](https://huggingface.co/tokyotech-llm/Llama-3.3-Swallow-70B-v0.4) |
-| [RakutenAI-2.0-8x7B](https://huggingface.co/Rakuten/RakutenAI-2.0-8x7B) |
-| [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) |
-| [plamo-100b](https://huggingface.co/pfnet/plamo-100b) |
-| [llm-jp-3.1-8x13b](https://huggingface.co/llm-jp/llm-jp-3-8x13b) |
-| [Meta-Llama-3.1-405B](https://huggingface.co/meta-llama/Llama-3.1-405B) |
-| [Nemotron-4-340B-Base](https://huggingface.co/mgoin/Nemotron-4-340B-Base-hf) |
-| [Qwen2.5-72B](https://huggingface.co/Qwen/Qwen2.5-72B) |
+| Model | Micro-average | culture | custom | climate | geography | history | government | law | healthcare |
+|:---|---:|---:|---:|---:|---:|---:|---:|---:|---:|
+| [sarashina2-8x70b](https://huggingface.co/sbintuitions/sarashina2-8x70b) | 0.7364 | 0.722 | 0.8088 | 0.7855 | 0.6522 | 0.7839 | 0.7719 | 0.6436 | 0.8462 |
+| [sarashina2-70b](https://huggingface.co/sbintuitions/sarashina2-70b) | 0.7245 | 0.6988 | 0.7892 | 0.7556 | 0.6558 | 0.7781 | 0.7544 | 0.6733 | 0.7885 |
+| [Llama-3.3-Swallow-70B-v0.4](https://huggingface.co/tokyotech-llm/Llama-3.3-Swallow-70B-v0.4) | 0.695 | 0.6894 | 0.7353 | 0.6185 | 0.5688 | 0.7781 | 0.7719 | 0.7459 | 0.8462 |
+| [RakutenAI-2.0-8x7B](https://huggingface.co/Rakuten/RakutenAI-2.0-8x7B) | 0.616 | 0.6056 | 0.6814 | 0.6160 | 0.4855 | 0.6888 | 0.6754 | 0.5941 | 0.6923 |
+| [Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) | 0.2772 | 0.2671 | 0.2892 | 0.2618 | 0.2355 | 0.2767 | 0.3509 | 0.3102 | 0.3462 |
+| [plamo-100b](https://huggingface.co/pfnet/plamo-100b) | 0.5908 | 0.6102 | 0.6422 | 0.6384 | 0.4565 | 0.6398 | 0.5526 | 0.5182 | 0.6731 |
+| [llm-jp-3.1-8x13b](https://huggingface.co/llm-jp/llm-jp-3-8x13b) | 0.5737 | 0.5839 | 0.6275 | 0.606 | 0.4674 | 0.6110 | 0.6404 | 0.4884 | 0.6538 |
+| [Meta-Llama-3.1-405B](https://huggingface.co/meta-llama/Llama-3.1-405B) | 0.5724 | 0.5699 | 0.5245 | 0.4688 | 0.5435 | 0.6571 | 0.6579 | 0.6403 | 0.5962 |
+| [Nemotron-4-340B-Base](https://huggingface.co/mgoin/Nemotron-4-340B-Base-hf) | 0.5600 | 0.5761 | 0.6176 | 0.5062 | 0.4601 | 0.5821 | 0.6491 | 0.5776 | 0.6346 |
+| [Qwen2.5-72B](https://huggingface.co/Qwen/Qwen2.5-72B) | 0.5421 | 0.5419 | 0.6324 | 0.4763 | 0.4746 | 0.5677 | 0.6053 | 0.5644 | 0.6154 |

## Languages
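Note on the metric: "Micro-average" in the added table is most naturally read as accuracy pooled over every question in the benchmark (each question weighted equally), rather than the mean of the eight per-category scores. The sketch below illustrates that reading under this assumption; the per-category `(correct, total)` counts are hypothetical placeholders, since the section does not list question counts per category.

```python
# Minimal sketch of micro- vs. macro-averaged accuracy, assuming the
# "Micro-average" column pools all questions across categories.
# Category names follow the table above; the (correct, total) counts
# below are hypothetical placeholders, not real benchmark data.
counts = {
    "culture":    (722, 1000),
    "custom":     (404, 500),
    "climate":    (314, 400),
    "geography":  (180, 276),
    "history":    (543, 693),
    "government": (88, 114),
    "law":        (130, 202),
    "healthcare": (44, 52),
}

correct = sum(c for c, _ in counts.values())
total = sum(t for _, t in counts.values())

# Micro-average: every question weighted equally, regardless of category size.
micro_average = correct / total

# Macro-average: mean of per-category accuracies, shown only for contrast.
macro_average = sum(c / t for c, t in counts.values()) / len(counts)

print(f"micro-average: {micro_average:.4f}")
print(f"macro-average: {macro_average:.4f}")
```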