Update README_ja.md
Accuracy is computed by exact match: a prediction counts as correct only when the string output by the LLM is identical to the string of the correct answer choice.
| Model | Micro avg. | Culture | Customs | Climate | Geography | Japanese History | Public Administration | Law | Medicine |
|:---|---:|---:|---:|---:|---:|---:|---:|---:|---:|
| [sarashina2-8x70b](https://huggingface.co/sbintuitions/sarashina2-8x70b) | **0.725** | 0.714 | **0.775** | **0.761** | 0.654 | **0.784** | 0.736 | 0.632 | **0.917** |
| [sarashina2-70b](https://huggingface.co/sbintuitions/sarashina2-70b) | **0.725** | **0.719** | 0.745 | 0.736 | **0.673** | 0.764 | 0.764 | 0.666 | **0.917** |
| [Llama-3.3-Swallow-70B-v0.4](https://huggingface.co/tokyotech-llm/Llama-3.3-Swallow-70B-v0.4) | 0.697 | 0.689 | **0.775** | 0.589 | 0.566 | 0.776 | **0.773** | **0.783** | 0.854 |
| [RakutenAI-2.0-8x7B](https://huggingface.co/Rakuten/RakutenAI-2.0-8x7B) | 0.633 | 0.622 | 0.725 | 0.617 | 0.511 | 0.714 | 0.709 | 0.575 | 0.813 |
| [plamo-100b](https://huggingface.co/pfnet/plamo-100b) | 0.603 | 0.602 | 0.650 | 0.637 | 0.504 | 0.682 | 0.609 | 0.515 | 0.688 |
| [Mixtral-8x7B-v0.1-japanese](https://huggingface.co/abeja/Mixtral-8x7B-v0.1-japanese) | 0.593 | 0.602 | 0.670 | 0.579 | 0.493 | 0.612 | 0.736 | 0.545 | 0.667 |
| [Meta-Llama-3.1-405B](https://huggingface.co/meta-llama/Llama-3.1-405B) | 0.571 | 0.558 | 0.545 | 0.484 | 0.500 | 0.679 | 0.646 | 0.629 | 0.688 |
| [llm-jp-3.1-8x13b](https://huggingface.co/llm-jp/llm-jp-3-8x13b) | 0.568 | 0.595 | 0.635 | 0.582 | 0.449 | 0.589 | 0.627 | 0.502 | 0.625 |
| [Nemotron-4-340B-Base](https://huggingface.co/mgoin/Nemotron-4-340B-Base-hf) | 0.567 | 0.573 | 0.615 | 0.511 | 0.467 | 0.595 | 0.727 | 0.582 | 0.667 |
| [Qwen2.5-72B](https://huggingface.co/Qwen/Qwen2.5-72B) | 0.527 | 0.522 | 0.595 | 0.426 | 0.438 | 0.606 | 0.609 | 0.562 | 0.688 |
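The exact-match Accuracy and micro average described above can be sketched as follows. This is a minimal illustration, not the benchmark's actual evaluation code; all function names and the toy data are hypothetical.

```python
# Sketch of exact-match Accuracy: a prediction is correct only when the
# LLM's output string is identical to the correct answer choice.
# Names and data are illustrative, not from the benchmark's codebase.

def exact_match_accuracy(predictions, references):
    """Fraction of predictions that exactly match the reference string."""
    assert len(predictions) == len(references) and predictions
    correct = sum(p == r for p, r in zip(predictions, references))
    return correct / len(predictions)

def micro_average(results_by_category):
    """Micro average: pool all questions across categories, then score once,
    so larger categories contribute proportionally more."""
    preds, refs = [], []
    for pred_list, ref_list in results_by_category.values():
        preds.extend(pred_list)
        refs.extend(ref_list)
    return exact_match_accuracy(preds, refs)

# Toy usage with two categories (2 of 3 pooled answers match):
results = {
    "culture": (["A", "B"], ["A", "C"]),  # 1/2 correct
    "law": (["D"], ["D"]),                # 1/1 correct
}
print(round(micro_average(results), 3))  # 2/3 -> 0.667
```

Because the micro average pools every question before scoring, it differs from a macro average (the mean of per-category accuracies) whenever category sizes are unequal.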
## Language