teruo6939 committed (verified)
Commit a412524 · 1 parent: 518b679

Update README_ja.md

Files changed (1):
  1. README_ja.md +10 -10
README_ja.md CHANGED
@@ -19,16 +19,16 @@ Accuracy is whether the string output by the LLM exactly matches the correct choice string
 
  | Model | Micro avg. | Culture | Customs | Climate | Geography | Japanese history | Administration | Law | Medicine |
  |:---|---:|---:|---:|---:|---:|---:|---:|---:|---:|
- | [sarashina2-8x70b](https://huggingface.co/sbintuitions/sarashina2-8x70b) | **0.7254** | 0.7141 | **0.7750** | **0.7607** | 0.6544 | **0.7843** | 0.7364 | 0.6321 | **0.9167** |
- | [sarashina2-70b](https://huggingface.co/sbintuitions/sarashina2-70b) | 0.7246 | **0.7188** | 0.7450 | 0.7355 | **0.6728** | 0.7638 | 0.7636 | 0.6656 | **0.9167** |
- | [Llama-3.3-Swallow-70B-v0.4](https://huggingface.co/tokyotech-llm/Llama-3.3-Swallow-70B-v0.4) | 0.6973 | 0.6891 | **0.7750** | 0.5894 | 0.5662 | 0.7755 | **0.7727** | **0.7826** | 0.8542 |
- | [RakutenAI-2.0-8x7B](https://huggingface.co/Rakuten/RakutenAI-2.0-8x7B) | 0.6327 | 0.6219 | 0.7250 | 0.6171 | 0.5110 | 0.7143 | 0.7091 | 0.5753 | 0.8125 |
- | [plamo-100b](https://huggingface.co/pfnet/plamo-100b) | 0.6033 | 0.6016 | 0.6500 | 0.6373 | 0.5037 | 0.6822 | 0.6091 | 0.5151 | 0.6875 |
- | [Mixtral-8x7B-v0.1-japanese](https://huggingface.co/abeja/Mixtral-8x7B-v0.1-japanese) | 0.5929 | 0.6016 | 0.6700 | 0.5793 | 0.4926 | 0.6122 | 0.7364 | 0.5452 | 0.6667 |
- | [Meta-Llama-3.1-405B](https://huggingface.co/meta-llama/Llama-3.1-405B) | 0.5712 | 0.5578 | 0.5450 | 0.4836 | 0.5000 | 0.6793 | 0.6455 | 0.6288 | 0.6875 |
- | [llm-jp-3.1-8x13b](https://huggingface.co/llm-jp/llm-jp-3-8x13b) | 0.5682 | 0.5953 | 0.6350 | 0.5819 | 0.4485 | 0.5889 | 0.6273 | 0.5017 | 0.6250 |
- | [Nemotron-4-340B-Base](https://huggingface.co/mgoin/Nemotron-4-340B-Base-hf) | 0.5673 | 0.5734 | 0.6150 | 0.5113 | 0.4669 | 0.5948 | 0.7273 | 0.5819 | 0.6667 |
- | [Qwen2.5-72B](https://huggingface.co/Qwen/Qwen2.5-72B) | 0.5271 | 0.5219 | 0.5950 | 0.4257 | 0.4375 | 0.6064 | 0.6091 | 0.5619 | 0.6875 |
+ | [sarashina2-8x70b](https://huggingface.co/sbintuitions/sarashina2-8x70b) | **0.725** | 0.714 | **0.775** | **0.761** | 0.654 | **0.784** | 0.736 | 0.632 | **0.917** |
+ | [sarashina2-70b](https://huggingface.co/sbintuitions/sarashina2-70b) | **0.725** | **0.719** | 0.745 | 0.736 | **0.673** | 0.764 | 0.764 | 0.666 | **0.917** |
+ | [Llama-3.3-Swallow-70B-v0.4](https://huggingface.co/tokyotech-llm/Llama-3.3-Swallow-70B-v0.4) | 0.697 | 0.689 | **0.775** | 0.589 | 0.566 | 0.776 | **0.773** | **0.783** | 0.854 |
+ | [RakutenAI-2.0-8x7B](https://huggingface.co/Rakuten/RakutenAI-2.0-8x7B) | 0.633 | 0.622 | 0.725 | 0.617 | 0.511 | 0.714 | 0.709 | 0.575 | 0.813 |
+ | [plamo-100b](https://huggingface.co/pfnet/plamo-100b) | 0.603 | 0.602 | 0.650 | 0.637 | 0.504 | 0.682 | 0.609 | 0.515 | 0.688 |
+ | [Mixtral-8x7B-v0.1-japanese](https://huggingface.co/abeja/Mixtral-8x7B-v0.1-japanese) | 0.593 | 0.602 | 0.670 | 0.579 | 0.493 | 0.612 | 0.736 | 0.545 | 0.667 |
+ | [Meta-Llama-3.1-405B](https://huggingface.co/meta-llama/Llama-3.1-405B) | 0.571 | 0.558 | 0.545 | 0.484 | 0.500 | 0.679 | 0.646 | 0.629 | 0.688 |
+ | [llm-jp-3.1-8x13b](https://huggingface.co/llm-jp/llm-jp-3-8x13b) | 0.568 | 0.595 | 0.635 | 0.582 | 0.449 | 0.589 | 0.627 | 0.502 | 0.625 |
+ | [Nemotron-4-340B-Base](https://huggingface.co/mgoin/Nemotron-4-340B-Base-hf) | 0.567 | 0.573 | 0.615 | 0.511 | 0.467 | 0.595 | 0.727 | 0.582 | 0.667 |
+ | [Qwen2.5-72B](https://huggingface.co/Qwen/Qwen2.5-72B) | 0.527 | 0.522 | 0.595 | 0.426 | 0.438 | 0.606 | 0.609 | 0.562 | 0.688 |
 
 ## Language
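For reference, the metric named in the hunk context (Accuracy as an exact string match against the correct choice) together with the three-decimal rounding this commit applies can be sketched in Python. The function and variable names below are illustrative only and do not come from the repository:

```python
def exact_match_accuracy(outputs, answers):
    """Fraction of LLM output strings that exactly match the gold choice string."""
    assert len(outputs) == len(answers) and answers, "need paired, non-empty lists"
    hits = sum(out == ans for out, ans in zip(outputs, answers))
    return hits / len(answers)

# Hypothetical example: 2 of 3 outputs match the gold choices exactly.
score = exact_match_accuracy(["A", "B", "C"], ["A", "B", "D"])

# This commit reports scores rounded to 3 decimal places instead of 4.
rounded = round(score, 3)  # 0.6666... -> 0.667
```

Note that rounding to three decimals introduces ties in the reported micro average (both sarashina2 models round to 0.725), which is why the new table bolds both.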