---
library_name: transformers
tags: []
---

- Evaluation results: [werty1248/qwen-s1.1-Ko-Native-result](https://huggingface.co/datasets/werty1248/qwen-s1.1-Ko-Native-result)
- Chinese sometimes appears inside the `think` section (in some cases the model reasons in Chinese and still reaches the correct answer)

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6629154d55d7c289634b8c5d/kn9BD74a7Q9L14hL5iFxx.png)

| Model | GSM8K | KSM | MATH | MMMLU | OMNI_MATH | Average |
|-------|-------|-------|-------|-------|-------|-------|
| GPT-4o | 91.21 | 22.83 | 74.45 | 68.72 | 30.75 | 57.99 |
| GPT-4o-mini | 87.57 | 19.40 | 70.68 | 63.40 | 26.45 | 53.50 |
| * **EXAONE-3.5-7.8B-Stratos-Ko** | 83.02 | 15.97 | 67.49 | 44.68 | 24.62 | 49.98 |
| * **Qwen2.5-7B-s1.1-Ko-Native** | 76.27 | 15.48 | 66.45 | 39.57 | 23.57 | 44.26 |
| -> EXAONE-3.5-7.8B-Instruct | 81.58 | 14.71 | 63.50 | 41.49? | 21.69 | 44.19 |
| Qwen2.5-14B-Instruct | 66.34 | 15.55 | 53.38 | 61.49 | 20.64 | 43.88 |
| Llama-3.1-8B-Instruct | 77.79 | 7.21 | 49.01 | 47.02 | 15.92 | 39.39 |
| -> Qwen2.5-7B-Instruct | 58.38 | 13.10 | 48.04 | 48.94 | 16.55 | 37.80 |
| EXAONE-3.0-7.8B-Instruct | 72.33 | 7.98 | 46.79 | 37.66 | 15.35 | 36.02 |
| * Ko-R1-1.5B-preview | 43.3 | ? | 73.1 | ? | 29.8 | ? |
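The Chinese-in-`think` observation above can be checked mechanically. Below is a minimal sketch (not the evaluation code used for this card), assuming the model wraps its reasoning in `<think>...</think>` tags; it flags any CJK ideograph inside those spans, and Korean hangul does not fall in that Unicode range, so Korean reasoning is not flagged:

```python
import re

# Assumption: the model's output wraps reasoning in <think>...</think> tags.
THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)
# CJK Unified Ideographs block (U+4E00-U+9FFF) covers common Chinese
# characters; Korean hangul lies outside this range and is not matched.
CJK_RE = re.compile(r"[\u4e00-\u9fff]")

def think_contains_chinese(output: str) -> bool:
    """Return True if any <think> span contains a CJK ideograph."""
    return any(CJK_RE.search(span) for span in THINK_RE.findall(output))

sample = "<think>首先, 2 + 2 = 4.</think> The answer is 4."
print(think_contains_chinese(sample))  # True

korean = "<think>먼저 2 + 2 = 4.</think> 답은 4입니다."
print(think_contains_chinese(korean))  # False
```

Running a check like this over the linked result dataset would give the fraction of samples where the model switched to Chinese mid-reasoning.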