---
license: mit
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
tags:
- llama
- mmlu
- fine-tuned
- windyfllm
datasets:
- cais/mmlu
---
# WindyFLLM 2.2 🌪️
This is the meta-llama/Meta-Llama-3.1-8B-Instruct model fine-tuned on the MMLU dataset.
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model first, then attach the fine-tuned adapter weights on top.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "tklohj/windyfllm2.2")

# The tokenizer is unchanged from the base model.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
```
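Since the adapter was trained on MMLU, prompts in an MMLU-style multiple-choice format are a natural fit. Below is a minimal sketch of a hypothetical helper that formats a question and four answer choices into a prompt string; the exact template is an assumption, not the format used during fine-tuning, so adapt it as needed.

```python
# Hypothetical helper (not part of this repo): formats an MMLU-style
# multiple-choice question as a plain prompt string. The template shape
# is an assumption; match it to whatever format the adapter was trained on.
def build_mmlu_prompt(question: str, choices: list[str]) -> str:
    letters = "ABCD"
    lines = [question]
    # Label each choice A., B., C., D. on its own line.
    lines += [f"{letters[i]}. {choice}" for i, choice in enumerate(choices)]
    lines.append("Answer:")
    return "\n".join(lines)

prompt = build_mmlu_prompt(
    "What is the capital of France?",
    ["Berlin", "Madrid", "Paris", "Rome"],
)
print(prompt)
```

The resulting string can then be tokenized and passed to `model.generate` (e.g. `model.generate(**tokenizer(prompt, return_tensors="pt"))`) to have the model complete the `Answer:` line.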