---
license: mit
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
tags:
- llama
- mmlu
- fine-tuned
- windyfllm
datasets:
- cais/mmlu
---

# WindyFLLM 2.2 🌪️

This is the meta-llama/Meta-Llama-3.1-8B-Instruct model fine-tuned on the MMLU dataset.

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model, then attach the WindyFLLM 2.2 PEFT adapter on top of it.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "tklohj/windyfllm2.2")

# The tokenizer is unchanged from the base model.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
```
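
Once the adapter is loaded, generation works the same as with any Llama 3.1 Instruct model. The snippet below is a minimal sketch using the chat template; the MMLU-style question and the generation settings are only illustrative.

```python
import torch

messages = [
    {"role": "user", "content": "Which planet has the most known moons? A) Mars B) Jupiter C) Saturn D) Neptune"},
]

# Build the prompt with the Llama 3.1 chat template and move it to the model's device.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    outputs = model.generate(inputs, max_new_tokens=64, do_sample=False)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Loading an 8B model in full precision requires a lot of memory; passing `torch_dtype=torch.bfloat16` and `device_map="auto"` to `from_pretrained` is a common way to fit it on a single GPU.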