---
license: mit
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
tags:
- llama
- mmlu
- fine-tuned
- windyfllm
datasets:
- cais/mmlu
---

# WindyFLLM 2.2 🌪️

This is meta-llama/Meta-Llama-3.1-8B-Instruct fine-tuned on the MMLU dataset, distributed as a PEFT adapter on top of the base model.

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model, then attach the WindyFLLM 2.2 PEFT adapter on top of it
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "tklohj/windyfllm2.2")

# The tokenizer is unchanged from the base model
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
```
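
A minimal generation sketch that builds on the loading snippet above, using the standard `transformers` chat-template and `generate` APIs; the example prompt and `max_new_tokens` value are illustrative placeholders, not settings from the original card:

```python
import torch

# Format a single-turn conversation with the Llama 3.1 chat template (placeholder prompt)
messages = [{"role": "user", "content": "Which organelle is responsible for ATP production?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate a short completion and decode only the newly generated tokens
with torch.no_grad():
    output_ids = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```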