---
license: mit
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
tags:
- llama
- mmlu
- fine-tuned
- windyfllm
datasets:
- cais/mmlu
---

# WindyFLLM 2.2 🌪️

A meta-llama/Meta-Llama-3.1-8B-Instruct model fine-tuned on the MMLU dataset (cais/mmlu).

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model, attach the WindyFLLM PEFT adapter, and load the matching tokenizer.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "tklohj/windyfllm2.2")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
```
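A minimal inference sketch that continues from the loading code above, assuming the standard Llama 3.1 chat template; the example question, `max_new_tokens`, and greedy decoding are illustrative choices, not settings documented for this model.

```python
# Format a single-turn chat prompt and generate a completion with the adapter-equipped model.
messages = [
    {"role": "user", "content": "Which branch of science studies the structure of the Earth's interior?"}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```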