Load the base AceGPT-7B model, attach the fine-tuned adapter, and load the matching tokenizer:

```python
from transformers import AutoTokenizer
from peft import PeftModel
from unsloth import FastLanguageModel

# unsloth's from_pretrained returns a (model, tokenizer) tuple
base_model, _ = FastLanguageModel.from_pretrained("FreedomIntelligence/AceGPT-7B")

# Attach the adapter weights to the base model
model = PeftModel.from_pretrained(base_model, "hamywaleed/tabib_beetlware_v1")

tokenizer = AutoTokenizer.from_pretrained("FreedomIntelligence/AceGPT-7B")
```
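
Once the adapter is attached, the combined model can be used like any causal LM. A minimal generation sketch, assuming the variables from the snippet above; the prompt text and sampling parameters here are illustrative, not taken from this model card:

```python
import torch

# Hypothetical prompt; replace with your own input
prompt = "ما هي أعراض مرض السكري؟"  # "What are the symptoms of diabetes?"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    # Illustrative sampling settings, not tuned values from this repo
    output_ids = model.generate(
        **inputs, max_new_tokens=128, do_sample=True, temperature=0.7
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```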

Safetensors · Model size: 3.64B params · Tensor types: F32, FP16, U8

Model tree for beetlware/beetelware-tabep

Adapter (1): this model