---
base_model:
- benhaotang/Phi-4-llama-t1-full
- prithivMLmods/Phi-4-QwQ
- win10/Phi-4-llama-t1-lora
library_name: transformers
tags:
- mergekit
- merge
datasets:
- NovaSky-AI/Sky-T1_data_17k
license: mit
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [prithivMLmods/Phi-4-QwQ](https://huggingface.co/prithivMLmods/Phi-4-QwQ) as the base.

### Models Merged

The following models were included in the merge:

* [benhaotang/Phi-4-llama-t1-full](https://huggingface.co/benhaotang/Phi-4-llama-t1-full), which is really [win10/Phi-4-llama-t1-lora](https://huggingface.co/win10/Phi-4-llama-t1-lora) applied to the base model; that is where the thanks should go.
* [prithivMLmods/Phi-4-QwQ](https://huggingface.co/prithivMLmods/Phi-4-QwQ)

### Running

- With Ollama

```
ollama run hf.co/benhaotang/phi4-qwq-sky-t1-Q4_K_M-GGUF
```

I suggest adding `SYSTEM "You are a helpful AI assistant. You always think step by step."` to trigger step-by-step reasoning.

- With pytorch

```python
import transformers
from transformers import AutoTokenizer

# The merged model reuses the base Phi-4 tokenizer
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-4")

pipeline = transformers.pipeline(
    "text-generation",
    model="benhaotang/phi4-qwq-sky-t1",
    tokenizer=tokenizer,
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a helpful AI assistant. You always think step by step."},
    {"role": "user", "content": "Give me a short introduction to renormalization group (RG) flow in physics."},
]

outputs = pipeline(messages, max_new_tokens=128)
print(outputs[0]["generated_text"])
```

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: prithivMLmods/Phi-4-QwQ
    # no parameters necessary for base model
  - model: benhaotang/Phi-4-llama-t1-full
    parameters:
      density: 0.5
      weight: 0.5
  - model: prithivMLmods/Phi-4-QwQ
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: prithivMLmods/Phi-4-QwQ
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```
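If you want to reproduce the merge yourself, the configuration above can be fed to mergekit's `mergekit-yaml` entry point. A minimal sketch, assuming the config is saved as `merge-config.yaml` (a hypothetical filename) and you have enough disk space and VRAM for the fp16 weights:

```
pip install mergekit
mergekit-yaml merge-config.yaml ./phi4-qwq-sky-t1 --cuda
```

The `--cuda` flag is optional; without it the merge runs on CPU, which is slower but needs no GPU memory.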