Model Card

Example Usage

from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
from peft import PeftModel

tokenizer = AutoTokenizer.from_pretrained('fineinstructions/query_templatizer', revision=None) # Load tokenizer
tokenizer.padding_side = 'left'
base_model = AutoModelForCausalLM.from_pretrained('/mnt/nlpgpu-io1/data/ajayp/output/fineinstructions/dated/2025-01-30-05:11:16/data/distill_datadreamer_output_lr3/stage_2_output_model', revision=None) # Load base model
model = PeftModel.from_pretrained(base_model, model_id='fineinstructions/query_templatizer', revision=None) # Apply adapter
pipe = pipeline('text-generation', model=model, tokenizer=tokenizer, pad_token_id=tokenizer.pad_token_id, return_full_text=False)

inputs = ['ok now can you give 3 very speculative ideas on how to achieve unidirectional movement that results in more energy than input using magnets and/or ZPF extraction, as simple setups?']
print(pipe(inputs, max_length=131072, do_sample=False))
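For each input prompt, a `text-generation` pipeline returns a list of candidate dicts with a `generated_text` key (here one candidate per prompt, since no sampling parameters request more). A small helper (hypothetical, not part of this model's code) can flatten that nested structure into one string per prompt:

```python
def extract_generations(outputs):
    """Flatten text-generation pipeline output, shaped like
    [[{'generated_text': ...}], ...], into a list of strings,
    taking the first candidate for each input prompt."""
    return [candidates[0]["generated_text"] for candidates in outputs]

# Usage with the pipeline above:
#   texts = extract_generations(pipe(inputs, max_length=131072, do_sample=False))
```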

This model was trained on a synthetic dataset generated with DataDreamer 🤖💤. The synthetic dataset card and model card can be found here. The training arguments can be found here.
