Missing chat_template in tokenizer_config.json

#1 opened by ialhashim

I was getting an error when running the provided sample code:

ValueError: Cannot use chat template functions because tokenizer.chat_template is not set and no template argument was passed! For information about writing templates and setting the tokenizer.chat_template attribute, please see the documentation at https://huggingface.co/docs/transformers/main/en/chat_templating

By explicitly providing a chat template, I was able to make it run:

# Comment out the original call that builds the model inputs
# model_inputs = tokenizer.apply_chat_template(
#     messages, add_generation_prompt=True, return_tensors="pt"
# ).to(model.device)

# Load a chat_template from another model's tokenizer_config.json,
# e.g. the one shipped with 'Qwen2.5-Coder-3B-Instruct'
import json
with open("tokenizer_config.json", "r", encoding="utf-8") as f: config = json.load(f)
chat_template = config.get("chat_template")

text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    chat_template=chat_template
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)
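
An alternative that keeps the original sample code untouched is to attach a borrowed template to the tokenizer itself. This is just a sketch of that idea; it assumes the Qwen/Qwen2.5-Coder-3B-Instruct tokenizer is an acceptable donor for the template (any model with a compatible chat format should work):

# Sketch: borrow the chat template from a related model and set it on the tokenizer,
# so the original apply_chat_template(...) call works unchanged.
# Assumption: Qwen/Qwen2.5-Coder-3B-Instruct uses a compatible chat format.
from transformers import AutoTokenizer

donor = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-Coder-3B-Instruct")
tokenizer.chat_template = donor.chat_template

# The provided sample code then runs as-is:
model_inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)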

Hey, sorry about this - looking into it.
