Missing chat_template when using vLLM to serve the model
#5 by leihhami · opened
vLLM can serve the model, but any query returns the following error:
ValueError: As of transformers v4.44, default chat template is no longer allowed, so you must provide a chat template if the tokenizer does not define one.
I tried some of the templates from the vLLM examples (https://github.com/vllm-project/vllm/tree/main/examples), but none of them worked either.
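A quick way to confirm the root cause is to check whether the tokenizer actually ships a chat template. Below is a minimal sketch using the transformers API; the model id is a placeholder for the model in this thread, and the Jinja template is only an illustrative fallback, not the model's real prompt format. (vLLM's OpenAI-compatible server can also be pointed at a template file via its --chat-template argument.)

```python
from transformers import AutoTokenizer

# Placeholder model id; substitute the model this discussion is about.
MODEL_ID = "your-org/your-model"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# If this prints None, the tokenizer defines no chat template, which is
# exactly the condition the vLLM/transformers error complains about.
print(tokenizer.chat_template)

if tokenizer.chat_template is None:
    # Illustrative minimal Jinja template: plain "role: content" lines.
    # A real deployment should use the prompt format the model was trained on.
    tokenizer.chat_template = (
        "{% for message in messages %}"
        "{{ message['role'] }}: {{ message['content'] }}\n"
        "{% endfor %}"
        "{% if add_generation_prompt %}assistant:{% endif %}"
    )

messages = [{"role": "user", "content": "Hello!"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```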
Hi,
Apologies for the late reply, and thanks for reaching out to us. I suspect the issue lies in transformers version compatibility. Could you please retry after upgrading transformers to the latest version: pip install -q -U transformers. Please let me know if you require any additional assistance.
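After upgrading, a quick sanity check is to confirm the new version is the one vLLM actually sees and that the tokenizer now exposes a chat template. This is a small sketch under the assumption that the model repository bundles a chat_template in its tokenizer_config.json that older transformers releases failed to pick up; the model id is again a placeholder.

```python
import transformers
from transformers import AutoTokenizer

# Confirm the upgrade took effect in the same environment vLLM runs in.
print(transformers.__version__)

# Placeholder model id; use the actual model from this discussion.
tokenizer = AutoTokenizer.from_pretrained("your-org/your-model")

# After the upgrade this should print the template bundled with the
# tokenizer; if it is still None, a template has to be supplied
# explicitly (for example via vLLM's --chat-template argument).
print(tokenizer.chat_template)
```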
Thanks.