Why did tokenizer_config.json change for the AWQ model?
#7 · opened by rockcat-miao
Compared with the full DeepSeek version, more special tokens were added and the chat_template was changed. The new chat_template in tokenizer_config.json cannot be interpreted by vLLM's tool-use parser.
Because DeepSeek updated the chat template. This repo uses the old version of that template; I didn't bother to update it. You can just use their newer version, though.
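For reference, a rough sketch of swapping in the newer template yourself; the repo IDs below are placeholders, not the actual model names:

```python
from transformers import AutoTokenizer

# Placeholder repo IDs -- substitute the AWQ repo you are serving and the
# upstream DeepSeek repo whose newer chat template you want to use.
awq_tok = AutoTokenizer.from_pretrained("your-org/deepseek-awq")
upstream_tok = AutoTokenizer.from_pretrained("deepseek-ai/deepseek-upstream")

# Copy the upstream chat template over the old one, then save locally so
# vLLM (pointed at this directory) picks up the updated template.
awq_tok.chat_template = upstream_tok.chat_template
awq_tok.save_pretrained("./deepseek-awq-new-template")
```

vLLM's OpenAI-compatible server also accepts a `--chat-template` flag pointing at a Jinja file, which avoids editing the tokenizer files at all.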
Also, my version includes a prefill ability: if the messages list ends with an assistant turn, the model will continue that assistant message instead of opening a new response. This is known as prefill.
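To make the behavior concrete, here is a minimal sketch using the transformers API. My template bakes prefill into the Jinja logic; with a standard template, the `continue_final_message` flag on `apply_chat_template` gives the equivalent effect. The repo ID is a placeholder:

```python
from transformers import AutoTokenizer

# Placeholder repo ID -- use the tokenizer you actually serve.
tok = AutoTokenizer.from_pretrained("your-org/deepseek-awq")

messages = [
    {"role": "user", "content": "Write a haiku about quantization."},
    # Trailing assistant turn: the partial text we want the model to continue.
    {"role": "assistant", "content": "Weights shrink to four bits"},
]

# Render without closing the final assistant turn, so generation resumes
# mid-message instead of starting a fresh response.
prompt = tok.apply_chat_template(
    messages,
    tokenize=False,
    continue_final_message=True,
)
print(prompt)  # ends with the partial assistant text, ready for completion
```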
I appreciate the information.
rockcat-miao changed discussion status to closed