There is a bug in the chat_template in tokenizer_config.json for meta-llama/Llama-2-7b-chat-hf and meta-llama/Llama-2-70b-chat-hf. It is easy to fix, but who should I inform so that it can be fixed?
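For anyone who wants to see the template in question, here is a minimal sketch of how to inspect and render it with transformers (assuming you have access to the gated meta-llama repo; the exact rendered output depends on your transformers version):

```python
from transformers import AutoTokenizer

# The chat template is loaded from the repo's tokenizer_config.json.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")

# Print the raw Jinja template that formats conversations into prompts.
print(tokenizer.chat_template)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

# Render the conversation without tokenizing to check the formatted prompt.
print(tokenizer.apply_chat_template(messages, tokenize=False))
```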
NICE! Does this apply to all models in serverless and deployed endpoints, or just models that have a correct chat_template in tokenizer_config.json?
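One way to check whether a given repo ships its own chat_template at all is to look directly at its tokenizer_config.json; a small sketch using huggingface_hub (the repo id here is just an example):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the repo's tokenizer_config.json and check for a chat_template key.
path = hf_hub_download("meta-llama/Llama-2-7b-chat-hf", "tokenizer_config.json")
with open(path) as f:
    config = json.load(f)

print("chat_template" in config)  # True if the repo defines its own template
```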