Includes </s> in most replies despite ChatML prompting

#3
by CED6688 - opened

Not sure if anyone else has noticed, but this model seems to return </s> at the end of most generations. In fact, I almost never see the <|im_end|> token despite using ChatML prompting. Note that for this model, </s> is not even a listed special token.

That said, this seems to be the case with many Yi-34b fine-tunes, which also require adding </s> as an explicit stopping string.
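
For anyone hitting this, here is a minimal sketch of one way to enforce the stop string with transformers using a custom StoppingCriteria (the repo id, prompt, and generation settings below are placeholders, not from this thread):

```python
import torch
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    StoppingCriteria,
    StoppingCriteriaList,
)

class StopOnString(StoppingCriteria):
    """Stop generation once the decoded continuation ends with a given string.

    Needed here because the model emits "</s>" as ordinary text (several
    regular tokens), so eos_token_id alone never fires.
    """

    def __init__(self, tokenizer, stop_string, prompt_len):
        self.tokenizer = tokenizer
        self.stop_string = stop_string
        self.prompt_len = prompt_len

    def __call__(self, input_ids, scores, **kwargs):
        # Decode only the newly generated tokens and check the tail.
        text = self.tokenizer.decode(input_ids[0, self.prompt_len:])
        return text.endswith(self.stop_string)

model_name = "CausalLM/14B"  # placeholder; substitute the model under discussion
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map="auto"
)

prompt = "<|im_start|>user\nHello!<|im_end|>\n<|im_start|>assistant\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

stopping = StoppingCriteriaList(
    [StopOnString(tokenizer, "</s>", inputs.input_ids.shape[1])]
)
output = model.generate(**inputs, max_new_tokens=256, stopping_criteria=stopping)
text = tokenizer.decode(output[0, inputs.input_ids.shape[1]:])
print(text.removesuffix("</s>"))  # strip the literal stop string if present
```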

This is as expected, and the same is true of all the CausalLM models.
And I think only the EOS token (</s>) can act as EOS, not <|im_end|>.
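
A quick way to check this locally (a sketch; the repo id is again a placeholder) is to inspect what the tokenizer actually treats as special:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("CausalLM/14B")  # placeholder repo id

print(tok.eos_token)           # the configured EOS token
print(tok.all_special_tokens)  # everything listed as special
# A true special token encodes to a single id; plain text splits into several.
print(tok.encode("</s>", add_special_tokens=False))
print(tok.encode("<|im_end|>", add_special_tokens=False))
```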

JosephusCheung changed discussion status to closed
