prototype_round3_qwen0.5-1.5B / added_tokens.json
{
"<|endoftext|>": 151643,
"<|im_end|>": 151645,
"<|im_start|>": 151644
}
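
This file maps special chat-format token strings to fixed ids appended to the base vocabulary. As a minimal sketch, the snippet below loads the tokenizer with Hugging Face transformers and verifies that each added token resolves to the id listed above; the repo id amiguel/prototype_round3_qwen0.5-1.5B is inferred from the file path and is an assumption that may differ from the actual Hub id.

from transformers import AutoTokenizer

# Repo id inferred from the file path above; adjust if the model lives elsewhere.
tokenizer = AutoTokenizer.from_pretrained("amiguel/prototype_round3_qwen0.5-1.5B")

# added_tokens.json pins these strings to fixed vocabulary ids, so
# convert_tokens_to_ids should return the values listed in the file.
for token in ("<|endoftext|>", "<|im_start|>", "<|im_end|>"):
    print(token, tokenizer.convert_tokens_to_ids(token))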