SOMMY-gemma-2-9b-Instruct / tokenizer.json
Upload tokenizer.json with huggingface_hub
Commit: b712c48 (verified)
{
  "name": "llama",
  "vocab": {
    "bos_token": 0,
    "eos_token": 2,
    "unk_token": 1,
    "pad_token": 3,
    "mask_token": 4,
    "max_vocab_size": 32000,
    "special_tokens": {
      "bos_token": 0,
      "eos_token": 2,
      "unk_token": 1,
      "pad_token": 3,
      "mask_token": 4
    }
  }
}
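A minimal sketch of reading this configuration with Python's json module, assuming the file is saved locally as tokenizer.json (the path and printed fields are illustrative, not part of the uploaded repository):

    import json

    # Load the tokenizer configuration shown above (local path is an assumption).
    with open("tokenizer.json", "r", encoding="utf-8") as f:
        config = json.load(f)

    vocab = config["vocab"]

    # Special token name -> ID mapping declared under "special_tokens".
    for name, token_id in vocab["special_tokens"].items():
        print(f"{name}: {token_id}")

    # Maximum vocabulary size declared alongside the special tokens.
    print("max_vocab_size:", vocab["max_vocab_size"])

Note that this file only declares special-token IDs and a maximum vocabulary size; it does not contain the merge rules or full vocabulary that a complete tokenizers-library tokenizer.json would carry, so the sketch treats it as plain JSON rather than loading it through a tokenizer class.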