sample1 / tokenizer_config.json
{
  "model_max_length": 1000000000000000019884624838656,
  "special_tokens_map_file": "/root/.cache/huggingface/hub/models--gogamza--kobart-base-v2/snapshots/f9f2ec35d3c32a1ecc7a3281f9626b7ec1913fed/special_tokens_map.json",
  "tokenizer_class": "PreTrainedTokenizerFast"
}
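
For context, this file is what transformers consults when loading the tokenizer from the containing directory. The oddly large "model_max_length" is the int(1e30) sentinel transformers writes when no real maximum input length was configured. Below is a minimal sketch of loading it, assuming the "sample1" directory (taken from the header above) also contains the serialized tokenizer files, e.g. tokenizer.json, uploaded alongside this config; the sample sentence is illustrative only.

# Minimal sketch: load the tokenizer described by this config.
# Assumes "sample1" also holds the rest of the tokenizer files
# (e.g. tokenizer.json) exported from gogamza/kobart-base-v2.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sample1")

# "tokenizer_class" in the config resolves to PreTrainedTokenizerFast.
print(type(tokenizer).__name__)

# model_max_length is the int(1e30) sentinel, i.e. no declared limit.
print(tokenizer.model_max_length)  # 1000000000000000019884624838656

# Illustrative round trip (KoBART is a Korean model).
ids = tokenizer("안녕하세요")["input_ids"]
print(tokenizer.decode(ids))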