Tokenization of FineWeb preserving the exact records and their order, with the validation-set boundary matching the 10B subset used by NanoGPT. 8427a2d verified alexjc committed on Jan 21
FineWeb10B tokenized with TokenMonster vocabulary english-28416-balanced (v1b4). 4cb82d1 verified alexjc committed on Jan 17
TokenMonster vocabulary english-28416-balanced (v1b4), a subset of english-100256-balanced. e0db7ee verified alexjc committed on Jan 17
Delete prototype vocabulary and tokens for english-28416; update coming. 92d2368 verified alexjc committed on Jan 14
FineWeb10B tokenized with TokenMonster vocabulary english-28416-balanced-v1. ffc7a36 verified alexjc committed on Dec 18, 2024
Custom-filtered TokenMonster vocabulary used to create the tokens. a8df488 verified alexjc committed on Dec 4, 2024
FineWeb10B tokenized with TokenMonster vocabulary english-50256-balanced-v2. 183a0ce verified alexjc committed on Dec 4, 2024