ettin-pretraining-data / reddit / reddit_0001-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
Latest commit 3f4234d (verified): Upload reddit/reddit_0001-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
This file is stored with Xet. It is too big to display, but you can still download it.
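Since the archive cannot be rendered in the browser, it can be fetched programmatically with huggingface_hub's hf_hub_download. A minimal sketch; the repo id below is an assumption inferred from this page (user orionweller, dataset ettin-pretraining-data) and may differ:

```python
from huggingface_hub import hf_hub_download

# Assumed repo id, inferred from the page breadcrumb; adjust if the
# dataset is hosted under a different namespace.
REPO_ID = "orionweller/ettin-pretraining-data"
FILENAME = "reddit/reddit_0001-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"

# Resolves the pointer to the remote blob, downloads the 2.06 GB archive
# into the local Hugging Face cache, and returns its local path.
local_path = hf_hub_download(
    repo_id=REPO_ID,
    filename=FILENAME,
    repo_type="dataset",  # this is a dataset repo, not a model repo
)
print(local_path)
```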

Large File Pointer Details (raw pointer file)

SHA256: 2aa8de6284a5bb56e124b17a4b5e294bd21e3b3fefe8b95737b36d028c04abda
Pointer size: 135 bytes
Size of remote file: 2.06 GB
Xet-backed hash: a21a733e5e0859c626fff28d957bacdab983fee97135d97dcd366dc99b096ec9
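For reference, a 135-byte pointer is consistent with the standard Git LFS pointer layout, so the raw pointer file plausibly looks like the sketch below. This is an assumption about the format, not a dump of the actual pointer, and the exact byte count on the size line is not shown on this page, so it appears as a placeholder:

```
version https://git-lfs.github.com/spec/v1
oid sha256:2aa8de6284a5bb56e124b17a4b5e294bd21e3b3fefe8b95737b36d028c04abda
size <exact byte count of the 2.06 GB archive>
```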

Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
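The splitting described above is, in general form, content-defined chunking. Below is a minimal Python sketch of that general technique, not Xet's actual algorithm; the hash function, boundary mask, and minimum chunk size are all illustrative assumptions. A rolling hash derives chunk boundaries from the bytes themselves, so identical regions of two files produce identical chunks that deduplicate by hash:

```python
import hashlib

MASK = (1 << 13) - 1   # illustrative boundary mask: ~8 KiB average chunk size
MIN_CHUNK = 2048       # illustrative minimum chunk size, to avoid tiny chunks

def chunk_boundaries(data: bytes):
    """Yield (start, end) offsets chosen by a Gear-style rolling hash.

    The left shift ages bytes out of the 32-bit hash after 32 steps, so
    each boundary decision depends only on a short trailing window of
    content. Identical regions in different files therefore cut in the
    same places and produce identical chunks.
    """
    start, h = 0, 0
    for i, byte in enumerate(data):
        h = ((h << 1) + byte) & 0xFFFFFFFF  # toy mixing; real Gearhash mixes via a random lookup table
        if i + 1 - start >= MIN_CHUNK and (h & MASK) == MASK:
            yield start, i + 1
            start, h = i + 1, 0
    if start < len(data):
        yield start, len(data)

def unique_chunks(data: bytes) -> dict[str, bytes]:
    """Index chunks by SHA256; regions shared between files collapse to one entry."""
    return {hashlib.sha256(data[s:e]).hexdigest(): data[s:e]
            for s, e in chunk_boundaries(data)}
```

Because boundaries depend only on local content, an edit near the start of a file disturbs at most its neighboring chunks; everything downstream re-cuts identically and is already stored, which is what accelerates repeat uploads and downloads.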