ettin-pretraining-data / reddit / reddit_0013-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
Commit 56fcfbd (verified): Upload reddit/reddit_0013-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
This file is stored with Xet. It is too big to display, but you can still download it.
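Since the file is too large to preview, it can be fetched programmatically. Below is a minimal sketch using huggingface_hub's hf_hub_download; the repo id "orionweller/ettin-pretraining-data" is an assumption inferred from this page's uploader and path, so verify the actual repository id on the Hub.

```python
from huggingface_hub import hf_hub_download

# Download the archive to the local huggingface cache and return its path.
local_path = hf_hub_download(
    repo_id="orionweller/ettin-pretraining-data",  # assumed namespace; check the Hub
    repo_type="dataset",
    filename="reddit/reddit_0013-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
)
print(local_path)
```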

Large File Pointer Details

SHA256: 423de7da7aa536fc980e809b6695a1ecee01177eadfd6988718d71d3075cac60
Pointer size: 135 bytes
Size of remote file: 1.97 GB
Xet-backed hash: 3e94721c48aa168433dc455507dcd266c0782be906b489a7eb1ee32ef40ef66e
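After downloading, the SHA256 above can be used to check integrity. A minimal sketch using only the standard library; it assumes the listed SHA256 covers the full file contents (as in a Git LFS-style pointer) rather than the pointer file itself, and `local_path` stands in for the path returned by the download step above.

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so a ~2 GB file never loads fully into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Placeholder: the path returned by hf_hub_download in the earlier sketch.
local_path = "reddit_0013-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"
expected = "423de7da7aa536fc980e809b6695a1ecee01177eadfd6988718d71d3075cac60"
assert sha256_of(local_path) == expected, "downloaded file does not match the listed SHA256"
```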

Xet efficiently stores large files inside Git by splitting them into unique chunks, accelerating uploads and downloads.