ettin-pretraining-data / reddit / reddit_0017-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
orionweller: Upload reddit/reddit_0017-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub (commit 21167c5, verified)
This file is stored with Xet. It is too big to display in the browser, but you can still download it.
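To fetch it programmatically, the huggingface_hub library can resolve and download the Xet-backed file. A minimal sketch, assuming huggingface_hub is installed and that the dataset lives under the orionweller namespace (the repo_id owner is an assumption; substitute the actual one):

    from huggingface_hub import hf_hub_download

    # repo_id namespace is assumed; replace with the dataset's actual owner.
    local_path = hf_hub_download(
        repo_id="orionweller/ettin-pretraining-data",
        filename="reddit/reddit_0017-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
        repo_type="dataset",
    )
    print(local_path)  # path to the downloaded 1.97 GB archive in the local cache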

Large File Pointer Details

(Raw pointer file)
SHA256: c798fbe04a9b48d644569568942dac14046433b418030821003befc4f724bad7
Pointer size: 135 bytes
Size of remote file: 1.97 GB
Xet-backed hash: c020ac788bb75995d48e2174c1b25b1769d369133fc3e0d5d74d463bdd30ffd0

Xet efficiently stores large files alongside Git by splitting them into unique chunks, which accelerates uploads and downloads.
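As an illustration of the idea only (not Xet's actual chunker or parameters), a toy content-defined chunker might look like the sketch below: chunk boundaries are derived from the bytes themselves, so identical regions of data yield identical chunks that need to be stored and transferred only once.

    import hashlib
    import os

    def chunk_bytes(data: bytes, mask: int = 0x0FFF, min_size: int = 2048, max_size: int = 65536):
        """Split data into variable-size chunks using a simple rolling checksum.
        Toy example only; Xet's real chunking algorithm and parameters differ."""
        chunks, start, rolling = [], 0, 0
        for i, byte in enumerate(data):
            rolling = ((rolling << 1) + byte) & 0xFFFFFFFF
            size = i - start + 1
            # Cut when the rolling value matches the mask (content-defined boundary)
            # or when the chunk reaches the hard upper bound.
            if (size >= min_size and (rolling & mask) == 0) or size >= max_size:
                chunks.append(data[start:i + 1])
                start, rolling = i + 1, 0
        if start < len(data):
            chunks.append(data[start:])
        return chunks

    payload = os.urandom(1_000_000)  # stand-in for a large file
    chunks = chunk_bytes(payload)
    # A store keyed by chunk hash uploads each unique chunk only once.
    unique = {hashlib.sha256(c).hexdigest() for c in chunks}
    print(f"{len(chunks)} chunks, {len(unique)} unique, avg {len(payload) // len(chunks)} bytes")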