ettin-pretraining-data / reddit / reddit_0016-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
orionweller
Upload reddit/reddit_0016-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
Commit 7bf6462 (verified)
This file is stored with Xet. It is too big to display, but you can still download it.
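For reference, a minimal sketch of fetching this file with huggingface_hub (the library named in the commit message above). The repo_id here is an assumption inferred from the page title and uploader; adjust it to the actual dataset repository.

from huggingface_hub import hf_hub_download

# Download the archive into the local Hugging Face cache and return its path.
local_path = hf_hub_download(
    repo_id="orionweller/ettin-pretraining-data",  # assumed repository id
    repo_type="dataset",
    filename="reddit/reddit_0016-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
)
print(local_path)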

Large File Pointer Details (raw pointer file)

SHA256: c7dbda947e03d63aead0c3581045de1ee735fdfc01035553e5eb3154cd6aca2e
Pointer size: 135 bytes
Size of remote file: 1.98 GB
Xet-backed hash: 4497a8f4f38a00f78e4cc4539e5298584eef9007882b354b189d07d1f29ec402
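Once downloaded, the archive can be checked against the SHA256 listed above. A minimal sketch, assuming that the SHA256 shown is the digest of the full remote file (as in the standard Git LFS pointer format) and that local_path is the path returned by the download example earlier:

import hashlib

EXPECTED_SHA256 = "c7dbda947e03d63aead0c3581045de1ee735fdfc01035553e5eb3154cd6aca2e"

def sha256_of(path):
    # Hash the file in 1 MiB chunks to avoid loading ~2 GB into memory at once.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

assert sha256_of(local_path) == EXPECTED_SHA256, "checksum mismatch"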

Xet efficiently stores large files inside Git by splitting them into unique chunks, which accelerates uploads and downloads.