ettin-pretraining-data/reddit/reddit_0010-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
orionweller: Upload reddit/reddit_0010-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub (commit 5508c77, verified)
This file is stored with Xet. It is too big to display, but you can still download it.

Large File Pointer Details (raw pointer file)

SHA256: 04eac1e3452080d03a34ea4279383564126097ebbecfa099e909d86eb8a4e82b
Pointer size: 135 bytes
Size of remote file: 1.96 GB
Xet backed hash: d8f2fdbb609823681d1abdfefe33569ab6b0f4761611b4898c5469bb2f1fdbfc
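For context, the raw pointer kept in the Git repository is a small text file rather than the data itself. A sketch of its likely shape is given below, assuming the Git LFS pointer format; the exact remote byte count is not shown on this page and is left as a placeholder.

    version https://git-lfs.github.com/spec/v1
    oid sha256:04eac1e3452080d03a34ea4279383564126097ebbecfa099e909d86eb8a4e82b
    size <remote file size in bytes>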

Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, which accelerates uploads and downloads.
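To fetch the archive programmatically, a minimal sketch using the huggingface_hub client (the same library named in the commit message above) follows. The repo_id "orionweller/ettin-pretraining-data" and repo_type="dataset" are assumptions inferred from this page, not confirmed by it, and may need adjusting.

    # Minimal download sketch; repo_id and repo_type are assumed, not confirmed.
    from huggingface_hub import hf_hub_download

    local_path = hf_hub_download(
        repo_id="orionweller/ettin-pretraining-data",  # assumed namespace/repo
        repo_type="dataset",                           # assumed repository type
        filename="reddit/reddit_0010-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    )
    print(local_path)  # local cache path of the ~1.96 GB archive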