ettin-pretraining-data / reddit / reddit_0009-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
orionweller: Upload reddit/reddit_0009-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
Commit: f289eaf (verified)
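
The commit message indicates the file was pushed with huggingface_hub. A minimal sketch of such an upload, assuming a hypothetical repo id (only the dataset name, not the owner, is shown on this page):

```python
from huggingface_hub import HfApi

api = HfApi()  # picks up the token saved by `huggingface-cli login` by default

# The repo id below is an assumption for illustration; only the dataset
# name "ettin-pretraining-data" appears on this page, not its owner.
api.upload_file(
    path_or_fileobj="reddit_0009-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    path_in_repo="reddit/reddit_0009-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    repo_id="orionweller/ettin-pretraining-data",  # hypothetical owner/name
    repo_type="dataset",
)
```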
This file is stored with Xet. It is too big to display, but you can still download it.

Large File Pointer Details (raw pointer file)

SHA256: 7733130b55327b1da1d433aab1b62dae91eaa78015899ec3752750654888f35c
Pointer size: 135 bytes
Size of remote file: 1.99 GB
Xet-backed hash: e6fc2071d1cde08a3c4abdc8fac98ecb3accdf95a70fc45cede0766104f9f622
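
The SHA256 above can be used to verify a download end to end. A minimal sketch using huggingface_hub, assuming the same hypothetical repo id as above:

```python
import hashlib

from huggingface_hub import hf_hub_download

REPO_ID = "orionweller/ettin-pretraining-data"  # hypothetical owner/name
FILENAME = "reddit/reddit_0009-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"
EXPECTED_SHA256 = "7733130b55327b1da1d433aab1b62dae91eaa78015899ec3752750654888f35c"

# Resolves the Xet pointer and fetches the 1.99 GB payload into the local cache.
local_path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME, repo_type="dataset")

# Stream the file through SHA256 and compare against the pointer metadata.
sha = hashlib.sha256()
with open(local_path, "rb") as f:
    for block in iter(lambda: f.read(1024 * 1024), b""):
        sha.update(block)

assert sha.hexdigest() == EXPECTED_SHA256, "checksum mismatch"
print("SHA256 verified:", sha.hexdigest())
```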

Xet efficiently stores large files inside Git by splitting them into unique chunks and deduplicating shared content, which accelerates uploads and downloads.
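
Chunk boundaries in schemes like this are typically content-defined rather than placed at fixed offsets, so inserting bytes in the middle of a file only changes the chunks near the edit. The sketch below illustrates that general technique with a gear-style rolling hash; it is not Xet's actual algorithm, and all constants are illustrative.

```python
import random

# Gear table: one fixed pseudo-random 64-bit value per possible byte value.
random.seed(0)
GEAR = [random.getrandbits(64) for _ in range(256)]

MASK = (1 << 13) - 1   # low bits tested for a boundary; ~8 KiB average chunk
MIN_CHUNK = 2 * 1024   # never cut before this many bytes
MAX_CHUNK = 64 * 1024  # always cut once a chunk reaches this size

def chunk_boundaries(data: bytes):
    """Yield (start, end) offsets of content-defined chunks.

    The hash shifts left one bit per byte, so old bytes age out and the
    value depends only on a sliding window of recent content; a boundary
    is declared wherever the masked low bits are all zero.
    """
    start, h = 0, 0
    for i, b in enumerate(data):
        h = ((h << 1) + GEAR[b]) & 0xFFFFFFFFFFFFFFFF
        length = i + 1 - start
        if (length >= MIN_CHUNK and (h & MASK) == 0) or length >= MAX_CHUNK:
            yield start, i + 1
            start, h = i + 1, 0
    if start < len(data):
        yield start, len(data)
```

Hashing each chunk (for example with SHA256) then lets a store keep one copy per unique chunk plus a per-file manifest of chunk hashes, so files that share content share storage.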