ettin-pretraining-data / reddit /reddit_0022-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
orionweller · Upload reddit/reddit_0022-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub · commit ea8dea9 (verified)
This file is stored with Xet. It is too big to display, but you can still download it.
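Since the file cannot be previewed in the browser, the usual route is to fetch it with `huggingface_hub`. A minimal sketch; the namespace in `repo_id` is a placeholder, since this page shows only the dataset name (ettin-pretraining-data), not the account that hosts it:

```python
from huggingface_hub import hf_hub_download

# The namespace below is a placeholder -- substitute the actual account
# or organization that hosts ettin-pretraining-data.
local_path = hf_hub_download(
    repo_id="<namespace>/ettin-pretraining-data",
    filename="reddit/reddit_0022-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    repo_type="dataset",
)
print(local_path)  # path to the cached 1.97 GB archive
```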

Large File Pointer Details (raw pointer file)

SHA256: f0b3689848a0b09e1dbed647f61940a60f213272ad9cb1bd58d619b449b7ca6e
Pointer size: 135 bytes
Size of remote file: 1.97 GB
Xet-backed hash: d2eb7c36c8836d45834e6a6ac73fdeff553c347ae1d3c357942d47e17b195b51
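The listed SHA256 can be used to check the download's integrity, assuming (as with Git LFS pointers) it is the digest of the full file contents. A minimal sketch using Python's standard `hashlib`, with `local_path` taken from the download snippet above:

```python
import hashlib

EXPECTED = "f0b3689848a0b09e1dbed647f61940a60f213272ad9cb1bd58d619b449b7ca6e"

def sha256_of(path, chunk_size=1 << 20):
    """Hash the file in 1 MiB pieces so the 1.97 GB archive never has to
    fit in memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

assert sha256_of(local_path) == EXPECTED, "checksum mismatch"
```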

Xet efficiently stores large files inside Git, splitting each file into unique chunks to deduplicate data and accelerate uploads and downloads.
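The "unique chunks" idea is content-defined chunking: chunk boundaries are derived from the bytes themselves via a rolling hash, so a local edit shifts boundaries only locally, and every unchanged chunk keeps its hash and never needs re-uploading. The sketch below is a conceptual illustration only, not Xet's actual chunker, hash function, or chunk-size targets:

```python
import hashlib
import random

random.seed(0)
GEAR = [random.getrandbits(32) for _ in range(256)]  # per-byte random values

def chunks(data: bytes, mask: int = 0x1FFF) -> list[bytes]:
    """Gear-style rolling hash: each left shift ages out old bytes, so the
    hash depends only on recently seen input; cut a chunk whenever the low
    bits are all zero (~8 KiB average with this mask)."""
    out, start, h = [], 0, 0
    for i, b in enumerate(data):
        h = ((h << 1) + GEAR[b]) & 0xFFFFFFFF
        if (h & mask) == 0:
            out.append(data[start : i + 1])
            start, h = i + 1, 0
    if start < len(data):
        out.append(data[start:])
    return out

def add_file(data: bytes, store: dict) -> list[str]:
    """Keep one copy of each distinct chunk; the returned hash list is the
    small 'pointer' that reassembles the file."""
    recipe = []
    for c in chunks(data):
        key = hashlib.sha256(c).hexdigest()
        store.setdefault(key, c)  # dedup: identical chunks stored once
        recipe.append(key)
    return recipe
```

Re-adding a lightly edited copy of the same data would then store only the handful of chunks whose bytes actually changed.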