ettin-pretraining-data / reddit /reddit_0019-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
Committed by orionweller: "Upload reddit/reddit_0019-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub" (6b98a7f, verified)
This file is stored with Xet. It is too big to display, but you can still download it.
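To fetch it programmatically, here is a minimal sketch using huggingface_hub's hf_hub_download; the repo_id below is an assumption inferred from the page path and should be replaced with the actual dataset repository if it differs:

from huggingface_hub import hf_hub_download

# Assumed repo_id; adjust to the actual dataset repository before use.
local_path = hf_hub_download(
    repo_id="orionweller/ettin-pretraining-data",
    filename="reddit/reddit_0019-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    repo_type="dataset",
)
print(local_path)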

Large File Pointer Details (raw pointer file)

SHA256: 59a7debc6f5f4a51da05fd8fddaac8aae7bdd8e4f1cf56b561c91a72d49f5629
Pointer size: 135 bytes
Size of remote file: 1.99 GB
Xet-backed hash: 2cc8971815a5cd3d004de3f9c8da69023f5002c27cddff20a90d75bb1682ac3f
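After downloading, the SHA256 listed above can be checked locally. A minimal sketch using only the Python standard library (local_path is assumed to be the path returned by the download step):

import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    # Stream in 1 MiB chunks so the ~2 GB archive never sits fully in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

expected = "59a7debc6f5f4a51da05fd8fddaac8aae7bdd8e4f1cf56b561c91a72d49f5629"
assert sha256_of_file(local_path) == expected, "downloaded file does not match the listed SHA256"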

Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, which accelerates uploads and downloads.
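As a rough illustration of the chunk-level deduplication idea, here is a toy sketch using fixed-size chunks; Xet's actual content-defined chunking and storage format are not shown here and the helper below is purely hypothetical:

import hashlib

def dedup_chunks(data: bytes, chunk_size: int = 64 * 1024):
    # Split the payload into chunks, store each unique chunk only once,
    # and keep an ordered manifest of chunk hashes to rebuild the file.
    store = {}
    manifest = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        key = hashlib.sha256(chunk).hexdigest()
        store.setdefault(key, chunk)  # identical chunks are stored once
        manifest.append(key)
    return manifest, store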