ettin-pretraining-data / reddit /reddit_0008-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
Commit 9fe915a (verified): Upload reddit/reddit_0008-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
This file is stored with Xet. It is too large to display here, but it can still be downloaded.

Large File Pointer Details (raw pointer file)

SHA256: 61d37a4711498034dfdd22704c4f314f16d5706ad94db07da7b729a1c2505c06
Pointer size: 135 bytes
Size of remote file: 1.96 GB
Xet-backed hash: 7ab24a36a7e431658ddf1f4db1033b96428c28593e3c2fcef8a7bfe39f9d626f
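After downloading, the local copy can be checked against the SHA256 listed above. A minimal Python sketch, assuming the archive is already in the working directory and that the SHA256 shown here is the content hash of the remote file (the local filename below is illustrative):

```python
import hashlib

# Assumed local path of the downloaded archive (illustrative).
path = "reddit_0008-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"

# Expected SHA256 from the pointer details above.
expected = "61d37a4711498034dfdd22704c4f314f16d5706ad94db07da7b729a1c2505c06"

sha256 = hashlib.sha256()
with open(path, "rb") as f:
    # Hash in 1 MiB blocks so the 1.96 GB file never has to fit in memory.
    for block in iter(lambda: f.read(1024 * 1024), b""):
        sha256.update(block)

print("match" if sha256.hexdigest() == expected else "mismatch")
```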

Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
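Since the file was uploaded with huggingface_hub, it can be fetched the same way; Xet resolution happens transparently on download. A minimal sketch, assuming the repository id is orionweller/ettin-pretraining-data and that it is a dataset repo (both inferred from this page, not confirmed):

```python
from huggingface_hub import hf_hub_download

# repo_id and repo_type are assumptions based on the page breadcrumb and uploader.
local_path = hf_hub_download(
    repo_id="orionweller/ettin-pretraining-data",
    repo_type="dataset",
    filename="reddit/reddit_0008-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
)

# Prints the local cache path of the resolved 1.96 GB archive.
print(local_path)
```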