ettin-pretraining-data / reddit / reddit_0014-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz

Upload reddit/reddit_0014-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
Commit 75e56c4 (verified)
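
The commit message indicates the file was pushed with the huggingface_hub client. A minimal sketch of such an upload, assuming you are already logged in and using a placeholder repo id (the org namespace is not shown on this page):

```python
# Sketch: upload the tarball to a dataset repo with huggingface_hub.
# Assumptions: authenticated via `huggingface-cli login`, and
# "<org>/ettin-pretraining-data" stands in for the real repo id.
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="reddit_0014-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    path_in_repo="reddit/reddit_0014-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    repo_id="<org>/ettin-pretraining-data",  # placeholder repo id
    repo_type="dataset",
)
```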
- SHA256: 2d974397fa14057a61c72c16bf071491322159b4d0f04756229521c140b398d6
- Pointer size: 135 Bytes
- Size of remote file: 2 GB
- Xet backed hash: a604dd6366f5d133df8aef39a6c77ba60b8d95b3f7efb84df57c56eb32335e17
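
A downloaded copy can be checked against the SHA256 listed above by recomputing the digest locally. A minimal sketch, assuming the archive sits in the current directory under its original filename:

```python
# Sketch: verify a local download against the SHA256 listed above.
# Assumption: the tarball is in the current directory under this name.
import hashlib

EXPECTED = "2d974397fa14057a61c72c16bf071491322159b4d0f04756229521c140b398d6"
path = "reddit_0014-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"

h = hashlib.sha256()
with open(path, "rb") as f:
    # Read in 1 MiB blocks so the 2 GB file never sits fully in memory.
    for block in iter(lambda: f.read(1024 * 1024), b""):
        h.update(block)

assert h.hexdigest() == EXPECTED, "checksum mismatch"
print("SHA256 OK")
```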
Xet efficiently stores large files inside Git by splitting them into unique chunks, deduplicating storage and accelerating uploads and downloads.
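
The chunk-level deduplication described above can be illustrated with content-defined chunking, where boundaries depend on the bytes themselves rather than fixed offsets. The sketch below is a toy chunker, not Xet's actual algorithm; the hash, mask, and chunk-size bounds are illustrative assumptions:

```python
# Toy content-defined chunking sketch (illustrative only, not Xet's
# real chunker). Because boundaries depend on content, an edit near
# the start of a file only changes the chunks around that edit.
import hashlib

MASK = (1 << 13) - 1               # ~8 KiB average chunk size (assumption)
MIN_CHUNK, MAX_CHUNK = 2048, 65536  # illustrative bounds


def chunk(data: bytes):
    """Yield chunks whose boundaries are chosen by a content hash."""
    start, rolling = 0, 0
    for i, byte in enumerate(data):
        # Cheap cumulative byte hash, a stand-in for a real rolling hash.
        rolling = ((rolling << 1) + byte) & 0xFFFFFFFF
        size = i - start + 1
        if (size >= MIN_CHUNK and (rolling & MASK) == MASK) or size >= MAX_CHUNK:
            yield data[start : i + 1]
            start, rolling = i + 1, 0
    if start < len(data):
        yield data[start:]


# Deduplicate: identical chunks hash to the same key and are stored once.
store = {}
for piece in chunk(b"example " * 10000):
    store.setdefault(hashlib.sha256(piece).hexdigest(), piece)
print(f"{len(store)} unique chunks stored")
```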