ettin-pretraining-data / reddit / reddit_0012-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz

Upload reddit/reddit_0012-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub (commit 0ceab84, verified)
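The commit note above says the archive was pushed with the `huggingface_hub` client; the same library can fetch it back. A minimal sketch, assuming this is a dataset repository (the owning namespace is not shown on this page and must be filled in):

```python
# Minimal sketch: re-download this archive with huggingface_hub.
# The owning org/user is not shown on this page; prepend it to the repo id.
from huggingface_hub import hf_hub_download

repo_id = "ettin-pretraining-data"  # assumption: prefix with "<namespace>/"
filename = "reddit/reddit_0012-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"

local_path = hf_hub_download(
    repo_id=repo_id,
    filename=filename,
    repo_type="dataset",  # assumption: this is a dataset repo, not a model
)
print(local_path)  # path inside the local Hugging Face cache
```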
- SHA256: eec07de3e334060187f4283435177f31c61e75b42a9dfb88028ec64e3539843a
- Pointer size: 135 Bytes
- Size of remote file: 1.96 GB
- Xet backed hash: 0ce51bdfb2149c1e2effc3acb0327a1aa655ca8cb9889ebf4733893f80ca28d7
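The pointer's SHA256 covers the full 1.96 GB payload, so a finished download can be checked locally. A short sketch that streams the file through `hashlib` so it never has to fit in memory (the local path is hypothetical):

```python
# Verify a downloaded copy against the SHA256 recorded in the pointer above.
import hashlib

EXPECTED = "eec07de3e334060187f4283435177f31c61e75b42a9dfb88028ec64e3539843a"

def sha256_of(path, chunk_size=1 << 20):
    """Hash the file in 1 MiB blocks so the 1.96 GB archive never sits in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

# Hypothetical local path; point it at wherever the download landed.
path = "reddit_0012-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"
assert sha256_of(path) == EXPECTED, "archive is corrupt or incomplete"
```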
Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
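The splitting is content-defined: chunk boundaries come from the bytes themselves, so editing one region of a file shifts only the boundaries near it, and the untouched chunks keep their hashes and deduplicate. A toy illustration of that idea, using a crude rolling hash and made-up parameters rather than Xet's actual algorithm:

```python
# Toy content-defined chunking: cut where a rolling hash of recent bytes
# matches a mask, then store chunks keyed by SHA256 so duplicates are kept
# only once. Mask and size limits are illustrative, not Xet's parameters.
import hashlib
import random

MASK = (1 << 12) - 1            # boundary roughly every 4 KiB on average
MIN_CHUNK, MAX_CHUNK = 512, 16384

def chunks(data):
    start, h = 0, 0
    for i, byte in enumerate(data):
        h = ((h << 1) + byte) & 0xFFFFFFFF  # depends only on recent bytes
        size = i - start + 1
        if size >= MIN_CHUNK and ((h & MASK) == MASK or size >= MAX_CHUNK):
            yield data[start : i + 1]
            start = i + 1
    if start < len(data):
        yield data[start:]

store = {}                      # chunk SHA256 -> chunk bytes, stored once
def add(data):
    """Return the recipe (ordered chunk hashes) that reassembles `data`."""
    recipe = []
    for c in chunks(data):
        key = hashlib.sha256(c).hexdigest()
        store.setdefault(key, c)
        recipe.append(key)
    return recipe

random.seed(0)
a = bytes(random.getrandbits(8) for _ in range(1 << 16))
b = b"\x00" + a                 # one-byte insertion near the front
ra, rb = add(a), add(b)
print(len(set(ra) & set(rb)), "of", len(ra), "chunks reused after the edit")
```

Because the boundary test looks only at a short window of recent bytes, the chunker resynchronizes shortly after the inserted byte, which is why most of the recipe for `b` reuses chunks already stored for `a`.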