ettin-pretraining-data / reddit / reddit_0011-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz

Upload reddit/reddit_0011-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub (commit 93fd10f, verified)
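The commit message indicates the file was pushed with the huggingface_hub client. A minimal sketch of that kind of upload, using HfApi.upload_file; the repo namespace below is a hypothetical placeholder, since the page does not show the owning organization:

```python
# Sketch of an upload like the one in the commit message above.
# Assumes a valid token (HF_TOKEN env var or `huggingface-cli login`).
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="reddit_0011-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    path_in_repo="reddit/reddit_0011-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    repo_id="your-org/ettin-pretraining-data",  # hypothetical namespace
    repo_type="dataset",
    commit_message="Upload reddit/reddit_0011-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub",
)
```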
- SHA256: 82e52e84adb935ef073d3a0907ead66e936e0cd12a5deb5b38dcef2c0d96ab48
- Pointer size: 135 bytes
- Size of remote file: 1.97 GB
- Xet backed hash: a5bbab4ccff108bc49714562fa1ba885b4ad5611f0bcf0166d79b071495a24d5
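The SHA256 above can be used to verify a local copy of the archive after download. A short sketch, assuming the file sits in the current directory under its original name:

```python
# Verify a downloaded copy against the SHA256 listed on this page.
import hashlib

EXPECTED = "82e52e84adb935ef073d3a0907ead66e936e0cd12a5deb5b38dcef2c0d96ab48"
path = "reddit_0011-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"

h = hashlib.sha256()
with open(path, "rb") as f:
    # Read in 1 MiB blocks so the ~2 GB file never sits fully in memory.
    for block in iter(lambda: f.read(1 << 20), b""):
        h.update(block)

assert h.hexdigest() == EXPECTED, "checksum mismatch"
print("SHA256 verified")
```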
Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
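Since this file is Xet-backed, a standard hf_hub_download call can fetch it; with the optional hf_xet package installed alongside huggingface_hub, transfers use Xet's chunk-level storage. A sketch, again with a hypothetical repo namespace:

```python
# Sketch: download this Xet-backed file from the Hub.
# Installing the optional hf_xet extra enables Xet-native transfers.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="your-org/ettin-pretraining-data",  # hypothetical namespace
    repo_type="dataset",
    filename="reddit/reddit_0011-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
)
print(local_path)  # path in the local Hub cache
```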