ettin-pretraining-data / reddit / reddit_0006-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz

Upload reddit/reddit_0006-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub (commit d4a31eb, verified)
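The commit message indicates the archive was pushed with the `huggingface_hub` client. A minimal sketch of such an upload, assuming a hypothetical `<namespace>` placeholder since the owning account is not shown on this page:

```python
from huggingface_hub import HfApi

api = HfApi()  # uses the token from `huggingface-cli login` by default

# Repo id is an assumption: only the dataset name
# "ettin-pretraining-data" appears on this page.
api.upload_file(
    path_or_fileobj="reddit_0006-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    path_in_repo="reddit/reddit_0006-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    repo_id="<namespace>/ettin-pretraining-data",
    repo_type="dataset",
    commit_message="Upload reddit/reddit_0006-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub",
)
```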
- SHA256: 9be6efce5c5b4adbd8574642dfc1b8acf31627679ee599dcb47581f41ecc08dd
- Pointer size: 135 Bytes
- Size of remote file: 2.05 GB
- Xet backed hash: 5fe163dd1fa57149a60d4646ec5f5b5918d18cee5cd6e1af5430da680d1709b6
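To fetch the file and check it against the SHA256 listed above, one can download it with `hf_hub_download` and hash the local copy. A sketch, again assuming the hypothetical `<namespace>` repo id:

```python
import hashlib

from huggingface_hub import hf_hub_download

EXPECTED_SHA256 = "9be6efce5c5b4adbd8574642dfc1b8acf31627679ee599dcb47581f41ecc08dd"

# Repo id is an assumption; the file path matches this page.
path = hf_hub_download(
    repo_id="<namespace>/ettin-pretraining-data",
    filename="reddit/reddit_0006-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    repo_type="dataset",
)

# Stream the ~2 GB archive through SHA256 rather than loading it at once.
h = hashlib.sha256()
with open(path, "rb") as f:
    for block in iter(lambda: f.read(1 << 20), b""):
        h.update(block)

assert h.hexdigest() == EXPECTED_SHA256, "checksum mismatch"
print("SHA256 verified:", h.hexdigest())
```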
Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
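As an illustration only (not Xet's actual algorithm), chunk-based storage of this kind can be sketched with content-defined chunking: boundaries are placed where a hash of the trailing bytes matches a target pattern, so identical content produces identical chunks regardless of byte offsets, and each unique chunk is stored once. The `mask` and `window` parameters below are illustrative choices, not Xet's.

```python
import hashlib

def chunk_boundaries(data: bytes, mask: int = (1 << 12) - 1, window: int = 16):
    """Toy content-defined chunker: cut where a hash of the last `window`
    bytes hits a target pattern (~4 KiB average chunks with a 12-bit mask).
    Boundaries depend only on local content, so an insertion early in the
    file does not shift the chunking of everything after it."""
    start = 0
    for i in range(window, len(data)):
        h = int.from_bytes(
            hashlib.blake2b(data[i - window : i], digest_size=4).digest(), "big"
        )
        if (h & mask) == 0 and i - start >= window:
            yield data[start:i]
            start = i
    if start < len(data):
        yield data[start:]

# Deduplication: identical chunks hash to the same key and are stored once.
store = {}
for chunk in chunk_boundaries(b"example data " * 10_000):
    store.setdefault(hashlib.sha256(chunk).hexdigest(), chunk)
print(f"{len(store)} unique chunks stored")
```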