ettin-pretraining-data / reddit / reddit_0018-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz

Commit 97dc924 (verified): Upload reddit/reddit_0018-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
- SHA256: 16331d79fd29cc9a0b0214dfb699e84113b27552578649f137d894bb4e12e07b
- Pointer size: 135 Bytes
- Size of remote file: 1.97 GB
- Xet backed hash: 06e8b163e3aad2c6d02ec5d6baf9dcf1694400637f66d20bcd2676259dd3f7b8
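
As a sketch of how the checksum above can be used, the snippet below downloads the archive with huggingface_hub and verifies its SHA-256 against the value listed here. The `repo_id` namespace is an assumption (the owning organization is not shown on this page); substitute the actual dataset repository.

```python
import hashlib

from huggingface_hub import hf_hub_download

# Assumed repo_id: the org/namespace is not shown on this page.
REPO_ID = "your-org/ettin-pretraining-data"  # hypothetical
FILENAME = "reddit/reddit_0018-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"
EXPECTED_SHA256 = "16331d79fd29cc9a0b0214dfb699e84113b27552578649f137d894bb4e12e07b"

# Download (or reuse the local cache) and get the path to the file.
path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME, repo_type="dataset")

# Stream the ~1.97 GB file through SHA-256 in 1 MiB blocks to keep memory flat.
h = hashlib.sha256()
with open(path, "rb") as f:
    for block in iter(lambda: f.read(1 << 20), b""):
        h.update(block)

assert h.hexdigest() == EXPECTED_SHA256, "checksum mismatch"
print("SHA-256 verified:", h.hexdigest())
```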
Xet efficiently stores large files inside Git by splitting them into unique content-defined chunks, deduplicating data shared across files and accelerating uploads and downloads.
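
To illustrate the content-defined chunking idea referred to above (a simplified sketch, not Xet's actual algorithm, hash, or chunk-size parameters), the code below cuts a byte stream wherever a toy rolling hash hits a fixed pattern, so regions shared between two files map to identical, deduplicable chunks:

```python
import hashlib
import random

def cdc_chunks(data: bytes, mask: int = 0x0FFF, min_size: int = 64):
    """Content-defined chunking sketch: cut where the low 12 bits of a toy
    rolling hash are zero (~4 KiB average chunks). Real systems use faster,
    better-distributed rolling hashes such as Rabin or gear hashing."""
    chunks, start, h = [], 0, 0
    for i, b in enumerate(data):
        h = ((h << 1) + b) & 0xFFFFFFFF  # old bytes shift out of the mask
        if i - start + 1 >= min_size and (h & mask) == 0:
            chunks.append(data[start : i + 1])
            start, h = i + 1, 0
    if start < len(data):
        chunks.append(data[start:])
    return chunks

# Two archives sharing a large middle region share most of their chunks,
# so a chunk store keyed by hash transfers the shared bytes only once.
shared = random.Random(0).randbytes(1 << 16)  # 64 KiB common payload
file_a = b"A-hdr" + shared + b"trailer-A"      # different-length headers:
file_b = b"header-B" + shared + b"trailer-B"   # fixed-size chunking would fail
hashes_a = {hashlib.sha256(c).hexdigest() for c in cdc_chunks(file_a)}
hashes_b = {hashlib.sha256(c).hexdigest() for c in cdc_chunks(file_b)}
print(f"chunks shared: {len(hashes_a & hashes_b)} / {len(hashes_a | hashes_b)}")
```

Because cut points depend only on the most recent bytes, the two streams resynchronize shortly after the shared region begins, even though their headers differ in length; this is what lets chunk-level deduplication survive insertions and prefix changes where fixed-size blocking would not.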