ettin-pretraining-data / reddit / reddit_0023-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
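
The file is a gzip-compressed tar archive containing JSONL shards of tokenized Reddit data. A minimal sketch for iterating over its records with Python's standard library follows; the local path and the per-line record schema are assumptions, since neither is documented on this page.

```python
import json
import tarfile

# Local path to the downloaded archive (assumed filename).
ARCHIVE = "reddit_0023-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"

with tarfile.open(ARCHIVE, "r:gz") as tar:
    for member in tar:
        # Skip directories and any non-JSONL entries.
        if not member.isfile() or not member.name.endswith(".jsonl"):
            continue
        shard = tar.extractfile(member)
        for line in shard:
            record = json.loads(line)  # schema not documented here
            # ... process one tokenized chunk per JSONL line ...
```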

Upload reddit/reddit_0023-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
Commit 7456612 (verified)
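
The commit message indicates the file was pushed with huggingface_hub. A hedged sketch of the equivalent upload call is below; the repo namespace ("your-org") is a placeholder, not taken from this page.

```python
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="reddit_0023-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    path_in_repo="reddit/reddit_0023-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    repo_id="your-org/ettin-pretraining-data",  # namespace is a placeholder
    repo_type="dataset",
)
```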
- SHA256: 3283e49cdd7d4fb4ba53537de6e8ef2e9846e5e02e0e4491e73a233aa8cfe390
- Pointer size: 135 Bytes
- Size of remote file: 1.97 GB
- Xet backed hash: 14d7cd748c7535f422073a8c5ccb9ef92b3fa9a73a599b4b1071aab5ce24a17e
Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, accelerating uploads and downloads.
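
To illustrate the chunking idea (not Xet's actual algorithm or parameters), here is a toy content-defined chunker using a Gear rolling hash: chunk boundaries depend on the content itself rather than fixed offsets, so unchanged regions of a re-uploaded file hash to the same chunks and need not be stored or transferred again.

```python
import hashlib
import random

random.seed(0)
GEAR = [random.getrandbits(32) for _ in range(256)]  # per-byte random values
MASK = (1 << 13) - 1             # boundary when low 13 bits are zero: ~8 KiB avg
MIN_CHUNK, MAX_CHUNK = 2048, 65536

def chunks(data: bytes):
    """Yield content-defined chunks of `data`."""
    start, h = 0, 0
    for i, b in enumerate(data):
        # Gear hash: the left shift ages old bytes out of the 32-bit register,
        # giving an implicit ~32-byte rolling window.
        h = ((h << 1) + GEAR[b]) & 0xFFFFFFFF
        size = i - start + 1
        if (size >= MIN_CHUNK and (h & MASK) == 0) or size >= MAX_CHUNK:
            yield data[start : i + 1]
            start, h = i + 1, 0
    if start < len(data):
        yield data[start:]

def store(data: bytes):
    """Toy content-addressed store: unique chunks keyed by SHA256."""
    cas, manifest = {}, []
    for c in chunks(data):
        digest = hashlib.sha256(c).hexdigest()
        cas.setdefault(digest, c)
        manifest.append(digest)
    return cas, manifest
```

Editing a few bytes in a file only changes the chunks containing the edit; every other chunk keeps its hash and deduplicates against the existing store.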