ettin-pretraining-data / reddit / reddit_0005-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz

Upload reddit/reddit_0005-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
Commit efa2885 (verified)
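
A file like this is typically pushed with the `huggingface_hub` Python client, as the commit message suggests. A minimal sketch, assuming the dataset repo lives under a placeholder `your-org` namespace and the archive is in the current working directory:

```python
from huggingface_hub import HfApi

api = HfApi()  # uses the token from `huggingface-cli login` by default

# Upload the local archive into the reddit/ folder of the dataset repo.
# The "your-org" namespace is a placeholder, not taken from this page.
api.upload_file(
    path_or_fileobj="reddit_0005-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    path_in_repo="reddit/reddit_0005-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    repo_id="your-org/ettin-pretraining-data",
    repo_type="dataset",
    commit_message="Upload reddit/reddit_0005-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub",
)
```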
- SHA256: f81b144ce532b99ee4a3e6095845b45e3eb730618d75bc7b8a86a49738135b0a
- Pointer size: 135 Bytes
- Size of remote file: 2.03 GB
- Xet backed hash: 9e6d23b3180608b781d39fed1bfcc08736fadfb55bbd63812db2db63d5c3c053
Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
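
To fetch the archive and check it against the SHA256 listed above, a sketch like the following should work (again assuming a placeholder `your-org` namespace for the dataset repo):

```python
import hashlib
from huggingface_hub import hf_hub_download

# Download the archive from the dataset repo (namespace is a placeholder).
local_path = hf_hub_download(
    repo_id="your-org/ettin-pretraining-data",
    filename="reddit/reddit_0005-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    repo_type="dataset",
)

# Stream the ~2.03 GB file in 1 MiB chunks and compare against the published SHA256.
expected = "f81b144ce532b99ee4a3e6095845b45e3eb730618d75bc7b8a86a49738135b0a"
sha = hashlib.sha256()
with open(local_path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        sha.update(chunk)
assert sha.hexdigest() == expected, "checksum mismatch"
print("SHA256 verified:", local_path)
```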