ettin-pretraining-data / reddit / reddit_0021-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz

Upload reddit/reddit_0021-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub (commit ac52c6b, verified)
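
Since the archive is too large to fetch comfortably through a browser, it is usually downloaded programmatically. Below is a minimal sketch using `hf_hub_download` from `huggingface_hub`; the repository namespace is not shown on this page, so the `<namespace>` placeholder is an assumption you must replace with the dataset's actual owner.

```python
# Hedged sketch: fetch this archive with huggingface_hub.
# "<namespace>" is a placeholder (the repo owner is not shown on this page).
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="<namespace>/ettin-pretraining-data",  # assumption: replace with the real namespace
    repo_type="dataset",
    filename="reddit/reddit_0021-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
)
print(local_path)  # path to the cached local copy
```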
- SHA256: af8875cc9af3c7f4db13cca0988327d161c28585c3abf4bf7539aea358616552
- Pointer size: 135 Bytes
- Size of remote file: 1.95 GB
- Xet backed hash: 8aacb17ca535324b420ecd0b90be747cc60efec0cf6729cd0805e7ba127333e6
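
After downloading, the local copy can be checked against the SHA256 listed above. The sketch below uses Python's standard `hashlib` and assumes `local_path` comes from the download snippet earlier; it streams the file in chunks so the 1.95 GB archive never has to fit in memory.

```python
# Hedged sketch: verify the downloaded archive against the SHA256 shown above.
import hashlib

EXPECTED_SHA256 = "af8875cc9af3c7f4db13cca0988327d161c28585c3abf4bf7539aea358616552"

def sha256_of(path: str, chunk_size: int = 1024 * 1024) -> str:
    """Compute the SHA256 of a file, reading it in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

assert sha256_of(local_path) == EXPECTED_SHA256, "checksum mismatch: re-download the file"
```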
Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
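
To make the chunk-level deduplication idea concrete, here is an illustrative sketch only: Xet's real implementation uses content-defined chunk boundaries and a remote chunk store, whereas this toy version splits a file into fixed 64 KiB chunks and keeps one copy of each unique chunk by hash.

```python
# Illustrative sketch of chunk-level deduplication (not Xet's actual algorithm).
import hashlib

def dedup_chunks(path: str, chunk_size: int = 64 * 1024) -> dict[str, bytes]:
    """Map chunk SHA256 -> chunk bytes, storing each unique chunk only once."""
    store: dict[str, bytes] = {}
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            store.setdefault(hashlib.sha256(chunk).hexdigest(), chunk)
    return store
```

Because only chunks with previously unseen hashes need to be transferred, re-uploading a slightly modified file moves far less data than re-uploading the whole archive, which is the source of the upload and download speedups mentioned above.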