ettin-pretraining-data / reddit / reddit_0007-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz

Upload reddit/reddit_0007-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
Commit: 038a332 (verified)
- SHA256: 1e5627a5b25ca197c4633acc9613d17d0e264f1006042b99916709085c2d90ae
- Pointer size: 135 bytes
- Size of remote file: 1.97 GB
- Xet backed hash: c953799da7b44138775aa2c228939b67d0bb1e32830fb00a848b94fbf006c87f
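The SHA256 above can be checked after download. Here is a minimal sketch using `hf_hub_download` from `huggingface_hub`; the repo namespace (`your-org`) is a placeholder, since this page does not show the owning organization:

```python
# Minimal sketch: fetch this shard and check it against the SHA256 listed above.
# The namespace "your-org" is a placeholder; the page does not show the repo owner.
import hashlib

from huggingface_hub import hf_hub_download

EXPECTED_SHA256 = "1e5627a5b25ca197c4633acc9613d17d0e264f1006042b99916709085c2d90ae"

path = hf_hub_download(
    repo_id="your-org/ettin-pretraining-data",  # placeholder namespace
    filename="reddit/reddit_0007-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    repo_type="dataset",
)

digest = hashlib.sha256()
with open(path, "rb") as f:
    for block in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB blocks
        digest.update(block)

assert digest.hexdigest() == EXPECTED_SHA256, "checksum mismatch"
print(f"downloaded to {path}; SHA256 verified")
```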
Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
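Going by the `.jsonl.tar.gz` naming, the shard is presumably a gzipped tar of JSON-Lines files. Below is a sketch of streaming records without extracting the archive to disk; the inner member layout and record schema are assumptions based only on the filename:

```python
# Stream JSON-Lines records out of the .jsonl.tar.gz without extracting it.
# The assumption that members end in ".jsonl" comes from the archive name,
# not from inspecting the file.
import json
import tarfile

ARCHIVE = "reddit_0007-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"

def iter_records(archive_path):
    with tarfile.open(archive_path, "r:gz") as tar:
        for member in tar:
            if not member.name.endswith(".jsonl"):
                continue
            f = tar.extractfile(member)
            if f is None:  # skip directories and special entries
                continue
            for line in f:
                yield json.loads(line)

# Peek at the first few records to discover the schema.
for i, record in enumerate(iter_records(ARCHIVE)):
    print(sorted(record))
    if i >= 2:
        break
```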