ettin-pretraining-data / reddit /reddit_0002-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
orionweller: Upload reddit/reddit_0002-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub (commit 1c16250, verified)
This file is stored with Xet. It is too big to display, but you can still download it.

Large File Pointer Details (raw pointer file)

SHA256: 907ce14ac7f417d7b442aae9db04a3f26f4f8539683ff58124711436fcdabee2
Pointer size: 135 bytes
Size of remote file: 2.08 GB
Xet-backed hash: 8e13aedfeb387f5a703a7701d606bfab39b00d827724b03917993d7c074ff818
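
Once the archive has been downloaded, the pointer metadata above can be used to sanity-check the local copy. The sketch below assumes the listed SHA256 is the hash of the full remote file (not of the pointer itself) and that the archive already sits in the working directory under its original file name.

```python
import hashlib

# SHA256 copied from the pointer details above; the local file name is an
# assumption (the archive's original name, already downloaded to the CWD).
EXPECTED_SHA256 = "907ce14ac7f417d7b442aae9db04a3f26f4f8539683ff58124711436fcdabee2"
LOCAL_PATH = "reddit_0002-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so the ~2 GB archive never sits in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

if sha256_of_file(LOCAL_PATH) != EXPECTED_SHA256:
    raise ValueError("local copy does not match the SHA256 listed in the pointer details")
```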

Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, accelerating uploads and downloads.
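
As a convenience, here is a minimal sketch of fetching this file with huggingface_hub, which resolves the Xet pointer and downloads the remote content transparently. The repo_id below is an assumption inferred from the path at the top of this page; substitute the actual dataset repository before running.

```python
from huggingface_hub import hf_hub_download

# Download the tokenized reddit shard into the local Hugging Face cache.
local_path = hf_hub_download(
    repo_id="orionweller/ettin-pretraining-data",  # assumed repository name
    repo_type="dataset",
    filename="reddit/reddit_0002-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
)
print(local_path)  # resolved path of the cached 2.08 GB archive
```

hf_hub_download caches the file locally, so repeated calls reuse the cached copy instead of re-downloading the 2.08 GB archive.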