ettin-pretraining-data / mlfoundations-dclm-baseline-1.0-parquet-sampled-v4 /split_10099-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
Committed by orionweller: Upload mlfoundations-dclm-baseline-1.0-parquet-sampled-v4/split_10099-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
Commit 16bb78c (verified)
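
The commit message indicates the file was pushed with huggingface_hub. A minimal sketch of such an upload, assuming the dataset repo id orionweller/ettin-pretraining-data (the repo owner is not shown on this page) and a local file of the same name:

```python
# Sketch of an upload like the one described in the commit message above.
# Assumptions: repo_id "orionweller/ettin-pretraining-data" (owner not shown
# on this page) and a write token available via HF_TOKEN or huggingface-cli login.
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="split_10099-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
    path_in_repo=(
        "mlfoundations-dclm-baseline-1.0-parquet-sampled-v4/"
        "split_10099-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"
    ),
    repo_id="orionweller/ettin-pretraining-data",  # assumed repo id
    repo_type="dataset",
    commit_message=(
        "Upload mlfoundations-dclm-baseline-1.0-parquet-sampled-v4/"
        "split_10099-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz "
        "with huggingface_hub"
    ),
)
```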
This file is stored with Xet. It is too big to display, but you can still download it.
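
A minimal download sketch using huggingface_hub's hf_hub_download, with the same assumed repo id as above:

```python
# Sketch: download the archive from the Hub with huggingface_hub.
# The repo id is an assumption (the owner is not shown on this page).
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="orionweller/ettin-pretraining-data",  # assumed repo id
    repo_type="dataset",
    filename=(
        "mlfoundations-dclm-baseline-1.0-parquet-sampled-v4/"
        "split_10099-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"
    ),
)
print(local_path)
```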

Large File Pointer Details (raw pointer file)

SHA256: 50ad0efb2eda40ff318e237bc408be21fa412f1150e05cba48922b2c94cfd5cd
Pointer size: 133 bytes
Size of remote file: 62.8 MB
Xet-backed hash: 87a26b15334cfe304e77494947741288f33fb1e51d6fa9f50daf21f35eaffbe0
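
Once downloaded, the SHA256 above can be used as an integrity check. A minimal sketch, assuming (as with Git LFS pointers) that the SHA256 refers to the full file contents and that the archive sits in the current directory:

```python
# Sketch: verify the downloaded archive against the pointer's SHA256.
# Assumption: the SHA256 shown on this page is the hash of the full file
# contents, and the archive is in the current working directory.
import hashlib

EXPECTED = "50ad0efb2eda40ff318e237bc408be21fa412f1150e05cba48922b2c94cfd5cd"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

digest = sha256_of("split_10099-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz")
print("OK" if digest == EXPECTED else f"Mismatch: {digest}")
```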

Xet efficiently stores large files inside Git, intelligently splitting them into unique chunks to accelerate uploads and downloads.
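
For intuition, chunk-level deduplication of this kind is typically built on content-defined chunking: a hash over a sliding window picks chunk boundaries, so identical content produces identical chunks regardless of where it sits in a file. The toy sketch below only illustrates that idea; it is not Xet's actual algorithm, hash, or chunk-size parameters.

```python
# Toy content-defined chunking sketch (illustrative only; not Xet's algorithm).
# A hash of a sliding window decides chunk boundaries, so unchanged regions of
# a file keep producing the same chunks even after nearby edits.
import hashlib
import os

WINDOW = 16           # bytes hashed at each position (assumed toy value)
MASK = (1 << 12) - 1  # boundary when hash & MASK == 0 -> ~4 KiB average chunks

def is_boundary(window_bytes: bytes) -> bool:
    digest = hashlib.blake2b(window_bytes, digest_size=4).digest()
    return int.from_bytes(digest, "big") & MASK == 0

def chunk(data: bytes):
    chunks, start = [], 0
    for i in range(WINDOW, len(data)):
        # Real systems use a rolling hash here instead of rehashing each window.
        if is_boundary(data[i - WINDOW:i]):
            chunks.append(data[start:i])
            start = i
    chunks.append(data[start:])
    return chunks

# Demo: prepending a few bytes shifts the data, yet almost all chunks are
# unchanged, so only the edited chunk would need to be stored or transferred.
original = os.urandom(100_000)
edited = b"EDIT" + original
hashes_a = {hashlib.sha256(c).hexdigest() for c in chunk(original)}
hashes_b = {hashlib.sha256(c).hexdigest() for c in chunk(edited)}
print(f"{len(hashes_a & hashes_b)} of {len(hashes_b)} chunks shared after the edit")
```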