ettin-pretraining-data / mlfoundations-dclm-baseline-1.0-parquet-sampled-v4 /split_12312-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
Upload mlfoundations-dclm-baseline-1.0-parquet-sampled-v4/split_12312-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
Commit 3b291c2 (verified) by orionweller
This file is stored with Xet. It is too big to display, but you can still download it.
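Since the commit message indicates the file was uploaded with huggingface_hub, it can be fetched the same way. The sketch below is a minimal example; the repo id "orionweller/ettin-pretraining-data" is an assumption inferred from the uploader and dataset name on this page, so adjust it to the actual repository before use.

```python
# Minimal download sketch using huggingface_hub.
# Assumption: the dataset lives at "orionweller/ettin-pretraining-data";
# replace repo_id with the real repository if it differs.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="orionweller/ettin-pretraining-data",  # assumed repo id
    repo_type="dataset",
    filename=(
        "mlfoundations-dclm-baseline-1.0-parquet-sampled-v4/"
        "split_12312-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"
    ),
)
print(local_path)  # path to the cached local copy
```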

Large File Pointer Details

SHA256: d59fb0c38f64572fd4d7382f178919a42f6ea8753c30b908ebd90ab3c60a9d9f
Pointer size: 133 bytes
Size of remote file: 68 MB
Xet backed hash: 2f8824765ab780d7129165db9a0f3a0249b1f2775d912b653f9087c493919135

Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
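Xet's actual chunking algorithm is not described on this page. The sketch below only illustrates the general idea behind content-defined chunking with deduplication by chunk hash; the rolling-hash boundary rule and all parameters here are illustrative assumptions, not Xet's implementation.

```python
# Generic content-defined chunking sketch (not Xet's algorithm).
# Boundaries are derived from the data itself, so identical regions of two
# files produce identical chunks that only need to be stored once.
import hashlib


def chunk_boundaries(data: bytes, mask: int = 0x0FFF, min_size: int = 2048):
    """Yield chunk end offsets using a simple rolling-value boundary rule."""
    start, rolling = 0, 0
    for i, byte in enumerate(data):
        rolling = ((rolling << 1) + byte) & 0xFFFFFFFF
        if i - start >= min_size and (rolling & mask) == 0:
            yield i + 1
            start, rolling = i + 1, 0
    if start < len(data):
        yield len(data)  # final partial chunk


def dedup_store(data: bytes) -> dict[str, bytes]:
    """Split data into chunks and keep one copy per unique SHA-256."""
    store: dict[str, bytes] = {}
    start = 0
    for end in chunk_boundaries(data):
        chunk = data[start:end]
        store[hashlib.sha256(chunk).hexdigest()] = chunk
        start = end
    return store


if __name__ == "__main__":
    payload = b"hello world " * 10_000
    chunks = dedup_store(payload)
    print(f"{len(payload)} bytes -> {len(chunks)} unique chunks")
```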