ettin-pretraining-data / mlfoundations-dclm-baseline-1.0-parquet-sampled-v3 / split_10047-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
orionweller: Upload mlfoundations-dclm-baseline-1.0-parquet-sampled-v3/split_10047-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub (commit b64b603, verified)
This file is stored with Xet. It is too big to display, but you can still download it.
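Because the file is too large to preview, one way to fetch it is with huggingface_hub, the library named in the commit message above. A minimal sketch, assuming the dataset sits under a namespace not shown on this page (the repo_id below is a placeholder):

```python
# Minimal download sketch using huggingface_hub. The dataset namespace is not
# shown on this page, so the repo_id below is a placeholder to be filled in.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="<namespace>/ettin-pretraining-data",  # placeholder namespace
    filename=(
        "mlfoundations-dclm-baseline-1.0-parquet-sampled-v3/"
        "split_10047-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz"
    ),
    repo_type="dataset",
)
print(local_path)  # path to the ~75.5 MB archive in the local cache
```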

Large File Pointer Details

SHA256: 3f529c982fb7ce894b4459c8de1090588bd2dea301c8ee81c221d214f429d58a
Pointer size: 133 bytes
Size of remote file: 75.5 MB
Xet backed hash: 7978074ff67eae84bf892be37488b0bd7f0544cc062390bb534a477d4088762f
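
After downloading, the listed SHA256 can be checked locally. A short sketch using only the standard library, assuming the checksum covers the full remote file (as in a Git LFS-style pointer) and that local_path comes from the download sketch above:

```python
# Recompute the downloaded file's SHA256 in streaming fashion and compare it
# against the checksum listed in the pointer details. Assumes local_path from
# the download sketch above.
import hashlib

EXPECTED_SHA256 = "3f529c982fb7ce894b4459c8de1090588bd2dea301c8ee81c221d214f429d58a"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

assert sha256_of(local_path) == EXPECTED_SHA256, "checksum mismatch"
print("checksum OK")
```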

Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
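
The chunk-based deduplication described here can be illustrated with a toy content-defined chunking routine: a rolling fingerprint over recent bytes decides chunk boundaries, so repeated content yields identical, deduplicable chunks. This is a generic sketch of the idea, not Xet's actual algorithm, parameters, or hash:

```python
# Toy content-defined chunking: a Rabin-Karp-style rolling fingerprint over the
# last `window` bytes decides chunk boundaries, so identical regions of data
# produce identical chunks that only need to be stored once. Generic sketch,
# not Xet's actual chunking scheme.
import hashlib

B, MOD = 257, (1 << 31) - 1

def split_into_chunks(data: bytes, window: int = 48, mask: int = 0xFFF,
                      min_size: int = 1024) -> list[bytes]:
    chunks, start, rolling = [], 0, 0
    b_pow = pow(B, window, MOD)
    for i, byte in enumerate(data):
        rolling = (rolling * B + byte) % MOD
        if i >= window:
            rolling = (rolling - data[i - window] * b_pow) % MOD
        if i + 1 - start >= min_size and (rolling & mask) == mask:
            chunks.append(data[start:i + 1])
            start = i + 1
    if start < len(data):
        chunks.append(data[start:])
    return chunks

# A pure-Python per-byte loop is slow, so only chunk the first few MB as a demo.
with open(local_path, "rb") as f:  # local_path from the download sketch above
    sample = f.read(8 * 1024 * 1024)
chunks = split_into_chunks(sample)
unique = {hashlib.sha256(c).hexdigest() for c in chunks}
print(f"{len(chunks)} chunks, {len(unique)} unique")
```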