ettin-pretraining-data / reddit / reddit_0020-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
Uploaded by orionweller: "Upload reddit/reddit_0020-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub" (commit 2f434eb, verified)
This file is stored with Xet. It is too big to display, but you can still download it.
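For example, the archive can be fetched with the official huggingface_hub client. The repo_id below is an assumption, since this page shows only the repository name and not its namespace; substitute the real one.

    # Minimal download sketch using huggingface_hub.
    from huggingface_hub import hf_hub_download

    path = hf_hub_download(
        repo_id="<namespace>/ettin-pretraining-data",  # assumed repo_id; adjust
        filename="reddit/reddit_0020-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
        repo_type="dataset",
    )
    print(path)  # local cache path of the downloaded archive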

Large File Pointer Details (raw pointer file)

SHA256: 8eee17e8318ca046ccc9fc02a5b8596e6dff6411f7271688637c306f26cf51e3
Pointer size: 135 bytes
Size of remote file: 1.96 GB
Xet-backed hash: 41ef68cf015aafb662733bfa7e236c261a8fc25d9cd8aa68df5d3edc74cdb8cc
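Assuming the SHA256 above is the digest of the full remote file, as with Git LFS pointers, a minimal streaming check of the download might look like this; reading in 1 MiB blocks keeps the 1.96 GB archive out of memory.

    # Streaming SHA256 verification of the downloaded archive.
    import hashlib

    EXPECTED = "8eee17e8318ca046ccc9fc02a5b8596e6dff6411f7271688637c306f26cf51e3"

    h = hashlib.sha256()
    with open(path, "rb") as f:  # `path` from the download sketch above
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    assert h.hexdigest() == EXPECTED, "digest mismatch: incomplete or corrupt download"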

Xet efficiently stores large files inside Git by splitting them into unique chunks, deduplicating shared content and accelerating uploads and downloads.
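As a rough illustration of the idea (not Xet's actual chunker), here is a content-defined chunking sketch: a boundary is placed wherever a rolling hash of the most recent bytes matches a bitmask, so identical regions produce identical chunks even when shifted, and duplicates can be stored once by digest.

    # Illustrative content-defined chunking sketch; all sizes and the
    # rolling hash are toy assumptions, not Xet's actual parameters.
    import hashlib
    import random

    MIN_SIZE = 64          # assumed minimum chunk size
    MASK = (1 << 13) - 1   # 13-bit mask -> roughly 8 KiB average chunks

    def chunks(data: bytes):
        h, start = 0, 0
        for i, b in enumerate(data):
            # Toy rolling hash: mod 2**32, each byte's contribution shifts
            # out after 32 steps, so h depends only on the last 32 bytes.
            h = ((h << 1) + b) & 0xFFFFFFFF
            if i - start >= MIN_SIZE and (h & MASK) == 0:
                yield data[start:i + 1]
                start = i + 1
        if start < len(data):
            yield data[start:]

    random.seed(0)
    blob = bytes(random.getrandbits(8) for _ in range(200_000))
    data = blob + b"a few inserted bytes" + blob  # same content at two offsets

    store = {}   # digest -> chunk: each unique chunk is stored once
    total = 0
    for c in chunks(data):
        total += 1
        store.setdefault(hashlib.sha256(c).hexdigest(), c)
    print(f"{total} chunks, {len(store)} unique")  # far fewer unique than total

Because boundaries depend only on nearby content, inserting a few bytes changes only the chunks around the edit; later chunks realign and deduplicate, which is what makes chunk-level storage and transfer efficient.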