ettin-pretraining-data / reddit / reddit_0021-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz
orionweller
Upload reddit/reddit_0021-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz with huggingface_hub
ac52c6b (verified)
This file is stored with Xet. It is too big to display, but you can still download it.
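To fetch the file programmatically, a minimal sketch using huggingface_hub's `hf_hub_download` is shown below. The `repo_id` is an assumption: this page does not show the repository namespace, so substitute the actual one.

```python
from huggingface_hub import hf_hub_download

# Download the shard to the local HF cache and return its path.
local_path = hf_hub_download(
    repo_id="jhu-clsp/ettin-pretraining-data",  # assumed namespace; adjust as needed
    repo_type="dataset",
    filename="reddit/reddit_0021-tokenized-chunked-1024-512-128-backfill-nodups.jsonl.tar.gz",
)
print(local_path)
```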

Large File Pointer Details (raw pointer file)

SHA256: af8875cc9af3c7f4db13cca0988327d161c28585c3abf4bf7539aea358616552
Pointer size: 135 bytes
Size of remote file: 1.95 GB
Xet-backed hash: 8aacb17ca535324b420ecd0b90be747cc60efec0cf6729cd0805e7ba127333e6
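Since the pointer records the SHA256 of the full file content (as with Git LFS pointers), the download can be verified locally. A minimal sketch, assuming `local_path` is the path returned by the download snippet above:

```python
import hashlib

EXPECTED_SHA256 = "af8875cc9af3c7f4db13cca0988327d161c28585c3abf4bf7539aea358616552"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    # Stream in 1 MiB chunks so the ~2 GB file never sits fully in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

assert sha256_of(local_path) == EXPECTED_SHA256, "checksum mismatch"
```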

Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
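As a rough illustration of how chunk-level deduplication of this kind works, here is a toy content-defined chunker. It is only a sketch of the general idea; the window size, boundary mask, and hash below are arbitrary choices for illustration, not Xet's actual algorithm or parameters.

```python
import hashlib

WINDOW = 16               # bytes hashed at each position (illustrative)
MASK = (1 << 13) - 1      # boundary condition -> ~8 KiB average chunks (illustrative)

def chunk(data: bytes) -> list[bytes]:
    # Cut wherever a hash of the trailing window hits the boundary condition.
    # Because cut points depend on content, not offsets, identical regions in
    # two files produce identical chunks even if surrounding bytes shift.
    chunks, start = [], 0
    for i in range(WINDOW, len(data)):
        digest = hashlib.blake2b(data[i - WINDOW:i], digest_size=4).digest()
        if int.from_bytes(digest, "big") & MASK == 0:
            chunks.append(data[start:i])
            start = i
    chunks.append(data[start:])
    return chunks
```

Since identical content yields identical chunks, only unique chunks need to be stored or transferred, which is where the upload and download acceleration comes from.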