size of parquet files

#2 opened by pszemraj

Hello,

Just wanted to reach out and ask if there is a reason the individual parquet files for this dataset are an order of magnitude larger than normal (~2 GB vs. ~200 MB for most datasets). When trying to use this dataset with streaming, my runs have crashed from running out of CPU RAM (>100 GB of usage almost immediately), while other datasets like fineweb-edu and dclm-baseline-parquet stream fine.
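For anyone who wants to check this themselves, here is a minimal sketch of how you might inspect the row-group layout of one of the shards, since row-group size is what typically drives memory use when streaming. The repo id and filename below are placeholders, not the actual paths in this repo:

```python
# Sketch: download one shard and print its parquet row-group sizes.
# Large row groups must be decompressed whole, which can blow up RAM
# for streaming readers.
import pyarrow.parquet as pq
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="user/dataset",                         # placeholder repo id
    filename="data/train-00000-of-00100.parquet",   # placeholder filename
    repo_type="dataset",
)
meta = pq.ParquetFile(path).metadata
print(f"row groups: {meta.num_row_groups}")
for i in range(meta.num_row_groups):
    rg = meta.row_group(i)
    print(f"  group {i}: {rg.num_rows} rows, "
          f"{rg.total_byte_size / 1e6:.1f} MB uncompressed")
```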

Also, this would perhaps be the first use case where @parquet-converter converting datasets already in .parquet back to parquet actually benefits end users, but it seems that a) the file size of the conversions is still quite large and b) it only converted some of the files.


Would it be possible to make an exception and have the parquet converter re-run on this dataset, uploading everything at a "normal" size?
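As a possible workaround in the meantime, a shard could be re-written locally with smaller row groups before streaming. This is just a sketch under the assumption that large row groups are the culprit; the paths and the row-group target are illustrative, not values from this repo:

```python
# Sketch: re-write a large parquet shard with smaller row groups so a
# streaming reader holds less data in memory at a time.
import pyarrow.parquet as pq

table = pq.read_table("train-00000.parquet")   # placeholder input path
pq.write_table(
    table,
    "train-00000-rewritten.parquet",           # placeholder output path
    row_group_size=50_000,                     # assumed target; tune as needed
    compression="zstd",
)
```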
