Dataset cannot be loaded using `load_dataset`

#26
by m-ric - opened

Hi! It looks like the dataset is corrupted, or perhaps it needs custom loading instructions?

I'm trying to run this simple code:

```python
from datasets import load_dataset

dataset = load_dataset("xlangai/aguvis-stage1", split="train")
```

Here's the error I get:

```
Generating train split: 327972 examples [00:07, 44981.48 examples/s]
Traceback (most recent call last):
  File "/home/aymeric/.venv/lib/python3.10/site-packages/datasets/packaged_modules/json/json.py", line 160, in _generate_tables
    df = pandas_read_json(f)
  File "/home/aymeric/.venv/lib/python3.10/site-packages/datasets/packaged_modules/json/json.py", line 38, in pandas_read_json
    return pd.read_json(path_or_buf, **kwargs)
  File "/home/aymeric/.venv/lib/python3.10/site-packages/pandas/io/json/_json.py", line 791, in read_json
    json_reader = JsonReader(
  File "/home/aymeric/.venv/lib/python3.10/site-packages/pandas/io/json/_json.py", line 905, in __init__
    self.data = self._preprocess_data(data)
  File "/home/aymeric/.venv/lib/python3.10/site-packages/pandas/io/json/_json.py", line 917, in _preprocess_data
    data = data.read()
  File "/home/aymeric/.venv/lib/python3.10/site-packages/datasets/utils/file_utils.py", line 827, in read_with_retries
    out = read(*args, **kwargs)
  File "/usr/lib/python3.10/codecs.py", line 322, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/aymeric/.venv/lib/python3.10/site-packages/datasets/builder.py", line 1855, in _prepare_split_single
    for _, table in generator:
  File "/home/aymeric/.venv/lib/python3.10/site-packages/datasets/packaged_modules/json/json.py", line 163, in _generate_tables
    raise e
  File "/home/aymeric/.venv/lib/python3.10/site-packages/datasets/packaged_modules/json/json.py", line 137, in _generate_tables
    pa_table = paj.read_json(
  File "pyarrow/_json.pyx", line 342, in pyarrow._json.read_json
  File "pyarrow/error.pxi", line 155, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 92, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: JSON parse error: Invalid value. in row 0

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/aymeric/aguvis.py", line 3, in <module>
    dataset = load_dataset("xlangai/aguvis-stage1", split="train")
  File "/home/aymeric/.venv/lib/python3.10/site-packages/datasets/load.py", line 2084, in load_dataset
    builder_instance.download_and_prepare(
  File "/home/aymeric/.venv/lib/python3.10/site-packages/datasets/builder.py", line 925, in download_and_prepare
    self._download_and_prepare(
  File "/home/aymeric/.venv/lib/python3.10/site-packages/datasets/builder.py", line 1001, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/home/aymeric/.venv/lib/python3.10/site-packages/datasets/builder.py", line 1742, in _prepare_split
    for job_id, done, content in self._prepare_split_single(
  File "/home/aymeric/.venv/lib/python3.10/site-packages/datasets/builder.py", line 1898, in _prepare_split_single
    raise DatasetGenerationError("An error occurred while generating the dataset") from e
```
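For what it's worth, the inner `UnicodeDecodeError` says the very first byte of the file being parsed is `0xff`, which cannot start valid UTF-8 text, so the JSON builder is probably being pointed at a binary file (for example, JPEG files begin with the bytes `FF D8 FF`). A quick way to check is to sniff the leading bytes of the offending file against a few common magic numbers. The helper below is just a diagnostic sketch I'd run locally; the path you pass it is a stand-in for whichever cached file the builder choked on.

```python
# Hypothetical diagnostic: guess a file's real type from its magic bytes.
# The magic-number table covers a few common binary formats only.
MAGIC_NUMBERS = {
    b"\xff\xd8\xff": "JPEG image",
    b"\x89PNG": "PNG image",
    b"PK\x03\x04": "ZIP archive",
    b"\x1f\x8b": "gzip stream",
}

def sniff_file_type(path):
    """Return a best-effort file-type guess based on the leading bytes."""
    with open(path, "rb") as f:
        head = f.read(8)
    for magic, name in MAGIC_NUMBERS.items():
        if head.startswith(magic):
            return name
    return "unknown (first bytes: %r)" % (head[:4],)
```

If this reports an image or archive for a file the builder treated as JSON, the repo's file layout (rather than the data itself) is likely what needs fixing, e.g. via explicit `data_files` patterns that exclude the binary files.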
