WIATS: Weather Intervention-Aware Time Series Benchmark
The dataset is organized into the following structure:
```
|-- subdataset1
|   |-- raw_data                      # Original data files
|   |-- time_series                   # Rule-based imputed data files
|   |   |-- id_1.parquet              # Time series for one subject (may be multivariate; csv, parquet, etc.)
|   |   |-- id_2.parquet
|   |   |-- ...
|   |   |-- id_info.json              # Metadata for each subject
|   |-- weather
|   |   |-- location_1
|   |   |   |-- raw_data
|   |   |   |   |-- daily_weather_raw_????.json
|   |   |   |   |-- ...
|   |   |   |   |-- daily_weather_????.csv
|   |   |   |   |-- ...
|   |   |   |   |-- hourly_weather_????.csv
|   |   |   |   |-- ...
|   |   |   |-- weather_report        # Can be flattened; extract the version with a regex
|   |   |   |   |-- version_1
|   |   |   |   |   |-- weather_report_????.json
|   |   |   |   |   |-- ...
|   |   |   |   |-- version_2
|   |   |   |   |-- ...
|   |   |   |-- report_embedding      # Embeddings for the weather reports
|   |   |   |   |-- version_1
|   |   |   |   |   |-- report_embedding_????.pkl
|   |   |   |   |   |-- ...
|   |   |   |   |-- version_2
|   |   |   |   |-- ...
|   |   |-- location_2
|   |   |-- ...
|   |   |-- merged_report_embedding   # Merged embeddings across the required locations (optional)
|   |   |   |-- version_1
|   |   |   |   |-- report_embedding_????.pkl
|   |   |   |   |-- ...
|   |   |   |-- version_2
|   |   |   |-- ...
|   |-- scripts                       # Scripts for data processing, model training, and evaluation
|   |-- id_info.json                  # Metadata for the whole dataset, before preprocessing
|   |-- static_info.json              # Static information: dataset description, channel information, downtime reasons
|   |-- static_info_embeddings.pkl
|   |-- slim_data                     # (optional)
|   |-- full_data                     # (optional) Intermediate data produced during processing
|-- subdataset2
|-- ...
```
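As a minimal sketch of navigating this layout, the helpers below resolve and load one subject's parquet file under a sub-dataset root. The root path and subject id are placeholders; actual ids come from `time_series/id_info.json`.

```python
from pathlib import Path

import pandas as pd


def subject_series_path(root: str, subject_id: str) -> Path:
    """Resolve the time-series file for one subject under a sub-dataset root."""
    return Path(root) / "time_series" / f"{subject_id}.parquet"


def load_subject(root: str, subject_id: str) -> pd.DataFrame:
    """Load a subject's (possibly multivariate) series as a DataFrame."""
    return pd.read_parquet(subject_series_path(root, subject_id))
```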
The `id_info.json` file contains per-subject metadata extracted from the raw dataset. The structure is as follows:
```
{
  "id_1": {
    "len": 1000,                # Length of the time series data
    "sensor_downtime": {
      "1": {
        "time": ["yyyy-mm-dd hh:mm:ss", "yyyy-mm-dd hh:mm:ss"],
        "index": [start_index, end_index]
      },
      "2": {
        "time": ["yyyy-mm-dd hh:mm:ss", "yyyy-mm-dd hh:mm:ss"],
        "index": [start_index, end_index]
      },
      ...
    },
    "other_info_1": "value_1",  # Other customizable entries about the subject
    "other_info_2": "value_2",
    ...
  },
  "id_2": ...
}
```
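One common use of this metadata is masking out downtime spans before training or evaluation. The sketch below builds a boolean mask from the parsed `id_info.json`; it assumes the `index` pair is inclusive on both ends, which should be checked against the actual data.

```python
import numpy as np


def downtime_mask(info: dict, subject_id: str) -> np.ndarray:
    """Boolean mask over a subject's series, True where a sensor was down.

    `info` is the parsed id_info.json; field names follow the schema above.
    """
    entry = info[subject_id]
    mask = np.zeros(entry["len"], dtype=bool)
    for event in entry["sensor_downtime"].values():
        start, end = event["index"]
        mask[start : end + 1] = True  # assumption: inclusive end index
    return mask


# Hypothetical example mirroring the schema above.
info = {
    "id_1": {
        "len": 10,
        "sensor_downtime": {
            "1": {
                "time": ["2020-01-01 00:00:00", "2020-01-01 03:00:00"],
                "index": [2, 5],
            },
        },
    }
}
print(downtime_mask(info, "id_1").sum())  # 4 steps flagged as downtime
```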
The `static_info.json` file contains static information for the whole dataset. The structure is as follows:

```
{
  "general_info": "description of the dataset",
  "downtime_prompt": "",
  "channel_info": {
    "id_1": "id_1 is xxx located in xxx",
    "id_2": "id_2 is xxx located in xxx",
    ...
  }
}
```
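A typical consumer of this file composes a text description per channel (for example, as input to an embedding model like the one behind `static_info_embeddings.pkl`). The sketch below is one way to do that; the prompt layout itself is an assumption, only the field names come from the schema above.

```python
def channel_prompt(static_info: dict, channel_id: str) -> str:
    """Compose a text description for one channel from static_info.json.

    Field names follow the schema above; the two-line layout is hypothetical.
    """
    return "\n".join([
        static_info["general_info"],
        static_info["channel_info"][channel_id],
    ])


# Hypothetical example mirroring the schema above.
static_info = {
    "general_info": "description of the dataset",
    "downtime_prompt": "",
    "channel_info": {"id_1": "id_1 is xxx located in xxx"},
}
print(channel_prompt(static_info, "id_1"))
```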