Arkastone Test Vectors

This dataset contains structured, versioned, and reproducible test vectors used in the Arkastone simulation framework for validating various encoder and decoder modules in a 5G-like communication stack.

The test vectors are grouped by subsystem (coding, tx, and rx) and organized for clarity, reproducibility, and CI/CD integration.


📁 Folder Structure

  • coding/ – Reference encoder data, matrices, and metadata (e.g., Polar, NR5G)
  • rx/ – Test inputs and expected outputs for decoder validation (SC, SCF, etc.)
  • tx/ – Encoder test vectors for LTE-like tail-biting convolutional coding (TBCC)

✅ Highlights

  • Files follow the naming convention:
    ppile_<scheme>_n<N>_k<K>_[extra]_Q<quant>.in/.out
    (a parsing sketch is shown after this list)
  • All .in files are inputs to the encoder/decoder modules
  • All .out files contain validated reference output (bit-level)
  • Large metadata is compressed (.tar) and grouped under polar_nr5g_wrapper
  • Quantization schemes follow Q0 fixed-point rules (unless noted otherwise)
  • README files in each directory provide more detailed usage notes
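
For scripted use, the naming convention can be parsed into its fields with a small helper. The sketch below is illustrative only and assumes the pattern above; the regex, field names, and example filename are not part of the dataset itself.

```python
import re
from pathlib import Path

# Hypothetical parser for ppile_<scheme>_n<N>_k<K>_[extra]_Q<quant>.in/.out filenames.
VECTOR_RE = re.compile(
    r"ppile_(?P<scheme>[a-z0-9]+)"   # coding scheme, e.g. polar, tbcc
    r"_n(?P<n>\d+)_k(?P<k>\d+)"      # block length N and message length K
    r"(?:_(?P<extra>[^.]*?))?"       # optional extra qualifiers
    r"_Q(?P<quant>\d+)"              # quantization scheme (Q0 unless noted)
    r"\.(?P<kind>in|out)$"
)

def parse_vector_name(path):
    """Return the fields encoded in a test-vector filename, or None if it doesn't match."""
    m = VECTOR_RE.match(Path(path).name)
    if m is None:
        return None
    fields = m.groupdict()
    for key in ("n", "k", "quant"):
        fields[key] = int(fields[key])
    return fields

# Example with a made-up filename that follows the convention:
print(parse_vector_name("ppile_polar_n1024_k512_crc11_Q0.in"))
# {'scheme': 'polar', 'n': 1024, 'k': 512, 'extra': 'crc11', 'quant': 0, 'kind': 'in'}
```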

💡 Design Principles

  • Avoid deep nesting for visibility and accessibility
  • Ensure reproducibility across CI and local testing
  • Separate heavy test vectors from source code repositories
  • Enable scalable testing without Git LFS constraints
  • Hosted externally for easy integration and public collaboration

📜 License

All test vectors and associated metadata are released under the MIT License unless stated otherwise.


🧠 Notes

  • This structure is optimized for Hugging Face Datasets and integrates cleanly with Python-based simulations.
  • You may reference these vectors from simulation scripts or GitHub Actions via a downloader script and manifest (a minimal download sketch follows below).
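
As a starting point for such a downloader, here is a minimal sketch using huggingface_hub. The repo id, file paths, and bit-file layout are placeholders and assumptions for illustration; substitute the actual repository identifier and wire in your own decoder.

```python
from huggingface_hub import hf_hub_download

# Placeholder identifiers; replace with the actual dataset repo and vector paths.
REPO_ID = "user/arkastone-test-vectors"        # hypothetical repo id
IN_FILE = "rx/ppile_polar_n1024_k512_Q0.in"    # hypothetical stimulus file
OUT_FILE = "rx/ppile_polar_n1024_k512_Q0.out"  # hypothetical reference file

def fetch(filename):
    """Download (and locally cache) one file from the dataset repo on the Hub."""
    return hf_hub_download(repo_id=REPO_ID, filename=filename, repo_type="dataset")

def load_bits(path):
    """Read a whitespace-separated bit file into a list of ints (assumed layout)."""
    with open(path) as f:
        return [int(tok) for tok in f.read().split()]

if __name__ == "__main__":
    stimulus = load_bits(fetch(IN_FILE))    # decoder input vector
    reference = load_bits(fetch(OUT_FILE))  # validated bit-level reference output
    # decoded = run_decoder(stimulus)       # plug in your decoder here
    # assert decoded == reference, "bit-level mismatch against reference vector"
    print(f"{len(stimulus)} input bits, {len(reference)} reference bits")
```

A simple manifest (e.g., a JSON or CSV list of required filenames per test suite) pairs naturally with this: CI jobs read the manifest and fetch only the vectors they need.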

Maintained by Furkan Ercan
