---
title: "Fall Prediction Dataset for Humanoid Robots"
datasets:
- naos-fall-prediction
tags:
- humanoid-robotics
- fall-prediction
- machine-learning
- sensor-data
- robotics
- temporal-convolutional-networks
license:
- apache-2.0
---
# Fall Prediction Dataset for Humanoid Robots
## Dataset Summary
This dataset consists of **37.9 hours of real-world sensor data** collected from **20 Nao humanoid robots** over the course of one year in various test environments, including RoboCup soccer matches. The dataset includes **18.3 hours of walking data**, featuring **2519 falls**. It captures a wide range of activities such as omni-directional walking, collisions, standing up, and falls on various surfaces like artificial turf and carpets.
The dataset is primarily designed to support the development and evaluation of fall prediction algorithms for humanoid robots. It includes data from multiple sensors, such as gyroscopes, accelerometers, and force-sensing resistors (FSR), recorded at a high frequency to track robot movements and falls with precision.
Using this dataset, the **RePro-TCN model** was developed, which outperforms existing fall prediction methods under real-world conditions. This model leverages **temporal convolutional networks (TCNs)** and incorporates advanced training techniques like **progressive forecasting** and **relaxed loss formulations**.
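For orientation, the sketch below shows a generic causal, dilated 1-D convolution block of the kind TCNs are typically built from. It is an illustration only and does not reproduce the RePro-TCN architecture, progressive forecasting, or the relaxed loss formulation.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConvBlock(nn.Module):
    """Generic TCN building block: dilated causal 1-D convolution with a residual connection."""

    def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 1):
        super().__init__()
        # Left-pad so the output at time t depends only on inputs up to time t (causality).
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                      # x: (batch, channels, time)
        y = self.conv(F.pad(x, (self.pad, 0)))
        return torch.relu(y + x)               # residual connection

# Stacking blocks with exponentially growing dilation enlarges the receptive field.
tcn = nn.Sequential(*[CausalConvBlock(channels=8, dilation=2 ** i) for i in range(4)])
out = tcn(torch.randn(1, 8, 100))              # e.g. 8 sensor channels, 100 time steps
```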
## Dataset Structure
- **Duration**: 37.9 hours total, 18.3 hours of walking
- **Falls**: 2519 falls during walking scenarios
- **Data Types**: Gyroscope (roll, pitch), accelerometer (x, y, z), body angle, and force-sensing resistors (FSR) per foot.
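As a rough sketch of how such columns can be loaded and selected with pandas (the column names and file path below are assumptions for illustration; the authoritative names are defined in the CSV files and the example scripts):
```python
import pandas as pd

# Column names below are illustrative placeholders; check the CSV headers and the
# provided example scripts for the actual names used in this dataset.
sensor_columns = [
    "gyro_roll", "gyro_pitch",        # gyroscope
    "acc_x", "acc_y", "acc_z",        # accelerometer
    "body_angle_x", "body_angle_y",   # body angle
    "fsr_left", "fsr_right",          # force-sensing resistors (per foot)
]

df = pd.read_csv("path/to/dataset.csv")   # adjust to the actual CSV location
sensors = df[sensor_columns]              # keep only the relevant sensor columns
print(sensors.describe())
```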
## Use Cases
- Humanoid robot fall prediction and prevention
- Robot control algorithm benchmarking
- Temporal sequence modeling in robotics
## Licensing
This dataset is shared under the **apache-2.0** license, which permits use, modification, and redistribution provided the license and attribution notices are preserved.
## Citation
If you use this dataset in your research, please cite it as follows:
"A Large-Scale Dataset for Humanoid Robotics Enabling a Novel Data-Driven Fall Prediction"
Oliver Urbann, Julian Eßer, Diana Kleingarn, Arne Moos, Dominik Brämer, Piet Brömmel, Nicolas Bach, Christian Jestel, Aaron Larisch, Alice Kirchheim,
2025 IEEE International Conference on Robotics and Automation (ICRA)
## How to Use the Dataset
To get started with the **Fall Prediction Dataset for Humanoid Robots**, follow the steps below:
### 0. Clone the repository
Please make sure that Git Large File Storage (git-lfs) is installed before cloning this repository.
### 1. Set Up a Virtual Environment
It's recommended to create a virtual environment to isolate dependencies. You can do this with the following command:
```bash
python -m venv .venv
```
After creating the virtual environment, activate it:
- On **Windows**:
```bash
.venv\Scripts\activate
```
- On **macOS/Linux**:
```bash
source .venv/bin/activate
```
### 2. Install Dependencies
Once the virtual environment is active, install the necessary packages by running:
```bash
pip install -r requirements.txt
```
If you have trouble downloading the requirements, check your internet connection. Alternatively, try increasing the pip timeout or upgrading your pip installation:
```bash
# Increase the timeout to 120 seconds
pip install --default-timeout=120 -r requirements.txt
# or upgrade pip
python -m pip install --upgrade pip
```
### 3. Run the Example Script
To load and use the plain CSV dataset for training a simple LSTM model, run the `plain_dataset_usage_example.py` script (RAM utilisation exceeds 16 GB):
```bash
python plain_dataset_usage_example.py
```
This script demonstrates how to:
- Load the dataset
- Select the relevant sensor columns
- Split the data into training and test sets
- Train a basic LSTM model to predict falls
- Evaluate the model on the test set
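The following is a minimal, self-contained sketch of that kind of train/evaluate loop in PyTorch. The tensors are random stand-ins, and the window length, channel count, and label handling are assumptions rather than the pipeline actually implemented in `plain_dataset_usage_example.py`.
```python
import torch
import torch.nn as nn

# Placeholder data: in practice, windows of sensor readings and binary fall labels
# come from the CSV files (see plain_dataset_usage_example.py for the real pipeline).
X = torch.randn(1000, 50, 9)                  # (samples, time steps, sensor channels)
y = torch.randint(0, 2, (1000, 1)).float()    # 1 = fall within the prediction horizon
split = 800
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

class FallLSTM(nn.Module):
    def __init__(self, n_features: int = 9, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        _, (h, _) = self.lstm(x)               # use the final hidden state
        return self.head(h[-1])                # logit for an upcoming fall

model = FallLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(5):                         # full-batch training, illustration only
    opt.zero_grad()
    loss = loss_fn(model(X_train), y_train)
    loss.backward()
    opt.step()

with torch.no_grad():
    preds = (torch.sigmoid(model(X_test)) > 0.5).float()
    print("test accuracy:", (preds == y_test).float().mean().item())
```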
To load and use an already prepared version of the dataset with reduced RAM utilisation for training a simple LSTM model, run the `lightweight_dataset_usage_example.py` script (RAM utilisation stays below 2 GB):
```bash
python lightweight_dataset_usage_example.py
```
This script demonstrates how to:
- Convert the CSV dataset into a memory-mapped file
- Load the memory-mapped version of the dataset
- Train a basic LSTM model to predict falls
- Evaluate the model on the test set
The script `convert_and_load_dataset.py`, used by the lightweight example, demonstrates how to:
- Select the relevant sensor columns
- Split the data into training and test sets
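As a rough illustration of the memory-mapping idea (the paths, chunk size, and dtype below are assumptions; the real conversion logic lives in `convert_and_load_dataset.py`):
```python
import numpy as np
import pandas as pd

# Illustrative sketch only: stream the CSV in chunks into a flat binary file that
# can later be memory-mapped, so training never holds the full dataset in RAM.
# Assumes all selected columns are numeric and castable to float32.
csv_path, bin_path = "path/to/dataset.csv", "dataset.dat"

n_rows, n_cols = 0, None
with open(bin_path, "wb") as f:
    for chunk in pd.read_csv(csv_path, chunksize=100_000):
        arr = chunk.to_numpy(dtype=np.float32)
        n_rows += arr.shape[0]
        n_cols = arr.shape[1]
        arr.tofile(f)

# Reopen the converted file as a memory map: slices are read from disk on demand.
data = np.memmap(bin_path, dtype=np.float32, mode="r", shape=(n_rows, n_cols))
first_batch = np.asarray(data[:1024])          # only this slice is loaded into memory
```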
Make sure to check the scripts and adjust the dataset paths if necessary. For further details, see the comments and docstrings within the scripts.
---
# Dataset Collaboration Initiative
## Introduction
To enhance research, development, and collaboration in the field of humanoid robotics, we are launching a major initiative to unify and integrate datasets from various humanoid robots. This endeavour is intended to foster innovation, improve accessibility, and support the wider robotics community.
## Objectives
- **Dataset Unification**
We aim to create a comprehensive and standardized collection of humanoid robot datasets, enabling researchers and developers to work with diverse data in a more cohesive manner.
- **Tools & Guidelines**
To simplify and optimize the utilization of this dataset repository, we will provide essential tools and guidelines, ensuring a seamless experience for contributors and users alike.
- **Open & Inclusive Contribution**
All contributions of datasets that meet the minimum requirements are welcome. We encourage participation from a wide range of robotics projects to foster a richer, more diverse collection of datasets.
## Join Us!
If you are working with humanoid robots and have datasets to share, we invite you to contribute to this initiative. Together, we can build a valuable resource that propels advancements in humanoid robotics.
If you are interested in contributing to this project, please get in touch via the community tab discussions or by writing to oliver dot urbann at tu-dortmund.de.
Stay tuned: further details, tools, and submission guidelines will be made available at the [HumanoidDataset Repository](https://github.com/NaoDevils/HumanoidDataset).