Datasets:
Update README.md
README.md CHANGED

@@ -141,6 +141,22 @@ Each UI element in the hierarchy includes:
- Functional classification (e.g., "login" for text buttons, "cart" for icons)
- Original properties (bounds, class, resource-id, etc.)
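
As a concrete illustration, a single annotated hierarchy entry might look roughly like the sketch below; the key names and values are hypothetical examples rather than verbatim contents of the dataset files.

```python
# Hypothetical hierarchy entry (illustrative only; exact keys and values may differ).
element = {
    "class": "android.widget.Button",                   # original Android class
    "resource-id": "com.example.shop:id/login_button",  # original resource-id
    "bounds": [42, 880, 1038, 1012],                     # original bounds
    "clickable": True,
    "functional_classification": "login",                # semantic label added by this dataset
}
```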

## Rico FiftyOne Dataset Structure

**Core Fields:**

- `metadata`: EmbeddedDocumentField - Image properties (size, dimensions)
- `ui_vector`: ListField(FloatField) - UI embedding representation
- `ui_viz`: ListField(FloatField) - Visualization parameters
- `detections`: EmbeddedDocumentField(Detections) containing multiple Detection objects:
  - `label`: UI element type (Icon, Text, Image, Toolbar, List Item)
  - `bounding_box`: Coordinates [x, y, width, height]
  - `content_or_function`: Text content or function name
  - `clickable`: Boolean indicating interactivity
  - `type`: Android widget type
  - `resource_id`: Android resource identifier

The dataset provides comprehensive annotations of mobile UI elements, detailing their appearance, functionality, and interactive properties for machine learning applications.
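
To make the schema above concrete, the following is a minimal sketch of reading these fields with the FiftyOne Python API. It assumes the dataset has already been imported locally under the hypothetical name `rico`; the field and attribute names follow the Core Fields list above. Note that FiftyOne stores `bounding_box` as relative [x, y, width, height] coordinates with values in [0, 1].

```python
import fiftyone as fo
from fiftyone import ViewField as F

# Assumes the dataset was previously imported under this (hypothetical) name.
dataset = fo.load_dataset("rico")

# Restrict the `detections` field to clickable UI elements.
clickable = dataset.filter_labels("detections", F("clickable") == True)

sample = clickable.first()
print(sample.metadata)        # image properties (size, dimensions)
print(len(sample.ui_vector))  # UI embedding representation
print(len(sample.ui_viz))     # visualization parameters

for det in sample.detections.detections:
    # Per-detection attributes described under "Core Fields" above.
    print(det.label, det.bounding_box, det["content_or_function"],
          det["clickable"], det["type"], det["resource_id"])
```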

## Dataset Creation

### Curation Rationale

The dataset was created to expose the semantic meaning of mobile UI elements: what they represent and how they function. While prior datasets captured visual design, this semantic layer enables a deeper understanding of interface functionality across applications, supporting more advanced design tools and research.