Dataset Card for nyu-depthv2-wds

This is the NYU Depth V2 dataset, converted into the webdataset format from https://huggingface.co/datasets/sayakpaul/nyu_depth_v2/.
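
If you want to consume the shards directly with the `webdataset` library, a minimal sketch is below. It assumes the shards are .tar files with "train" in their names and that the per-sample keys are `jpg` and `depth.npy` (taken from the dataset viewer columns); check the repository's file list and adjust if needed.

```python
# Sketch: stream the training shards straight from the Hub with webdataset.
# Assumptions: shards are .tar files named with "train"; sample keys are
# "jpg" (JPEG bytes) and "depth.npy" (a serialized numpy array).
import io

import numpy as np
import webdataset as wds
from huggingface_hub import list_repo_files
from PIL import Image

repo = "adams-story/nyu-depthv2-wds"
shard_urls = [
    f"https://huggingface.co/datasets/{repo}/resolve/main/{name}"
    for name in list_repo_files(repo, repo_type="dataset")
    if "train" in name and name.endswith(".tar")
]

def decode_sample(sample):
    # Decode the JPEG bytes to a PIL image and the .npy bytes to a numpy array.
    image = Image.open(io.BytesIO(sample["jpg"])).convert("RGB")
    depth = np.load(io.BytesIO(sample["depth.npy"]))
    return image, depth

dataset = wds.WebDataset(shard_urls).map(decode_sample)

for image, depth in dataset:
    print(image.size, depth.shape, depth.dtype)  # e.g. (608, 448) (448, 608) float16
    break
```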

There are 47,584 samples in the training split and 654 samples in the validation split.

I shuffled both the training and validation samples.

I also cropped 16 pixels from all sides of both the RGB image and the depth map, because there is a white border around all of the original images.

This is an example of the border artifacts in the images: [image: sample frame showing the white border]
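
For reference, here is a hypothetical reproduction of that preprocessing step (the exact conversion script is not shown here); it only illustrates the arithmetic, turning a 640×480 NYU frame into 608×448.

```python
# Hypothetical reproduction of the 16-pixel border crop described above.
import numpy as np

def crop_border(arr: np.ndarray, border: int = 16) -> np.ndarray:
    """Drop `border` pixels from all four sides of an HxW or HxWxC array."""
    return arr[border:-border, border:-border]

# A 640x480 NYU Depth V2 frame becomes 608x448 after the crop.
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
depth = np.zeros((480, 640), dtype=np.float16)
print(crop_border(rgb).shape, crop_border(depth).shape)  # (448, 608, 3) (448, 608)
```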

Dataset Details

Dataset Description

Each sample contains a 608×448 JPEG image and a matching depth map, stored as a float16 numpy array of shape (448, 608).
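
Since the Hub can also load webdataset repositories through the `datasets` library, one way to peek at a sample without downloading everything is to stream it. The field names ("jpg", "depth.npy") follow the dataset viewer columns; treat them as assumptions and adjust if needed.

```python
# Sketch: inspect one sample via the `datasets` library in streaming mode.
import numpy as np
from datasets import load_dataset

ds = load_dataset("adams-story/nyu-depthv2-wds", split="train", streaming=True)
sample = next(iter(ds))

image = sample["jpg"]                                      # PIL.Image, 608x448
depth = np.asarray(sample["depth.npy"], dtype=np.float32)  # stored as float16 in the shards
print(image.size, depth.shape, float(depth.min()), float(depth.max()))
```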

Dataset Sources

See https://cs.nyu.edu/~silberman/datasets/nyu_depth_v2.html

Uses

Train a depth prediction model on the train split and evaluate it on the val split!
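
As a starting point, below is a minimal, untested PyTorch sketch of such a loop. The tiny convolutional model is only a stand-in for a real depth network, the field names again follow the dataset viewer columns, and the streaming `datasets` loader is assumed to be compatible with torch's `DataLoader`.

```python
# Sketch: stream the train split into a PyTorch loop with an L1 depth loss.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from datasets import load_dataset
from torch.utils.data import DataLoader
from torchvision import transforms

to_tensor = transforms.ToTensor()

def collate(batch):
    # Each sample is a dict with a PIL image under "jpg" and a depth map under "depth.npy".
    images = torch.stack([to_tensor(s["jpg"]) for s in batch])             # (B, 3, 448, 608)
    depths = torch.stack([
        torch.from_numpy(np.asarray(s["depth.npy"], dtype=np.float32))
        for s in batch
    ]).unsqueeze(1)                                                        # (B, 1, 448, 608)
    return images, depths

train = load_dataset("adams-story/nyu-depthv2-wds", split="train", streaming=True)
loader = DataLoader(train, batch_size=8, collate_fn=collate)

# Stand-in model: swap in a real depth-prediction architecture here.
model = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.Conv2d(32, 1, 3, padding=1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for images, depths in loader:
    loss = F.l1_loss(model(images), depths)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"loss: {loss.item():.4f}")
    break  # one step, for illustration
```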
