---
tags:
- depth-estimation
library_name: coreml
license: apple-ascl
base_model:
  - apple/DepthPro
---

This repo contains [DepthProNormalizedInverseDepth.mlpackage](DepthProNormalizedInverseDepth.mlpackage) (1290 MB).

`Normalized Inverse Depth` means that the model outputs values in $[0, 1]$, where 1 is the pixel closest to the camera and 0 is the pixel furthest from it.
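For intuition, the mapping from metric depth to normalized inverse depth can be sketched as below. This is an illustrative assumption about the normalization (min-max scaling of inverse depth), not the model's internal implementation; the function name and inputs are hypothetical.

```python
# Illustrative sketch: how metric depths map to normalized inverse depth.
# The normalization scheme (min-max over inverse depth) is an assumption.
def normalized_inverse_depth(depths):
    inv = [1.0 / d for d in depths]             # inverse depth: near -> large
    lo, hi = min(inv), max(inv)
    return [(v - lo) / (hi - lo) for v in inv]  # rescale to [0, 1]

# The nearest point (1 m) maps to 1.0; the furthest (10 m) maps to 0.0.
print(normalized_inverse_depth([1.0, 2.0, 10.0]))
```

Note that the inversion makes the output perceptually convenient: bright pixels are near, dark pixels are far.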

# DepthPro CoreML Models

DepthPro is a monocular depth estimation model, meaning it is trained to predict depth from a single image.

[DepthPro paper](https://arxiv.org/pdf/2410.02073)

[DepthPro original repo](https://huggingface.co/apple/DepthPro)

# Model Inputs and Outputs

### Inputs

- `image`: $1536 \times 1536$ 3-channel color image ($[1 \times 3 \times 1536 \times 1536]$ ImageType).

### Outputs

- `normalizedInverseDepth`: $1536 \times 1536$ monochrome image ($[1 \times 1 \times 1536 \times 1536]$ ImageType).
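Interpreting the output is straightforward: since higher values mean closer, the pixel nearest the camera is simply the maximum of the map. A minimal sketch, using a tiny dummy array in place of the real $1536 \times 1536$ output:

```python
# Sketch: interpreting a normalized inverse depth map.
# `output` stands in for the model's 1536x1536 map; a dummy array is used here.
output = [
    [0.10, 0.85, 0.30],
    [0.05, 0.95, 0.40],  # 0.95 -> pixel nearest to the camera
]

# Higher values are closer, so the nearest pixel is the argmax.
nearest = max(
    ((r, c) for r in range(len(output)) for c in range(len(output[0]))),
    key=lambda rc: output[rc[0]][rc[1]],
)
print(nearest)  # (1, 1)
```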

# Download

Install `huggingface-cli`

```bash
brew install huggingface-cli
```

To download:

```bash
huggingface-cli download \
  --local-dir models --local-dir-use-symlinks False \
  coreml-projects/DepthPro-coreml-normalized-inverse-depth \
  --include "DepthProNormalizedInverseDepth.mlpackage/*"
```

To download everything, skip the `--include` argument.

# Conversion Tutorial

The [`huggingface/coreml-examples`](https://github.com/huggingface/coreml-examples/blob/main/tutorials/DepthPro/depth_pro_coreml_guide.ipynb) repository contains sample conversion code for `DepthProNormalizedInverseDepth.mlpackage` and other models.

# Swift Integration

The [`huggingface/coreml-examples`](https://github.com/huggingface/coreml-examples/blob/main/DepthProSample/README.md) repository contains sample Swift code for `DepthProNormalizedInverseDepth.mlpackage` and other models. See [the instructions there](https://github.com/huggingface/coreml-examples/tree/main/DepthProSample) to build the demo app, which shows how to use the model in your own Swift apps.