IGNF/FLAIR-HUB_LPIS-F_utae

🌐 FLAIR-HUB Model Collection

  • Trained on: FLAIR-HUB dataset 🔗
  • Available modalities: Aerial images, SPOT images, Topographic info, Sentinel-2 yearly time-series, Sentinel-1 yearly time-series, Historical aerial images
  • Encoders: ConvNeXTV2, Swin (Tiny, Small, Base, Large)
  • Decoders: UNet, UPerNet
  • Tasks: Land-cover mapping (LC), Crop-type mapping (LPIS)
  • Class nomenclature: 15 classes for LC, 23 classes for LPIS
The collection includes the following models, each addressing one task with a subset of the available modalities (Aerial, Elevation, SPOT, S2 time-series, S1 time-series, Historical aerial):

  • Land-cover (LC) models: LC-A, LC-D, LC-F, LC-G, LC-I, LC-L
  • Crop-type (LPIS) models: LPIS-A, LPIS-F, LPIS-I, LPIS-J

🔍 Model: FLAIR-HUB_LPIS-F_utae

  • Encoder: UTAE
  • Decoder: UTAE
  • Metrics:

| mIoU | O.A. | F-score | Precision | Recall |
|------|------|---------|-----------|--------|
| 21.75% | 85.28% | 28.74% | 28.98% | 31.90% |
  • Params.: 0.9 M

General Information


Training Config Hyperparameters

- Model architecture: UTAE
- Optimizer: AdamW (betas=[0.9, 0.999], weight_decay=0.01)
- Learning rate: 5e-5
- Scheduler: one_cycle_lr (warmup_fraction=0.2)
- Epochs: 150
- Batch size: 5
- Seed: 2025
- Early stopping: patience 20, monitor val_miou (mode=max)
- Class weights:
    - default: 1.0
    - masked classes (clear cut, ligneous, mixed, other): 0.0
- Input channels:
    - SENTINEL2_TS : [1,2,3,4,5,6,7,8,9,10]
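
These settings map onto standard PyTorch components; the sketch below reproduces the optimizer, scheduler and class-weight configuration under that assumption. The stand-in model and the masked-class indices are placeholders for illustration, not the actual FLAIR-HUB training code.

```python
import torch

# Placeholder stand-in for the UTAE model (10 Sentinel-2 bands in, 23 LPIS classes out).
model = torch.nn.Conv2d(10, 23, kernel_size=1)

# AdamW with the betas, weight decay and learning rate listed above.
optimizer = torch.optim.AdamW(
    model.parameters(), lr=5e-5, betas=(0.9, 0.999), weight_decay=0.01
)

# One-cycle schedule with a 20% warmup fraction over 150 epochs.
steps_per_epoch = 152_225 // 5  # train patches / batch size (see Training Data)
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=5e-5, epochs=150, steps_per_epoch=steps_per_epoch, pct_start=0.2
)

# Class weights: 1.0 by default, 0.0 for the masked classes.
class_weights = torch.ones(23)
masked_class_indices = [0, 1, 2, 3]  # hypothetical indices for clear cut / ligneous / mixed / other
class_weights[masked_class_indices] = 0.0
criterion = torch.nn.CrossEntropyLoss(weight=class_weights)
```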

Training Data

- Train patches: 152225
- Validation patches: 38175
- Test patches: 50700
Figure: classes distribution.
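
The patch counts above correspond to roughly a 63% / 16% / 21% split of the 241,100 patches, as this quick check shows:

```python
# Train / validation / test patch counts listed above.
splits = {"train": 152_225, "val": 38_175, "test": 50_700}
total = sum(splits.values())  # 241,100 patches
for name, count in splits.items():
    print(f"{name}: {count} patches ({100 * count / total:.1f}%)")
# train: 63.1%, val: 15.8%, test: 21.0%
```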

Training Logging

Figure: training logging.

Metrics

| Metric | Value |
|--------|-------|
| mIoU | 21.75% |
| Overall Accuracy | 85.28% |
| F-score | 28.74% |
| Precision | 28.98% |
| Recall | 31.90% |
| Class | IoU (%) | F-score (%) | Precision (%) | Recall (%) |
|-------|---------|-------------|----------------|------------|
| grasses | 43.13 | 60.27 | 64.86 | 56.28 |
| wheat | 59.61 | 74.70 | 66.33 | 85.47 |
| barley | 48.82 | 65.61 | 73.01 | 59.57 |
| maize | 68.82 | 81.53 | 73.99 | 90.78 |
| other cereals | 2.60 | 5.08 | 15.86 | 3.02 |
| rice | 0.00 | 0.00 | 0.00 | 0.00 |
| flax/hemp/tobacco | 0.00 | 0.00 | 0.00 | 0.00 |
| sunflower | 27.98 | 43.73 | 48.70 | 39.68 |
| rapeseed | 70.93 | 82.99 | 76.64 | 90.49 |
| other oilseed crops | 0.00 | 0.00 | 0.00 | 0.00 |
| soy | 12.37 | 22.02 | 14.49 | 45.84 |
| other protein crops | 20.86 | 34.52 | 27.86 | 45.35 |
| fodder legumes | 22.85 | 37.20 | 28.70 | 52.83 |
| beetroots | 1.51 | 2.98 | 17.46 | 1.63 |
| potatoes | 0.00 | 0.00 | 0.00 | 0.00 |
| other arable crops | 10.06 | 18.28 | 13.58 | 27.97 |
| vineyard | 24.52 | 39.38 | 37.92 | 40.96 |
| olive groves | 0.00 | 0.00 | 0.00 | 0.00 |
| fruits orchards | 0.00 | 0.00 | 0.00 | 0.00 |
| nut orchards | 0.00 | 0.00 | 0.00 | 0.00 |
| other permanent crops | 0.00 | 0.00 | 0.00 | 0.00 |
| mixed crops | 0.03 | 0.05 | 15.72 | 0.03 |
| background | 86.27 | 92.63 | 91.41 | 93.88 |
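
For reference, the headline mIoU is the unweighted mean of the 23 per-class IoU values above, which can be checked directly:

```python
# Quick sanity check: average the per-class IoU values from the table above.
per_class_iou = [
    43.13, 59.61, 48.82, 68.82, 2.60, 0.00, 0.00, 27.98, 70.93, 0.00, 12.37, 20.86,
    22.85, 1.51, 0.00, 10.06, 24.52, 0.00, 0.00, 0.00, 0.00, 0.03, 86.27,
]
miou = sum(per_class_iou) / len(per_class_iou)
print(f"mIoU = {miou:.2f}%")  # -> mIoU = 21.75%
```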

Inference

Figure: aerial image over the region of interest (AERIAL).

Figure: model inference over the region of interest (INFERENCE).
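
The model consumes a yearly Sentinel-2 time-series patch as a 5-D tensor (batch, time, bands, height, width) together with the acquisition dates. Below is a minimal forward-pass sketch, assuming a PyTorch UTAE implementation such as the one from the original utae-paps repository; the import path, constructor arguments and tensor shapes are illustrative assumptions, not the official FLAIR-HUB inference pipeline.

```python
import torch
from src.backbones.utae import UTAE  # assumed import path from the utae-paps repository

# 10 Sentinel-2 bands in, 23 LPIS classes out (constructor arguments are assumptions).
model = UTAE(input_dim=10, out_conv=[32, 23])
model.eval()

# One yearly Sentinel-2 time series: (batch, time, bands, height, width).
x = torch.randn(1, 40, 10, 128, 128)             # 40 acquisitions is an arbitrary example
batch_positions = torch.arange(40).unsqueeze(0)  # acquisition dates (e.g. day of year)

with torch.no_grad():
    logits = model(x, batch_positions=batch_positions)  # (batch, 23, height, width)
pred = logits.argmax(dim=1)                      # per-pixel crop-type (LPIS) map
```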

Cite

BibTeX:

@article{ign2025flairhub,
  doi = {10.48550/arXiv.2506.07080},
  url = {https://arxiv.org/abs/2506.07080},
  author = {Garioud, Anatol and Giordano, Sébastien and David, Nicolas and Gonthier, Nicolas},
  title = {FLAIR-HUB: Large-scale Multimodal Dataset for Land Cover and Crop Mapping},
  publisher = {arXiv},
  year = {2025}
}

APA:

Garioud, A., Giordano, S., David, N., & Gonthier, N. (2025). FLAIR-HUB: Large-scale multimodal dataset for land cover and crop mapping. arXiv. https://doi.org/10.48550/arXiv.2506.07080