BIFOLD BigEarthNet v2.0
BigEarthNet v2.0 Pretrained Weights
We provide pretrained weights for several model architectures. For each model, the uploaded weights correspond to the checkpoint with the best macro average precision score on the recommended test split. All models were trained with one of three input configurations: i) Sentinel-1 data only (S1), ii) Sentinel-2 data only (S2), or iii) both Sentinel-1 and Sentinel-2 modalities together (S1+S2).
The following bands (in the specified order) were used to train the models with version 0.2.0:
- For models using Sentinel-1 only: Sentinel-1 bands
["VV", "VH"]
- For models using Sentinel-2 only: Sentinel-2 10m bands and 20m bands
["B02", "B03", "B04", "B05", "B06", "B07", "B08", "B8A", "B11", "B12"]
- For models using Sentinel-1 and Sentinel-2: Sentinel-1 bands and Sentinel-2 10m bands and 20m bands
["VV", "VH", "B02", "B03", "B04", "B05", "B06", "B07", "B08", "B8A", "B11", "B12"]
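As a sanity check when assembling model inputs, the channel orders above can be encoded in a small helper. This is a minimal sketch; the function name and modality strings are illustrative and not part of the released code:

```python
# Band orders for the v0.2.0 checkpoints, as listed in this model card.
S1_BANDS = ["VV", "VH"]
S2_BANDS = ["B02", "B03", "B04", "B05", "B06", "B07", "B08", "B8A", "B11", "B12"]

def expected_band_order(modality):
    """Return the channel order a v0.2.0 checkpoint expects for a modality."""
    if modality == "S1":
        return S1_BANDS
    if modality == "S2":
        return S2_BANDS
    if modality == "S1+S2":
        return S1_BANDS + S2_BANDS  # Sentinel-1 bands first, then Sentinel-2
    raise ValueError(f"unknown modality: {modality!r}")
```

Your input tensor's channel dimension must follow exactly this order, e.g. 12 channels for S1+S2.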
NOTE: Older versions of the models were trained with different band orders that are incompatible with the current version and do not match the order proposed in the Sentinel-2 technical documentation.
The following bands (in the specified order) were used to train the models with version 0.1.1:
- For models using Sentinel-1 only: Sentinel-1 bands
["VH", "VV"]
- For models using Sentinel-2 only: Sentinel-2 10m bands and 20m bands
["B02", "B03", "B04", "B08", "B05", "B06", "B07", "B11", "B12", "B8A"]
- For models using Sentinel-1 and Sentinel-2: Sentinel-2 10m bands and 20m bands, followed by Sentinel-1 bands
["B02", "B03", "B04", "B08", "B05", "B06", "B07", "B11", "B12", "B8A", "VH", "VV"]
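If data has already been prepared in the v0.1.1 band order, the channels can be permuted into the v0.2.0 order instead of being reloaded from scratch. A minimal sketch for the Sentinel-2 bands (variable and function names are illustrative):

```python
# Sentinel-2 band orders for v0.1.1 and v0.2.0, as listed in this card.
V011_ORDER = ["B02", "B03", "B04", "B08", "B05", "B06", "B07", "B11", "B12", "B8A"]
V020_ORDER = ["B02", "B03", "B04", "B05", "B06", "B07", "B08", "B8A", "B11", "B12"]

# perm[i] is the index of the v0.1.1 channel that supplies the i-th v0.2.0 channel.
perm = [V011_ORDER.index(band) for band in V020_ORDER]

def reorder_channels(channels):
    """Reorder a per-channel sequence from v0.1.1 order to v0.2.0 order."""
    return [channels[i] for i in perm]
```

The same permutation can be applied along the channel axis of an image tensor (e.g. with fancy indexing in NumPy or PyTorch).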
The model produces a multi-hot encoded output representing the predicted multi-label classification. Each position of the output corresponds to one of the following classes, sorted in alphabetical order:
['Agro-forestry areas', 'Arable land', 'Beaches, dunes, sands', 'Broad-leaved forest', 'Coastal wetlands', 'Complex cultivation patterns', 'Coniferous forest', 'Industrial or commercial units', 'Inland waters', 'Inland wetlands', 'Land principally occupied by agriculture, with significant areas of natural vegetation', 'Marine waters', 'Mixed forest', 'Moors, heathland and sclerophyllous vegetation', 'Natural grassland and sparsely vegetated areas', 'Pastures', 'Permanent crops', 'Transitional woodland, shrub', 'Urban fabric']
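Predicted class names can be recovered by pairing output positions with the alphabetical class list above. A minimal sketch (the 0.5 threshold is illustrative, assuming the raw outputs are per-class scores rather than hard 0/1 labels):

```python
# The 19 classes in alphabetical order, matching the output positions.
CLASSES = [
    "Agro-forestry areas", "Arable land", "Beaches, dunes, sands",
    "Broad-leaved forest", "Coastal wetlands", "Complex cultivation patterns",
    "Coniferous forest", "Industrial or commercial units", "Inland waters",
    "Inland wetlands",
    "Land principally occupied by agriculture, with significant areas of natural vegetation",
    "Marine waters", "Mixed forest",
    "Moors, heathland and sclerophyllous vegetation",
    "Natural grassland and sparsely vegetated areas", "Pastures",
    "Permanent crops", "Transitional woodland, shrub", "Urban fabric",
]

def decode_multi_hot(vector, threshold=0.5):
    """Map a 19-dimensional multi-hot/score vector to predicted class names."""
    return [name for name, value in zip(CLASSES, vector) if value >= threshold]
```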
To use a model, download the code that defines the model architecture from the official BigEarthNet v2.0 (reBEN) repository and load the model using the code below. Note that the configilm package is required to run this code.
```python
from reben_publication.BigEarthNetv2_0_ImageClassifier import BigEarthNetv2_0_ImageClassifier

model = BigEarthNetv2_0_ImageClassifier.from_pretrained("path_to/huggingface_model_folder")
```

For example:

```python
from reben_publication.BigEarthNetv2_0_ImageClassifier import BigEarthNetv2_0_ImageClassifier

model = BigEarthNetv2_0_ImageClassifier.from_pretrained(
    "BIFOLD-BigEarthNetv2-0/resnet50-s2-v0.2.0"
)
```
If you use any of these models in your research, please cite the following papers:
K. Clasen, L. Hackel, T. Burgert, G. Sumbul, B. Demir, V. Markl, "reBEN: Refined BigEarthNet Dataset for Remote Sensing Image Analysis", IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 2025.
```bibtex
@inproceedings{clasen2025refinedbigearthnet,
  title     = {{reBEN}: Refined BigEarthNet Dataset for Remote Sensing Image Analysis},
  author    = {Clasen, Kai Norman and Hackel, Leonard and Burgert, Tom and Sumbul, Gencer and Demir, Beg{\"u}m and Markl, Volker},
  booktitle = {IEEE International Geoscience and Remote Sensing Symposium (IGARSS)},
  year      = {2025},
}
```
L. Hackel, K. Clasen, B. Demir, "ConfigILM: A General Purpose Configurable Library for Combining Image and Language Models for Visual Question Answering", SoftwareX, vol. 26, 101731, 2024.
```bibtex
@article{hackel2024configilm,
  title     = {ConfigILM: A general purpose configurable library for combining image and language models for visual question answering},
  author    = {Hackel, Leonard and Clasen, Kai Norman and Demir, Beg{\"u}m},
  journal   = {SoftwareX},
  volume    = {26},
  pages     = {101731},
  year      = {2024},
  publisher = {Elsevier},
}
```