---
library_name: transformers
license: other
base_model: tferhan/segformer-b1-finetuned-UBC
tags:
- generated_from_trainer
model-index:
- name: segformer-b4-finetuned-UBC-two
  results: []
---


# segformer-b4-finetuned-UBC-two

This model is a fine-tuned version of [tferhan/segformer-b1-finetuned-UBC](https://huggingface.co/tferhan/segformer-b1-finetuned-UBC) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2500
- Mean Iou: 0.4701
- Mean Accuracy: 0.7752
- Overall Accuracy: 0.7757
- Accuracy Background: nan
- Accuracy Residential: 0.7621
- Accuracy Non-residential: 0.7884
- Iou Background: 0.0
- Iou Residential: 0.7025
- Iou Non-residential: 0.7079
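
For reference, metrics of this shape (including the `nan` background accuracy alongside a 0.0 background IoU, which is what results when predictions contain background pixels but the reference masks contain none) are typically produced by the `mean_iou` metric from the `evaluate` library. The sketch below uses toy arrays to reproduce that pattern; the actual evaluation pipeline is not described in this card, so the `ignore_index` value is an assumption:

```python
import numpy as np
import evaluate

# Assumed label mapping: 0 = Background, 1 = Residential, 2 = Non-residential.
metric = evaluate.load("mean_iou")

# Toy 4x4 maps: the prediction contains background pixels, the reference
# does not, yielding accuracy = nan and IoU = 0.0 for the background class,
# mirroring the numbers reported above.
pred = np.array([[0, 1, 2, 2]] * 4)
ref = np.array([[1, 1, 2, 2]] * 4)

results = metric.compute(
    predictions=[pred],
    references=[ref],
    num_labels=3,
    ignore_index=255,  # conventional "ignore" id; not confirmed by this card
    reduce_labels=False,
)
print(results["mean_iou"])
print(results["per_category_accuracy"])  # [nan, 0.5, 1.0]
print(results["per_category_iou"])       # [0.0, 0.5, 1.0]
```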

## Model description

A SegFormer semantic-segmentation model fine-tuned from [tferhan/segformer-b1-finetuned-UBC](https://huggingface.co/tferhan/segformer-b1-finetuned-UBC) to label each pixel as Background, Residential, or Non-residential (the three classes reported in the metrics above). No further details have been provided; a minimal inference sketch follows.
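
The snippet below is a sketch, not confirmed usage: the repo id is inferred from the model name and the base model's owner, and the input file name is a placeholder.

```python
# pip install "transformers==4.52.4" "torch==2.6.0"  (versions from this card)
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Hypothetical repo id, inferred from the model name and base-model owner.
repo_id = "tferhan/segformer-b4-finetuned-UBC-two"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("tile.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# Upsample logits to the input resolution, then take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) map of class ids
```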

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
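
As a sketch, these settings map onto `TrainingArguments` roughly as follows. The dataset is not named in this card, so `train_ds` and `eval_ds` stay as placeholders:

```python
from transformers import (
    SegformerForSemanticSegmentation,
    Trainer,
    TrainingArguments,
)

# Base checkpoint named in this card; 3 labels per the metrics above.
model = SegformerForSemanticSegmentation.from_pretrained(
    "tferhan/segformer-b1-finetuned-UBC",
    num_labels=3,
    ignore_mismatched_sizes=True,  # in case the label head differs from the base
)

# Values taken verbatim from the hyperparameter list above; AdamW's
# betas=(0.9, 0.999) and epsilon=1e-08 are the adamw_torch defaults.
args = TrainingArguments(
    output_dir="segformer-b4-finetuned-UBC-two",
    learning_rate=6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
)

# Placeholders: the card does not describe the training data.
train_ds = eval_ds = None

trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()  # uncomment once real datasets are supplied
```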

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Residential | Accuracy Non-residential | Iou Background | Iou Residential | Iou Non-residential |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:--------------------:|:------------------------:|:--------------:|:---------------:|:-------------------:|
| 0.6815        | 1.0   | 140  | 0.6220          | 0.4724   | 0.8023        | 0.8038           | nan                 | 0.7624               | 0.8421                   | 0.0            | 0.6996          | 0.7176              |
| 0.3859        | 2.0   | 280  | 0.3193          | 0.4353   | 0.7151        | 0.7159           | nan                 | 0.6956               | 0.7347                   | 0.0            | 0.6443          | 0.6615              |
| 0.293         | 3.0   | 420  | 0.2738          | 0.4382   | 0.7323        | 0.7340           | nan                 | 0.6863               | 0.7783                   | 0.0            | 0.6376          | 0.6771              |
| 0.2697        | 4.0   | 560  | 0.2571          | 0.4279   | 0.7076        | 0.7065           | nan                 | 0.7356               | 0.6796                   | 0.0            | 0.6563          | 0.6273              |
| 0.2517        | 5.0   | 700  | 0.2489          | 0.4453   | 0.7395        | 0.7413           | nan                 | 0.6923               | 0.7867                   | 0.0            | 0.6551          | 0.6808              |
| 0.211         | 6.0   | 840  | 0.2387          | 0.4451   | 0.7356        | 0.7351           | nan                 | 0.7481               | 0.7231                   | 0.0            | 0.6786          | 0.6566              |
| 0.2117        | 7.0   | 980  | 0.2348          | 0.4531   | 0.7430        | 0.7427           | nan                 | 0.7498               | 0.7361                   | 0.0            | 0.6863          | 0.6731              |
| 0.1924        | 8.0   | 1120 | 0.2348          | 0.4649   | 0.7667        | 0.7678           | nan                 | 0.7389               | 0.7946                   | 0.0            | 0.6884          | 0.7062              |
| 0.2213        | 9.0   | 1260 | 0.2401          | 0.4650   | 0.7665        | 0.7674           | nan                 | 0.7423               | 0.7906                   | 0.0            | 0.6938          | 0.7011              |
| 0.2005        | 10.0  | 1400 | 0.2330          | 0.4534   | 0.7480        | 0.7484           | nan                 | 0.7377               | 0.7582                   | 0.0            | 0.6796          | 0.6805              |
| 0.1723        | 11.0  | 1540 | 0.2453          | 0.4521   | 0.7484        | 0.7506           | nan                 | 0.6889               | 0.8079                   | 0.0            | 0.6613          | 0.6951              |
| 0.1924        | 12.0  | 1680 | 0.2311          | 0.4576   | 0.7550        | 0.7545           | nan                 | 0.7695               | 0.7405                   | 0.0            | 0.6963          | 0.6764              |
| 0.1843        | 13.0  | 1820 | 0.2418          | 0.4690   | 0.7711        | 0.7726           | nan                 | 0.7307               | 0.8115                   | 0.0            | 0.6907          | 0.7165              |
| 0.1839        | 14.0  | 1960 | 0.2344          | 0.4631   | 0.7647        | 0.7658           | nan                 | 0.7362               | 0.7932                   | 0.0            | 0.6845          | 0.7046              |
| 0.1698        | 15.0  | 2100 | 0.2378          | 0.4627   | 0.7604        | 0.7609           | nan                 | 0.7450               | 0.7758                   | 0.0            | 0.6876          | 0.7006              |
| 0.1832        | 16.0  | 2240 | 0.2423          | 0.4594   | 0.7571        | 0.7585           | nan                 | 0.7182               | 0.7960                   | 0.0            | 0.6796          | 0.6984              |
| 0.1501        | 17.0  | 2380 | 0.2432          | 0.4693   | 0.7717        | 0.7730           | nan                 | 0.7378               | 0.8056                   | 0.0            | 0.6924          | 0.7156              |
| 0.1788        | 18.0  | 2520 | 0.2474          | 0.4785   | 0.7877        | 0.7886           | nan                 | 0.7647               | 0.8108                   | 0.0            | 0.7134          | 0.7220              |
| 0.1524        | 19.0  | 2660 | 0.2513          | 0.4715   | 0.7809        | 0.7827           | nan                 | 0.7330               | 0.8287                   | 0.0            | 0.6907          | 0.7237              |
| 0.1669        | 20.0  | 2800 | 0.2421          | 0.4685   | 0.7723        | 0.7726           | nan                 | 0.7659               | 0.7788                   | 0.0            | 0.7036          | 0.7020              |
| 0.1539        | 21.0  | 2940 | 0.2449          | 0.4706   | 0.7736        | 0.7744           | nan                 | 0.7515               | 0.7956                   | 0.0            | 0.6975          | 0.7142              |
| 0.1664        | 22.0  | 3080 | 0.2433          | 0.4685   | 0.7710        | 0.7716           | nan                 | 0.7549               | 0.7872                   | 0.0            | 0.7001          | 0.7054              |
| 0.1717        | 23.0  | 3220 | 0.2463          | 0.4658   | 0.7679        | 0.7683           | nan                 | 0.7561               | 0.7797                   | 0.0            | 0.6953          | 0.7020              |
| 0.1477        | 24.0  | 3360 | 0.2464          | 0.4671   | 0.7710        | 0.7717           | nan                 | 0.7513               | 0.7906                   | 0.0            | 0.6960          | 0.7054              |
| 0.1491        | 25.0  | 3500 | 0.2457          | 0.4663   | 0.7675        | 0.7679           | nan                 | 0.7567               | 0.7782                   | 0.0            | 0.6990          | 0.6998              |
| 0.1601        | 26.0  | 3640 | 0.2452          | 0.4699   | 0.7739        | 0.7746           | nan                 | 0.7554               | 0.7924                   | 0.0            | 0.7015          | 0.7082              |
| 0.1436        | 27.0  | 3780 | 0.2538          | 0.4657   | 0.7683        | 0.7689           | nan                 | 0.7526               | 0.7841                   | 0.0            | 0.6953          | 0.7018              |
| 0.1487        | 28.0  | 3920 | 0.2489          | 0.4668   | 0.7702        | 0.7707           | nan                 | 0.7567               | 0.7837                   | 0.0            | 0.6986          | 0.7018              |
| 0.1514        | 29.0  | 4060 | 0.2456          | 0.4686   | 0.7730        | 0.7735           | nan                 | 0.7583               | 0.7877                   | 0.0            | 0.6999          | 0.7060              |
| 0.1519        | 30.0  | 4200 | 0.2500          | 0.4701   | 0.7752        | 0.7757           | nan                 | 0.7621               | 0.7884                   | 0.0            | 0.7025          | 0.7079              |


### Framework versions

- Transformers 4.52.4
- PyTorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.2