---

library_name: transformers
license: other
base_model: nvidia/mit-b3
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-morphpadver1-hgo-coord-v3_1
  results: []
---

# segformer-b0-finetuned-morphpadver1-hgo-coord-v3_1

This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0117
- Mean Iou: 0.9981
- Mean Accuracy: 0.9990
- Overall Accuracy: 0.9990
- Accuracy 0-0: 0.9995
- Accuracy 0-90: 0.9985
- Accuracy 90-0: 0.9988
- Accuracy 90-90: 0.9993
- Iou 0-0: 0.9991
- Iou 0-90: 0.9979
- Iou 90-0: 0.9976
- Iou 90-90: 0.9978
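
For reference, below is a minimal inference sketch using the standard SegFormer API in `transformers`. The repository id and the input image path are assumptions; adjust them to your setup.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Assumed repository id for this checkpoint.
checkpoint = "NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-coord-v3_1"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution; upsample before the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # per-pixel ids for 0-0, 0-90, 90-0, 90-90
```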

## Model description

SegFormer is a Transformer-based semantic segmentation architecture that pairs a hierarchical encoder with a lightweight all-MLP decode head. This checkpoint uses the [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) encoder, fine-tuned for four classes (0-0, 0-90, 90-0, 90-90). Note that despite the `b0` in the repository name, the base checkpoint is MiT-B3.

## Intended uses & limitations

Intended for semantic segmentation of images matching the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 distribution. Performance on other domains has not been evaluated.

## Training and evaluation data

The model was fine-tuned and evaluated on the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 dataset; per the dataset name, it contains 512×512 images with four segmentation classes. A loading sketch follows.
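
A hedged sketch of loading the dataset with the `datasets` library; the split names and column layout are assumptions, so consult the dataset card for specifics.

```python
from datasets import load_dataset

# Split names and column layout are assumptions; check the dataset card.
ds = load_dataset("NICOPOI-9/morphpad_coord_hgo_512_4class_v2")
print(ds)                # inspect the available splits and columns
sample = ds["train"][0]  # segmentation datasets typically pair an image with a label mask
```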

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 60
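
A minimal sketch of how these settings map onto `transformers.TrainingArguments`. The `output_dir` is an assumption, and the original training script's logging/evaluation cadence is not recorded in this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-morphpadver1-hgo-coord-v3_1",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",          # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=60,
)
```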

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:------------:|:-------------:|:-------------:|:--------------:|:-------:|:--------:|:--------:|:---------:|
| 0.903         | 2.6525  | 4000  | 0.8952          | 0.3916   | 0.5570        | 0.5567           | 0.5335       | 0.6125        | 0.4890        | 0.5929         | 0.4534  | 0.3418   | 0.3411   | 0.4299    |
| 0.6373        | 5.3050  | 8000  | 0.5078          | 0.6237   | 0.7643        | 0.7643           | 0.7676       | 0.8339        | 0.6741        | 0.7817         | 0.6758  | 0.5472   | 0.6022   | 0.6698    |
| 0.2851        | 7.9576  | 12000 | 0.2955          | 0.7612   | 0.8642        | 0.8642           | 0.8669       | 0.8687        | 0.8339        | 0.8874         | 0.7959  | 0.7358   | 0.7500   | 0.7631    |
| 0.2309        | 10.6101 | 16000 | 0.1305          | 0.9184   | 0.9574        | 0.9574           | 0.9575       | 0.9381        | 0.9648        | 0.9692         | 0.9333  | 0.9074   | 0.8991   | 0.9337    |
| 0.0907        | 13.2626 | 20000 | 0.1249          | 0.9267   | 0.9620        | 0.9620           | 0.9636       | 0.9541        | 0.9594        | 0.9708         | 0.9379  | 0.9205   | 0.9169   | 0.9316    |
| 0.3051        | 15.9151 | 24000 | 0.0529          | 0.9675   | 0.9835        | 0.9835           | 0.9842       | 0.9805        | 0.9839        | 0.9854         | 0.9712  | 0.9636   | 0.9626   | 0.9728    |
| 0.0659        | 18.5676 | 28000 | 0.0630          | 0.9670   | 0.9832        | 0.9833           | 0.9852       | 0.9747        | 0.9885        | 0.9846         | 0.9719  | 0.9642   | 0.9633   | 0.9687    |
| 0.0474        | 21.2202 | 32000 | 0.0454          | 0.9768   | 0.9882        | 0.9883           | 0.9910       | 0.9856        | 0.9865        | 0.9899         | 0.9783  | 0.9737   | 0.9747   | 0.9805    |
| 0.0449        | 23.8727 | 36000 | 0.0468          | 0.9795   | 0.9896        | 0.9896           | 0.9900       | 0.9812        | 0.9900        | 0.9973         | 0.9828  | 0.9743   | 0.9783   | 0.9824    |
| 0.0552        | 26.5252 | 40000 | 0.0266          | 0.9884   | 0.9942        | 0.9942           | 0.9949       | 0.9917        | 0.9947        | 0.9953         | 0.9888  | 0.9865   | 0.9866   | 0.9916    |
| 0.0541        | 29.1777 | 44000 | 0.0290          | 0.9908   | 0.9954        | 0.9954           | 0.9951       | 0.9951        | 0.9967        | 0.9946         | 0.9921  | 0.9897   | 0.9905   | 0.9909    |
| 0.0082        | 31.8302 | 48000 | 0.0421          | 0.9891   | 0.9945        | 0.9945           | 0.9940       | 0.9924        | 0.9951        | 0.9966         | 0.9908  | 0.9869   | 0.9884   | 0.9904    |
| 0.0061        | 34.4828 | 52000 | 0.0345          | 0.9923   | 0.9961        | 0.9961           | 0.9971       | 0.9941        | 0.9966        | 0.9966         | 0.9939  | 0.9912   | 0.9916   | 0.9922    |
| 0.0053        | 37.1353 | 56000 | 0.0256          | 0.9941   | 0.9970        | 0.9970           | 0.9976       | 0.9972        | 0.9966        | 0.9968         | 0.9957  | 0.9928   | 0.9929   | 0.9949    |
| 0.0045        | 39.7878 | 60000 | 0.0256          | 0.9937   | 0.9968        | 0.9968           | 0.9978       | 0.9959        | 0.9959        | 0.9978         | 0.9937  | 0.9927   | 0.9926   | 0.9957    |
| 0.0046        | 42.4403 | 64000 | 0.0171          | 0.9964   | 0.9982        | 0.9982           | 0.9983       | 0.9976        | 0.9987        | 0.9981         | 0.9972  | 0.9958   | 0.9955   | 0.9969    |
| 0.0032        | 45.0928 | 68000 | 0.0293          | 0.9957   | 0.9979        | 0.9979           | 0.9983       | 0.9969        | 0.9975        | 0.9988         | 0.9966  | 0.9950   | 0.9950   | 0.9964    |
| 0.003         | 47.7454 | 72000 | 0.0251          | 0.9964   | 0.9982        | 0.9982           | 0.9984       | 0.9973        | 0.9984        | 0.9987         | 0.9973  | 0.9952   | 0.9965   | 0.9966    |
| 0.0035        | 50.3979 | 76000 | 0.0245          | 0.9973   | 0.9986        | 0.9986           | 0.9993       | 0.9982        | 0.9983        | 0.9987         | 0.9982  | 0.9969   | 0.9963   | 0.9977    |
| 0.0025        | 53.0504 | 80000 | 0.0222          | 0.9972   | 0.9986        | 0.9986           | 0.9990       | 0.9980        | 0.9987        | 0.9986         | 0.9985  | 0.9965   | 0.9970   | 0.9968    |
| 0.0023        | 55.7029 | 84000 | 0.0104          | 0.9982   | 0.9991        | 0.9991           | 0.9994       | 0.9989        | 0.9987        | 0.9993         | 0.9988  | 0.9980   | 0.9975   | 0.9983    |
| 0.0022        | 58.3554 | 88000 | 0.0117          | 0.9981   | 0.9990        | 0.9990           | 0.9995       | 0.9985        | 0.9988        | 0.9993         | 0.9991  | 0.9979   | 0.9976   | 0.9978    |
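
The reported metric names are consistent with the `evaluate` library's `mean_iou` metric. A hedged sketch of computing them follows; the masks here are hypothetical random arrays (in practice they come from model predictions and the dataset's annotations), and `ignore_index=255` is an assumption.

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Hypothetical masks standing in for model predictions and reference annotations.
rng = np.random.default_rng(0)
pred_mask = rng.integers(0, 4, size=(512, 512))
ref_mask = rng.integers(0, 4, size=(512, 512))

results = metric.compute(
    predictions=[pred_mask],
    references=[ref_mask],
    num_labels=4,      # classes 0-0, 0-90, 90-0, 90-90
    ignore_index=255,  # assumed ignore value
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```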


### Framework versions

- Transformers 4.48.3
- PyTorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0