# segformer-b5-finetuned-ade20k-hgo-coord_40epochs_distortion_ver2_global_norm_with_void_3
This model is a fine-tuned version of NICOPOI-9/segformer-b5-finetuned-ade20k-hgo-coord_40epochs_distortion_ver2_global_norm_with_void on the NICOPOI-9/Modphad_Perlin_two_void_coord_global_norm dataset. It achieves the following results on the evaluation set:
- Loss: 0.5837
- Mean Iou: 0.7559
- Mean Accuracy: 0.8600
- Overall Accuracy: 0.8723
- Accuracy [0,0]: 0.8245
- Accuracy [0,1]: 0.8968
- Accuracy [1,0]: 0.9052
- Accuracy [1,1]: 0.8873
- Accuracy [0,2]: 0.8720
- Accuracy [0,3]: 0.8555
- Accuracy [1,2]: 0.8630
- Accuracy [1,3]: 0.9183
- Accuracy [2,0]: 0.8100
- Accuracy [2,1]: 0.8301
- Accuracy [2,2]: 0.8209
- Accuracy [2,3]: 0.8314
- Accuracy [3,0]: 0.8513
- Accuracy [3,1]: 0.7988
- Accuracy [3,2]: 0.8444
- Accuracy [3,3]: 0.8538
- Accuracy Void: 0.9565
- Iou [0,0]: 0.7600
- Iou [0,1]: 0.7777
- Iou [1,0]: 0.7713
- Iou [1,1]: 0.7672
- Iou [0,2]: 0.7510
- Iou [0,3]: 0.7562
- Iou [1,2]: 0.7433
- Iou [1,3]: 0.8024
- Iou [2,0]: 0.7175
- Iou [2,1]: 0.7317
- Iou [2,2]: 0.6962
- Iou [2,3]: 0.7339
- Iou [3,0]: 0.7656
- Iou [3,1]: 0.7285
- Iou [3,2]: 0.7081
- Iou [3,3]: 0.7214
- Iou Void: 0.9178
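
As a minimal sketch of running inference with this checkpoint (the repo id is assumed from the model name above, and the image path is a placeholder):

```python
# Minimal inference sketch. The repo id is assumed from the model name above,
# and "example.png" is a placeholder input image.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "NICOPOI-9/segformer-b5-finetuned-ade20k-hgo-coord_40epochs_distortion_ver2_global_norm_with_void_3"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id).eval()

image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
label_map = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```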
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 80
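
For reference, a sketch of how these settings map onto `TrainingArguments` from `transformers` (argument names follow the Trainer API; `output_dir` is a placeholder and anything not listed above is left at its default):

```python
# Sketch only: maps the hyperparameters listed above onto TrainingArguments.
# output_dir is a hypothetical path; unlisted settings keep their defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b5-finetuned-output",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",            # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=80,
)
```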
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy [0,0] | Accuracy [0,1] | Accuracy [1,0] | Accuracy [1,1] | Accuracy [0,2] | Accuracy [0,3] | Accuracy [1,2] | Accuracy [1,3] | Accuracy [2,0] | Accuracy [2,1] | Accuracy [2,2] | Accuracy [2,3] | Accuracy [3,0] | Accuracy [3,1] | Accuracy [3,2] | Accuracy [3,3] | Accuracy Void | Iou [0,0] | Iou [0,1] | Iou [1,0] | Iou [1,1] | Iou [0,2] | Iou [0,3] | Iou [1,2] | Iou [1,3] | Iou [2,0] | Iou [2,1] | Iou [2,2] | Iou [2,3] | Iou [3,0] | Iou [3,1] | Iou [3,2] | Iou [3,3] | Iou Void |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.152 | 7.3260 | 4000 | 0.7633 | 0.6318 | 0.7756 | 0.7910 | 0.7271 | 0.7253 | 0.8081 | 0.8546 | 0.8294 | 0.7726 | 0.7022 | 0.8399 | 0.6781 | 0.7753 | 0.7977 | 0.7897 | 0.8184 | 0.5306 | 0.7908 | 0.8273 | 0.9181 | 0.6549 | 0.6477 | 0.6603 | 0.6563 | 0.6384 | 0.6154 | 0.6197 | 0.6871 | 0.5669 | 0.6722 | 0.5917 | 0.6063 | 0.6417 | 0.4889 | 0.5758 | 0.5383 | 0.8792 |
0.4745 | 14.6520 | 8000 | 0.6847 | 0.6572 | 0.7918 | 0.8081 | 0.7585 | 0.7823 | 0.8711 | 0.8361 | 0.8139 | 0.7747 | 0.6589 | 0.8644 | 0.7717 | 0.8130 | 0.7244 | 0.7572 | 0.8354 | 0.6652 | 0.7897 | 0.8170 | 0.9266 | 0.6768 | 0.6619 | 0.6854 | 0.6303 | 0.6853 | 0.6383 | 0.6093 | 0.6536 | 0.6602 | 0.5989 | 0.6355 | 0.6343 | 0.6964 | 0.6043 | 0.6389 | 0.5706 | 0.8930 |
0.2779 | 21.9780 | 12000 | 0.6080 | 0.6946 | 0.8203 | 0.8326 | 0.7822 | 0.8660 | 0.8674 | 0.8758 | 0.8260 | 0.8204 | 0.8000 | 0.8680 | 0.7232 | 0.7985 | 0.8629 | 0.7841 | 0.8052 | 0.7116 | 0.8226 | 0.8120 | 0.9190 | 0.7199 | 0.7346 | 0.7254 | 0.6937 | 0.6854 | 0.6604 | 0.6958 | 0.7238 | 0.6568 | 0.6896 | 0.6128 | 0.6912 | 0.7207 | 0.6371 | 0.6627 | 0.6076 | 0.8904 |
0.0837 | 29.3040 | 16000 | 0.5950 | 0.7067 | 0.8279 | 0.8404 | 0.7697 | 0.8721 | 0.8696 | 0.8884 | 0.8008 | 0.8170 | 0.7844 | 0.8911 | 0.8288 | 0.7954 | 0.7810 | 0.7962 | 0.8401 | 0.7972 | 0.7915 | 0.8268 | 0.9235 | 0.6886 | 0.7537 | 0.6917 | 0.7075 | 0.7107 | 0.6793 | 0.6946 | 0.7357 | 0.7132 | 0.6818 | 0.6766 | 0.7030 | 0.6825 | 0.6735 | 0.6497 | 0.6747 | 0.8970 |
0.1192 | 36.6300 | 20000 | 0.6074 | 0.7169 | 0.8348 | 0.8475 | 0.8083 | 0.8478 | 0.8681 | 0.8823 | 0.8403 | 0.8282 | 0.8310 | 0.9117 | 0.7915 | 0.8006 | 0.7987 | 0.7866 | 0.8475 | 0.7392 | 0.8417 | 0.8336 | 0.9345 | 0.7405 | 0.7410 | 0.7164 | 0.7271 | 0.7169 | 0.6911 | 0.7208 | 0.7466 | 0.6942 | 0.7163 | 0.6798 | 0.6821 | 0.7093 | 0.6520 | 0.6803 | 0.6701 | 0.9026 |
0.0538 | 43.9560 | 24000 | 0.6011 | 0.7190 | 0.8335 | 0.8496 | 0.8453 | 0.8891 | 0.8780 | 0.8929 | 0.8250 | 0.7916 | 0.7874 | 0.8868 | 0.8056 | 0.8231 | 0.7849 | 0.7742 | 0.8565 | 0.7412 | 0.7993 | 0.8380 | 0.9511 | 0.7570 | 0.6969 | 0.7359 | 0.7488 | 0.7296 | 0.6807 | 0.6977 | 0.7651 | 0.7055 | 0.6898 | 0.6898 | 0.6849 | 0.7412 | 0.6480 | 0.6828 | 0.6554 | 0.9143 |
0.0616 | 51.2821 | 28000 | 0.5992 | 0.7349 | 0.8451 | 0.8586 | 0.8345 | 0.9088 | 0.8899 | 0.8821 | 0.8455 | 0.8010 | 0.8537 | 0.8853 | 0.7653 | 0.8475 | 0.8179 | 0.8252 | 0.8186 | 0.7725 | 0.8359 | 0.8324 | 0.9512 | 0.7540 | 0.7328 | 0.7120 | 0.7596 | 0.7493 | 0.7131 | 0.7134 | 0.7649 | 0.6818 | 0.7506 | 0.7135 | 0.7105 | 0.7632 | 0.7053 | 0.6837 | 0.6767 | 0.9098 |
0.0471 | 58.6081 | 32000 | 0.5636 | 0.7472 | 0.8530 | 0.8671 | 0.8347 | 0.8966 | 0.9209 | 0.8917 | 0.8560 | 0.8639 | 0.8388 | 0.9071 | 0.7870 | 0.8462 | 0.8135 | 0.8072 | 0.8454 | 0.7765 | 0.8370 | 0.8212 | 0.9580 | 0.7641 | 0.7822 | 0.7606 | 0.7573 | 0.7466 | 0.7331 | 0.7421 | 0.7789 | 0.6958 | 0.7347 | 0.6971 | 0.7076 | 0.7581 | 0.7122 | 0.7289 | 0.6887 | 0.9149 |
0.101 | 65.9341 | 36000 | 0.5694 | 0.7514 | 0.8573 | 0.8696 | 0.8289 | 0.8963 | 0.8971 | 0.8826 | 0.8476 | 0.8269 | 0.8581 | 0.9075 | 0.8128 | 0.8565 | 0.8462 | 0.8263 | 0.8474 | 0.8111 | 0.8374 | 0.8324 | 0.9600 | 0.7687 | 0.7604 | 0.7533 | 0.7885 | 0.7427 | 0.7313 | 0.7500 | 0.7977 | 0.7090 | 0.7323 | 0.6802 | 0.7167 | 0.7713 | 0.7278 | 0.7082 | 0.7185 | 0.9169 |
0.0344 | 73.2601 | 40000 | 0.5837 | 0.7559 | 0.8600 | 0.8723 | 0.8245 | 0.8968 | 0.9052 | 0.8873 | 0.8720 | 0.8555 | 0.8630 | 0.9183 | 0.8100 | 0.8301 | 0.8209 | 0.8314 | 0.8513 | 0.7988 | 0.8444 | 0.8538 | 0.9565 | 0.7600 | 0.7777 | 0.7713 | 0.7672 | 0.7510 | 0.7562 | 0.7433 | 0.8024 | 0.7175 | 0.7317 | 0.6962 | 0.7339 | 0.7656 | 0.7285 | 0.7081 | 0.7214 | 0.9178 |
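
The Mean Iou, Mean Accuracy, and Overall Accuracy columns above correspond to the `evaluate` library's `mean_iou` metric. A minimal sketch with dummy label maps (num_labels=17 covers the sixteen [row,col] classes plus Void; the ignore_index of 255 is an assumption):

```python
# Sketch: computing the metrics reported above with evaluate's "mean_iou".
# Dummy random label maps stand in for real predictions and ground truth.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

rng = np.random.default_rng(0)
pred = rng.integers(0, 17, size=(64, 64))   # predicted class per pixel
label = rng.integers(0, 17, size=(64, 64))  # ground-truth class per pixel

results = metric.compute(
    predictions=[pred],
    references=[label],
    num_labels=17,     # 16 [row,col] classes + Void
    ignore_index=255,  # assumption: 255 marks pixels excluded from scoring
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```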
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0