---
library_name: transformers
license: other
base_model: nvidia/mit-b0
tags:
  - vision
  - image-segmentation
  - generated_from_trainer
model-index:
  - name: segformer-b0-finetuned-morphpadver1-hgo-coord-v9_mix_resample_40epochs
    results: []
---

# segformer-b0-finetuned-morphpadver1-hgo-coord-v9_mix_resample_40epochs

This model is a fine-tuned version of nvidia/mit-b0 on the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 dataset. It achieves the following results on the evaluation set:

- Loss: 0.8517
- Mean Iou: 0.5269
- Mean Accuracy: 0.6857
- Overall Accuracy: 0.6922
- Accuracy 0-0: 0.5774
- Accuracy 0-90: 0.7340
- Accuracy 90-0: 0.7692
- Accuracy 90-90: 0.6625
- Iou 0-0: 0.4919
- Iou 0-90: 0.5434
- Iou 90-0: 0.5346
- Iou 90-90: 0.5380
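Assuming the standard semantic-segmentation evaluation (per-class IoU = TP / (TP + FP + FN) over pixels), Mean Iou and Mean Accuracy are unweighted means of the four per-class values listed above, while Overall Accuracy is computed over all pixels so larger classes count more. A quick sketch checking this against the reported numbers:

```python
# Per-class values reported above (final evaluation).
per_class_iou = {"0-0": 0.4919, "0-90": 0.5434, "90-0": 0.5346, "90-90": 0.5380}
per_class_acc = {"0-0": 0.5774, "0-90": 0.7340, "90-0": 0.7692, "90-90": 0.6625}

# Mean Iou / Mean Accuracy are unweighted means over the four classes.
mean_iou = sum(per_class_iou.values()) / len(per_class_iou)
mean_acc = sum(per_class_acc.values()) / len(per_class_acc)

print(round(mean_iou, 4))  # ~0.5270, matching the reported 0.5269 up to rounding
print(round(mean_acc, 4))  # ~0.6858, matching the reported 0.6857 up to rounding
```

The gap between Overall Accuracy (0.6922) and Mean Accuracy (0.6857) reflects this weighting: classes covering more pixels pull the overall figure toward their own accuracy.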

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 40
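With `lr_scheduler_type: linear` and no warmup (an assumption here, since no warmup steps are listed), the learning rate ramps down linearly from the peak value to zero over training. A minimal sketch, taking the total step count from the final entry in the results table:

```python
# Sketch of a linear decay schedule (no warmup assumed).
PEAK_LR = 6e-5         # learning_rate above
TOTAL_STEPS = 116_000  # roughly the final step reached in the results table

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer steps under linear decay to zero."""
    remaining = max(0, TOTAL_STEPS - step)
    return PEAK_LR * (remaining / TOTAL_STEPS)

print(lr_at(0))                 # 6e-05 at the start of training
print(lr_at(TOTAL_STEPS // 2))  # 3e-05 halfway through
print(lr_at(TOTAL_STEPS))       # 0.0 at the end
```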

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.4178 | 1.3638 | 4000 | 1.4017 | 0.0915 | 0.2543 | 0.2763 | 0.0012 | 0.1224 | 0.8886 | 0.0052 | 0.0012 | 0.0995 | 0.2601 | 0.0052 |
| 1.1726 | 2.7276 | 8000 | 1.3327 | 0.2022 | 0.3465 | 0.3633 | 0.1781 | 0.4945 | 0.5489 | 0.1646 | 0.1323 | 0.2709 | 0.2783 | 0.1273 |
| 1.3111 | 4.0914 | 12000 | 1.3034 | 0.2235 | 0.3684 | 0.3811 | 0.2368 | 0.5790 | 0.4089 | 0.2489 | 0.1621 | 0.3039 | 0.2584 | 0.1694 |
| 1.033 | 5.4552 | 16000 | 1.2837 | 0.2340 | 0.3853 | 0.4006 | 0.1897 | 0.5159 | 0.5735 | 0.2619 | 0.1526 | 0.2965 | 0.3101 | 0.1769 |
| 1.3103 | 6.8190 | 20000 | 1.2502 | 0.2593 | 0.4171 | 0.4339 | 0.2446 | 0.6314 | 0.5467 | 0.2456 | 0.1839 | 0.3469 | 0.3212 | 0.1851 |
| 0.6831 | 8.1827 | 24000 | 1.2405 | 0.2655 | 0.4238 | 0.4336 | 0.2449 | 0.3989 | 0.6621 | 0.3893 | 0.1849 | 0.2957 | 0.3333 | 0.2482 |
| 1.1638 | 9.5465 | 28000 | 1.1866 | 0.2955 | 0.4566 | 0.4696 | 0.3300 | 0.6108 | 0.5695 | 0.3160 | 0.2396 | 0.3484 | 0.3479 | 0.2463 |
| 1.2145 | 10.9103 | 32000 | 1.1129 | 0.3356 | 0.5008 | 0.5092 | 0.4052 | 0.5818 | 0.5926 | 0.4236 | 0.2913 | 0.3764 | 0.3705 | 0.3042 |
| 0.767 | 12.2741 | 36000 | 1.1059 | 0.3423 | 0.5078 | 0.5144 | 0.4463 | 0.5576 | 0.5978 | 0.4295 | 0.3098 | 0.3613 | 0.3732 | 0.3250 |
| 1.0089 | 13.6379 | 40000 | 1.0832 | 0.3500 | 0.5157 | 0.5252 | 0.4129 | 0.6431 | 0.5790 | 0.4280 | 0.3054 | 0.3812 | 0.3870 | 0.3263 |
| 1.0757 | 15.0017 | 44000 | 1.0207 | 0.3866 | 0.5553 | 0.5626 | 0.4802 | 0.6133 | 0.6502 | 0.4776 | 0.3529 | 0.4112 | 0.4246 | 0.3577 |
| 0.8842 | 16.3655 | 48000 | 1.0716 | 0.3737 | 0.5417 | 0.5529 | 0.4372 | 0.6738 | 0.6390 | 0.4169 | 0.3371 | 0.4152 | 0.4191 | 0.3234 |
| 0.8464 | 17.7293 | 52000 | 1.0188 | 0.4101 | 0.5795 | 0.5884 | 0.4262 | 0.6296 | 0.7147 | 0.5474 | 0.3467 | 0.4325 | 0.4543 | 0.4069 |
| 0.8371 | 19.0931 | 56000 | 0.9905 | 0.4260 | 0.5942 | 0.6027 | 0.4614 | 0.6846 | 0.6765 | 0.5542 | 0.3766 | 0.4455 | 0.4655 | 0.4166 |
| 0.7882 | 20.4569 | 60000 | 0.9542 | 0.4454 | 0.6126 | 0.6216 | 0.4838 | 0.6737 | 0.7397 | 0.5530 | 0.3925 | 0.4665 | 0.4815 | 0.4411 |
| 2.4763 | 21.8207 | 64000 | 0.9188 | 0.4671 | 0.6330 | 0.6402 | 0.5338 | 0.6708 | 0.7484 | 0.5788 | 0.4359 | 0.4820 | 0.4932 | 0.4572 |
| 0.3528 | 23.1845 | 68000 | 0.8817 | 0.4725 | 0.6381 | 0.6450 | 0.5270 | 0.6813 | 0.7379 | 0.6063 | 0.4314 | 0.4937 | 0.4905 | 0.4745 |
| 0.8088 | 24.5482 | 72000 | 0.9115 | 0.4800 | 0.6458 | 0.6500 | 0.5807 | 0.6461 | 0.7349 | 0.6217 | 0.4507 | 0.4844 | 0.4938 | 0.4912 |
| 0.8153 | 25.9120 | 76000 | 0.9558 | 0.4531 | 0.6215 | 0.6342 | 0.3715 | 0.7059 | 0.7951 | 0.6135 | 0.3382 | 0.5023 | 0.4889 | 0.4829 |
| 0.9085 | 27.2758 | 80000 | 0.9089 | 0.4777 | 0.6415 | 0.6542 | 0.4936 | 0.7556 | 0.7928 | 0.5238 | 0.4312 | 0.5149 | 0.5148 | 0.4500 |
| 0.3666 | 28.6396 | 84000 | 1.0426 | 0.4467 | 0.6141 | 0.6270 | 0.3862 | 0.6873 | 0.8064 | 0.5767 | 0.3460 | 0.5000 | 0.4754 | 0.4654 |
| 0.6065 | 30.0034 | 88000 | 0.9086 | 0.4850 | 0.6497 | 0.6557 | 0.5433 | 0.6885 | 0.7346 | 0.6323 | 0.4404 | 0.5002 | 0.5009 | 0.4985 |
| 0.1385 | 31.3672 | 92000 | 0.9247 | 0.4688 | 0.6343 | 0.6469 | 0.4228 | 0.7420 | 0.7832 | 0.5892 | 0.3792 | 0.5132 | 0.4999 | 0.4829 |
| 0.4116 | 32.7310 | 96000 | 0.8724 | 0.5014 | 0.6628 | 0.6707 | 0.5288 | 0.6729 | 0.8213 | 0.6281 | 0.4585 | 0.5268 | 0.5094 | 0.5112 |
| 0.4991 | 34.0948 | 100000 | 0.8752 | 0.5078 | 0.6693 | 0.6766 | 0.5435 | 0.7342 | 0.7515 | 0.6480 | 0.4584 | 0.5274 | 0.5232 | 0.5225 |
| 0.5235 | 35.4586 | 104000 | 0.8312 | 0.5135 | 0.6736 | 0.6814 | 0.6179 | 0.7514 | 0.7598 | 0.5651 | 0.5060 | 0.5362 | 0.5256 | 0.4861 |
| 0.6378 | 36.8224 | 108000 | 0.8729 | 0.5102 | 0.6705 | 0.6784 | 0.5636 | 0.7161 | 0.7926 | 0.6097 | 0.4781 | 0.5335 | 0.5216 | 0.5076 |
| 0.6895 | 38.1862 | 112000 | 0.9258 | 0.4833 | 0.6466 | 0.6600 | 0.4375 | 0.7335 | 0.8392 | 0.5761 | 0.3990 | 0.5343 | 0.5097 | 0.4903 |
| 0.5259 | 39.5499 | 116000 | 0.8517 | 0.5269 | 0.6857 | 0.6922 | 0.5774 | 0.7340 | 0.7692 | 0.6625 | 0.4919 | 0.5434 | 0.5346 | 0.5380 |

### Framework versions

- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0