segformer-b0-finetuned-morphpadver1-hgo-30-coord-v3_120epochs

This model is a fine-tuned version of nvidia/mit-b3 on the NICOPOI-9/morphpad_coord_hgo_30_30_512_4class dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3128
  • Mean Iou: 0.7857
  • Mean Accuracy: 0.8800
  • Overall Accuracy: 0.8799
  • Accuracy 0-0: 0.8782
  • Accuracy 0-90: 0.8842
  • Accuracy 90-0: 0.8809
  • Accuracy 90-90: 0.8765
  • Iou 0-0: 0.7869
  • Iou 0-90: 0.7762
  • Iou 90-0: 0.7880
  • Iou 90-90: 0.7916
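
For reference, below is a minimal inference sketch. It assumes the checkpoint is published under NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-30-coord-v3_120epochs and that an image processor config is bundled with it (otherwise one can be loaded from the nvidia/mit-b3 base); the image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Assumed repo id for this checkpoint; the image path below is a placeholder.
model_id = "NICOPOI-9/segformer-b0-finetuned-morphpadver1-hgo-30-coord-v3_120epochs"
processor = AutoImageProcessor.from_pretrained(model_id)
model = SegformerForSemanticSegmentation.from_pretrained(model_id)
model.eval()

image = Image.open("example_512x512.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the input resolution before taking the per-pixel argmax
# over the four classes (0-0, 0-90, 90-0, 90-90).
logits = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = logits.argmax(dim=1)[0]  # (H, W) tensor of class indices
```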

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 120
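
As a rough illustration, the list above maps onto a transformers TrainingArguments configuration along the following lines; the output_dir and the 4000-step evaluation cadence (visible in the results table below) are assumptions rather than values stated on this card.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; output_dir and the eval
# cadence are assumptions, not values taken from this card.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-morphpadver1-hgo-30-coord-v3_120epochs",
    learning_rate=6e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",          # AdamW, betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=120,
    eval_strategy="steps",
    eval_steps=4000,
)
```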

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.188 | 4.2105 | 4000 | 1.2023 | 0.2499 | 0.3996 | 0.3995 | 0.3256 | 0.4999 | 0.4603 | 0.3124 | 0.2510 | 0.2542 | 0.2429 | 0.2514 |
| 0.9829 | 8.4211 | 8000 | 0.9446 | 0.3676 | 0.5326 | 0.5326 | 0.4568 | 0.5068 | 0.7157 | 0.4508 | 0.3872 | 0.3492 | 0.3434 | 0.3907 |
| 0.7539 | 12.6316 | 12000 | 0.7804 | 0.4475 | 0.6163 | 0.6168 | 0.5690 | 0.6618 | 0.5314 | 0.7030 | 0.4806 | 0.4244 | 0.4335 | 0.4513 |
| 0.969 | 16.8421 | 16000 | 0.6554 | 0.5152 | 0.6775 | 0.6780 | 0.6172 | 0.7013 | 0.6543 | 0.7372 | 0.5533 | 0.4849 | 0.5090 | 0.5136 |
| 0.5649 | 21.0526 | 20000 | 0.5879 | 0.5597 | 0.7156 | 0.7157 | 0.6964 | 0.7740 | 0.6842 | 0.7078 | 0.5869 | 0.5181 | 0.5555 | 0.5782 |
| 0.5983 | 25.2632 | 24000 | 0.5229 | 0.5963 | 0.7464 | 0.7460 | 0.8236 | 0.7139 | 0.7132 | 0.7348 | 0.5647 | 0.6058 | 0.6038 | 0.6110 |
| 0.5077 | 29.4737 | 28000 | 0.4971 | 0.6133 | 0.7587 | 0.7585 | 0.7767 | 0.7577 | 0.7964 | 0.7042 | 0.6236 | 0.6053 | 0.5724 | 0.6519 |
| 0.456 | 33.6842 | 32000 | 0.4880 | 0.6346 | 0.7763 | 0.7764 | 0.7792 | 0.7575 | 0.7675 | 0.8011 | 0.6434 | 0.6257 | 0.6376 | 0.6317 |
| 0.4278 | 37.8947 | 36000 | 0.4139 | 0.6688 | 0.8015 | 0.8014 | 0.8149 | 0.7962 | 0.8101 | 0.7849 | 0.6663 | 0.6599 | 0.6611 | 0.6880 |
| 0.4974 | 42.1053 | 40000 | 0.3921 | 0.6863 | 0.8132 | 0.8132 | 0.7997 | 0.8370 | 0.8165 | 0.7996 | 0.7065 | 0.6548 | 0.6794 | 0.7046 |
| 0.4364 | 46.3158 | 44000 | 0.3697 | 0.7023 | 0.8244 | 0.8247 | 0.7941 | 0.8293 | 0.8225 | 0.8520 | 0.7248 | 0.6884 | 0.6982 | 0.6978 |
| 0.3254 | 50.5263 | 48000 | 0.3521 | 0.7152 | 0.8340 | 0.8338 | 0.8529 | 0.8268 | 0.8315 | 0.8247 | 0.7016 | 0.7133 | 0.7162 | 0.7295 |
| 0.3139 | 54.7368 | 52000 | 0.3471 | 0.7224 | 0.8386 | 0.8386 | 0.8343 | 0.8526 | 0.8305 | 0.8371 | 0.7209 | 0.7032 | 0.7306 | 0.7347 |
| 0.3209 | 58.9474 | 56000 | 0.3253 | 0.7359 | 0.8479 | 0.8479 | 0.8565 | 0.8363 | 0.8525 | 0.8463 | 0.7340 | 0.7348 | 0.7294 | 0.7455 |
| 0.2815 | 63.1579 | 60000 | 0.3234 | 0.7421 | 0.8516 | 0.8516 | 0.8468 | 0.8707 | 0.8431 | 0.8459 | 0.7466 | 0.7148 | 0.7463 | 0.7608 |
| 0.3002 | 67.3684 | 64000 | 0.3132 | 0.7520 | 0.8584 | 0.8584 | 0.8545 | 0.8623 | 0.8503 | 0.8663 | 0.7544 | 0.7411 | 0.7560 | 0.7566 |
| 0.2874 | 71.5789 | 68000 | 0.3068 | 0.7571 | 0.8615 | 0.8615 | 0.8582 | 0.8814 | 0.8540 | 0.8524 | 0.7628 | 0.7325 | 0.7639 | 0.7693 |
| 1.0781 | 75.7895 | 72000 | 0.3185 | 0.7524 | 0.8588 | 0.8586 | 0.8755 | 0.8528 | 0.8649 | 0.8419 | 0.7446 | 0.7461 | 0.7503 | 0.7686 |
| 0.2688 | 80.0 | 76000 | 0.2993 | 0.7663 | 0.8676 | 0.8676 | 0.8688 | 0.8677 | 0.8639 | 0.8702 | 0.7693 | 0.7553 | 0.7674 | 0.7730 |
| 0.2566 | 84.2105 | 80000 | 0.2962 | 0.7696 | 0.8698 | 0.8698 | 0.8669 | 0.8687 | 0.8673 | 0.8761 | 0.7729 | 0.7574 | 0.7739 | 0.7744 |
| 0.2556 | 88.4211 | 84000 | 0.2985 | 0.7754 | 0.8735 | 0.8735 | 0.8735 | 0.8679 | 0.8799 | 0.8726 | 0.7767 | 0.7676 | 0.7734 | 0.7839 |
| 0.2402 | 92.6316 | 88000 | 0.2976 | 0.7786 | 0.8755 | 0.8755 | 0.8786 | 0.8751 | 0.8694 | 0.8790 | 0.7766 | 0.7711 | 0.7810 | 0.7858 |
| 0.286 | 96.8421 | 92000 | 0.2998 | 0.7803 | 0.8766 | 0.8766 | 0.8733 | 0.8750 | 0.8828 | 0.8752 | 0.7833 | 0.7733 | 0.7770 | 0.7876 |
| 0.1853 | 101.0526 | 96000 | 0.2987 | 0.7843 | 0.8791 | 0.8791 | 0.8810 | 0.8816 | 0.8813 | 0.8726 | 0.7811 | 0.7754 | 0.7870 | 0.7935 |
| 0.2401 | 105.2632 | 100000 | 0.3093 | 0.7819 | 0.8776 | 0.8776 | 0.8761 | 0.8820 | 0.8791 | 0.8732 | 0.7818 | 0.7701 | 0.7864 | 0.7893 |
| 0.2546 | 109.4737 | 104000 | 0.3095 | 0.7846 | 0.8793 | 0.8793 | 0.8819 | 0.8829 | 0.8756 | 0.8766 | 0.7850 | 0.7731 | 0.7891 | 0.7912 |
| 0.3595 | 113.6842 | 108000 | 0.3096 | 0.7857 | 0.8800 | 0.8800 | 0.8777 | 0.8820 | 0.8795 | 0.8806 | 0.7858 | 0.7766 | 0.7892 | 0.7912 |
| 0.2484 | 117.8947 | 112000 | 0.3128 | 0.7857 | 0.8800 | 0.8799 | 0.8782 | 0.8842 | 0.8809 | 0.8765 | 0.7869 | 0.7762 | 0.7880 | 0.7916 |
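
The metric columns above mirror the output of the mean_iou metric in the Hugging Face evaluate library (mean IoU, mean accuracy, overall accuracy, plus per-category scores). A hypothetical sketch of computing them from predicted and reference label maps is shown below; the placeholder arrays and the ignore_index value are assumptions.

```python
import evaluate
import numpy as np

# Hypothetical evaluation sketch: `predictions` and `references` would be
# lists of (H, W) integer label maps over the four classes.
metric = evaluate.load("mean_iou")
predictions = [np.zeros((512, 512), dtype=np.int64)]  # placeholder data
references = [np.zeros((512, 512), dtype=np.int64)]   # placeholder data

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=4,
    ignore_index=255,      # assumption: unlabeled pixels marked as 255
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"], results["per_category_accuracy"])
```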

Framework versions

  • Transformers 4.48.3
  • Pytorch 2.1.0
  • Datasets 3.2.0
  • Tokenizers 0.21.0