# segformer-b4-random-init-morphpadver1-hgo-coord-v9_mix_resample_40epochs
This model is a fine-tuned version of a randomly initialized SegFormer-B4 (random_init) trained on the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 dataset. It achieves the following results on the evaluation set:
- Loss: 1.3834
- Mean Iou: 0.1030
- Mean Accuracy: 0.2500
- Overall Accuracy: 0.2754
- Accuracy 0-0: 0.0
- Accuracy 0-90: 0.7090
- Accuracy 90-0: 0.2908
- Accuracy 90-90: 0.0
- Iou 0-0: 0.0
- Iou 0-90: 0.2501
- Iou 90-0: 0.1618
- Iou 90-90: 0.0
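The per-class values above are standard segmentation metrics: IoU is intersection-over-union between predicted and ground-truth pixels of a class, and per-class accuracy is the fraction of that class's ground-truth pixels predicted correctly. A minimal sketch of how such metrics are computed (illustrative only; this is not the exact evaluation code, and the toy labels below simply stand in for the four 0-0 / 0-90 / 90-0 / 90-90 classes):

```python
# Per-class IoU and accuracy for a flat list of predicted and
# ground-truth pixel labels (pure-Python sketch, no dependencies).

def per_class_metrics(pred, target, num_classes):
    """Return (ious, accs): per-class IoU and per-class accuracy."""
    ious, accs = [], []
    for c in range(num_classes):
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        total = sum(1 for t in target if t == c)
        ious.append(inter / union if union else 0.0)
        accs.append(inter / total if total else 0.0)
    return ious, accs

# Toy example: class ids 0..3 stand in for 0-0 / 0-90 / 90-0 / 90-90.
pred   = [0, 1, 1, 2, 3, 3]
target = [0, 1, 2, 2, 3, 0]
ious, accs = per_class_metrics(pred, target, 4)
mean_iou = sum(ious) / len(ious)  # 0.5 for this toy example
```

Mean IoU and mean accuracy in the table are the unweighted averages of these per-class values, while overall accuracy pools all pixels regardless of class.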
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 40
## Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.3905 | 1.3638 | 4000 | 1.3824 | 0.1021 | 0.2504 | 0.2759 | 0.0006 | 0.7381 | 0.2614 | 0.0013 | 0.0006 | 0.2519 | 0.1546 | 0.0013 |
| 1.3909 | 2.7276 | 8000 | 1.3850 | 0.0902 | 0.2517 | 0.2764 | 0.0201 | 0.9055 | 0.0549 | 0.0264 | 0.0189 | 0.2694 | 0.0479 | 0.0247 |
| 1.3898 | 4.0914 | 12000 | 1.3876 | 0.1004 | 0.2495 | 0.2473 | 0.0002 | 0.0077 | 0.4529 | 0.5370 | 0.0002 | 0.0076 | 0.2046 | 0.1892 |
| 1.3848 | 5.4552 | 16000 | 1.3852 | 0.0762 | 0.2499 | 0.2758 | 0.0201 | 0.9673 | 0.0113 | 0.0007 | 0.0188 | 0.2745 | 0.0110 | 0.0007 |
| 1.3912 | 6.8190 | 20000 | 1.3829 | 0.1076 | 0.2500 | 0.2741 | 0.0 | 0.4934 | 0.5068 | 0.0 | 0.0 | 0.2165 | 0.2139 | 0.0 |
| 1.3394 | 8.1827 | 24000 | 1.3831 | 0.0698 | 0.2501 | 0.2773 | 0.0 | 0.9983 | 0.0022 | 0.0 | 0.0 | 0.2771 | 0.0022 | 0.0 |
| 1.4205 | 9.5465 | 28000 | 1.3845 | 0.1274 | 0.2499 | 0.2605 | 0.0 | 0.3064 | 0.4085 | 0.2845 | 0.0 | 0.1682 | 0.1966 | 0.1448 |
| 1.382 | 10.9103 | 32000 | 1.3839 | 0.0861 | 0.2487 | 0.2665 | 0.0713 | 0.0273 | 0.8963 | 0.0 | 0.0569 | 0.0254 | 0.2619 | 0.0 |
| 1.3715 | 12.2741 | 36000 | 1.3869 | 0.0884 | 0.2498 | 0.2762 | 0.0 | 0.8874 | 0.1116 | 0.0 | 0.0 | 0.2689 | 0.0847 | 0.0 |
| 1.3879 | 13.6379 | 40000 | 1.3818 | 0.0802 | 0.2526 | 0.2743 | 0.0 | 0.0602 | 0.9501 | 0.0 | 0.0 | 0.0539 | 0.2671 | 0.0 |
| 1.3847 | 15.0017 | 44000 | 1.3850 | 0.0678 | 0.25 | 0.2711 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.2711 | 0.0 |
| 1.3943 | 16.3655 | 48000 | 1.3833 | 0.0739 | 0.2497 | 0.2693 | 0.0302 | 0.0001 | 0.9684 | 0.0 | 0.0273 | 0.0001 | 0.2684 | 0.0 |
| 1.3877 | 17.7293 | 52000 | 1.3842 | 0.1075 | 0.2511 | 0.2749 | 0.0 | 0.4278 | 0.5767 | 0.0 | 0.0 | 0.2042 | 0.2257 | 0.0 |
| 1.3813 | 19.0931 | 56000 | 1.3833 | 0.1195 | 0.2532 | 0.2746 | 0.0 | 0.4714 | 0.4770 | 0.0643 | 0.0 | 0.2154 | 0.2104 | 0.0523 |
| 1.3934 | 20.4569 | 60000 | 1.3844 | 0.0700 | 0.2501 | 0.2712 | 0.0 | 0.0098 | 0.9904 | 0.0000 | 0.0 | 0.0096 | 0.2704 | 0.0000 |
| 1.3841 | 21.8207 | 64000 | 1.3828 | 0.0851 | 0.2524 | 0.2794 | 0.0 | 0.9286 | 0.0811 | 0.0 | 0.0 | 0.2730 | 0.0674 | 0.0 |
| 1.3697 | 23.1845 | 68000 | 1.3835 | 0.0939 | 0.2484 | 0.2743 | 0.0 | 0.8252 | 0.1682 | 0.0 | 0.0 | 0.2612 | 0.1145 | 0.0 |
| 1.3951 | 24.5482 | 72000 | 1.3836 | 0.1060 | 0.2484 | 0.2719 | 0.0 | 0.4205 | 0.5731 | 0.0 | 0.0 | 0.1985 | 0.2255 | 0.0 |
| 1.3842 | 25.9120 | 76000 | 1.3828 | 0.0700 | 0.2500 | 0.2771 | 0.0 | 0.9966 | 0.0030 | 0.0003 | 0.0 | 0.2769 | 0.0030 | 0.0003 |
| 1.3952 | 27.2758 | 80000 | 1.3825 | 0.1020 | 0.2456 | 0.2683 | 0.0 | 0.3268 | 0.6555 | 0.0 | 0.0 | 0.1705 | 0.2374 | 0.0 |
| 1.4017 | 28.6396 | 84000 | 1.3823 | 0.0912 | 0.2555 | 0.2826 | 0.0 | 0.9046 | 0.1175 | 0.0 | 0.0 | 0.2731 | 0.0917 | 0.0 |
| 1.3873 | 30.0034 | 88000 | 1.3823 | 0.0742 | 0.2487 | 0.2756 | 0.0 | 0.9701 | 0.0248 | 0.0 | 0.0 | 0.2738 | 0.0232 | 0.0 |
| 1.4049 | 31.3672 | 92000 | 1.3834 | 0.0890 | 0.2503 | 0.2722 | 0.0 | 0.1255 | 0.8756 | 0.0 | 0.0 | 0.0941 | 0.2619 | 0.0 |
| 1.3727 | 32.7310 | 96000 | 1.3829 | 0.0721 | 0.2498 | 0.2769 | 0.0 | 0.9865 | 0.0128 | 0.0 | 0.0 | 0.2759 | 0.0123 | 0.0 |
| 1.3856 | 34.0948 | 100000 | 1.3836 | 0.0748 | 0.2504 | 0.2775 | 0.0000 | 0.9762 | 0.0255 | 0.0000 | 0.0000 | 0.2753 | 0.0240 | 0.0000 |
| 1.3706 | 35.4586 | 104000 | 1.3829 | 0.0915 | 0.2524 | 0.2791 | 0.0 | 0.8800 | 0.1297 | 0.0 | 0.0 | 0.2689 | 0.0971 | 0.0 |
| 1.3861 | 36.8224 | 108000 | 1.3833 | 0.1078 | 0.2511 | 0.2758 | 0.0 | 0.5713 | 0.4331 | 0.0 | 0.0 | 0.2302 | 0.2008 | 0.0 |
| 1.3839 | 38.1862 | 112000 | 1.3831 | 0.0812 | 0.2508 | 0.2776 | 0.0 | 0.9429 | 0.0601 | 0.0 | 0.0 | 0.2729 | 0.0520 | 0.0 |
| 1.3839 | 39.5499 | 116000 | 1.3834 | 0.1030 | 0.2500 | 0.2754 | 0.0 | 0.7090 | 0.2908 | 0.0 | 0.0 | 0.2501 | 0.1618 | 0.0 |
## Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0