# sagittal-b0-finetuned-segments
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the jenniferlumeng/Sagittal dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0561
- Mean Iou: 0.3928
- Mean Accuracy: 0.5550
- Overall Accuracy: 0.6001
- Accuracy Background: nan
- Accuracy Olfactory bulb: 0.7548
- Accuracy Anterior olfactory nucleus: 0.2361
- Accuracy Basal ganglia: 0.5670
- Accuracy Cortex: 0.8443
- Accuracy Hypothalamus: 0.3500
- Accuracy Thalamus: 0.3216
- Accuracy Hippocampus: 0.4568
- Accuracy Midbrain: 0.7339
- Accuracy Cerebellum: 0.8112
- Accuracy Pons and medulla: 0.4748
- Iou Background: 0.0
- Iou Olfactory bulb: 0.5861
- Iou Anterior olfactory nucleus: 0.2110
- Iou Basal ganglia: 0.4574
- Iou Cortex: 0.6560
- Iou Hypothalamus: 0.3196
- Iou Thalamus: 0.3020
- Iou Hippocampus: 0.4364
- Iou Midbrain: 0.2970
- Iou Cerebellum: 0.6248
- Iou Pons and medulla: 0.4303
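As a sanity check on how the summary numbers relate to the per-class values above, note that Mean IoU averages over all 11 classes (background included, even though its IoU is 0.0), while Mean Accuracy averages only the 10 classes for which accuracy is defined (background accuracy is nan and is excluded). A minimal sketch:

```python
# Per-class values copied from the evaluation results above.
per_class_iou = [0.0, 0.5861, 0.2110, 0.4574, 0.6560, 0.3196,
                 0.3020, 0.4364, 0.2970, 0.6248, 0.4303]
# Background accuracy is nan and therefore omitted from the mean.
per_class_acc = [0.7548, 0.2361, 0.5670, 0.8443, 0.3500,
                 0.3216, 0.4568, 0.7339, 0.8112, 0.4748]

mean_iou = sum(per_class_iou) / len(per_class_iou)  # averaged over 11 classes
mean_acc = sum(per_class_acc) / len(per_class_acc)  # averaged over 10 classes

print(round(mean_iou, 4), round(mean_acc, 4))  # matches 0.3928 and 0.5550
```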
## Model description
More information needed
## Intended uses & limitations
More information needed
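SegFormer checkpoints like this one predict logits at 1/4 of the input resolution, so inference needs an upsampling and argmax step to get a per-pixel label map. A minimal sketch of that post-processing, using a dummy logits tensor in place of actual model output (in practice the logits would come from `SegformerForSemanticSegmentation.from_pretrained("jenniferlumeng/sagittal-b0-finetuned-segments")`; the 512×512 input size here is an assumption):

```python
import torch
import torch.nn.functional as F

# Dummy stand-in for model output: for a 512x512 input and 11 classes
# (background + 10 brain regions), SegFormer logits come out at 1/4
# resolution, i.e. shape (batch, num_classes, 128, 128).
logits = torch.randn(1, 11, 128, 128)

# Upsample logits back to the input resolution, then take the
# per-pixel argmax to obtain the predicted segmentation map.
upsampled = F.interpolate(logits, size=(512, 512),
                          mode="bilinear", align_corners=False)
pred = upsampled.argmax(dim=1)  # (1, 512, 512) label map
```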
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
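The hyperparameters above can be expressed as a `transformers.TrainingArguments` object. This is a hedged sketch, not the exact training script: the `output_dir` name is an assumption, and the remaining fields mirror the list above (AdamW with betas=(0.9, 0.999) and eps=1e-8 is the Trainer default, so it needs no explicit arguments).

```python
from transformers import TrainingArguments

# Sketch of the training configuration listed above; output_dir is assumed.
training_args = TrainingArguments(
    output_dir="sagittal-b0-finetuned-segments",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```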
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Olfactory bulb | Accuracy Anterior olfactory nucleus | Accuracy Basal ganglia | Accuracy Cortex | Accuracy Hypothalamus | Accuracy Thalamus | Accuracy Hippocampus | Accuracy Midbrain | Accuracy Cerebellum | Accuracy Pons and medulla | Iou Background | Iou Olfactory bulb | Iou Anterior olfactory nucleus | Iou Basal ganglia | Iou Cortex | Iou Hypothalamus | Iou Thalamus | Iou Hippocampus | Iou Midbrain | Iou Cerebellum | Iou Pons and medulla |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.9251 | 3.3333 | 20 | 2.3067 | 0.0795 | 0.1796 | 0.2634 | nan | 0.2399 | 0.0 | 0.0 | 0.9926 | 0.0554 | 0.0 | 0.0957 | 0.0 | 0.3004 | 0.1122 | 0.0 | 0.1546 | 0.0 | 0.0 | 0.2430 | 0.0548 | 0.0 | 0.0692 | 0.0 | 0.2769 | 0.0760 |
1.5773 | 6.6667 | 40 | 1.8563 | 0.1695 | 0.2863 | 0.3584 | nan | 0.4130 | 0.0 | 0.0340 | 0.8260 | 0.2004 | 0.1704 | 0.1409 | 0.4293 | 0.4427 | 0.2064 | 0.0 | 0.2795 | 0.0 | 0.0326 | 0.5150 | 0.1849 | 0.1148 | 0.1249 | 0.1197 | 0.3308 | 0.1619 |
1.3176 | 10.0 | 60 | 1.6904 | 0.2571 | 0.4066 | 0.4408 | nan | 0.6887 | 0.0 | 0.8044 | 0.7641 | 0.2494 | 0.1046 | 0.3613 | 0.2678 | 0.5775 | 0.2483 | 0.0 | 0.4874 | 0.0 | 0.1599 | 0.6414 | 0.2309 | 0.0979 | 0.3281 | 0.2014 | 0.4568 | 0.2239 |
1.3147 | 13.3333 | 80 | 1.4640 | 0.2817 | 0.4272 | 0.4773 | nan | 0.5533 | 0.0 | 0.4670 | 0.7442 | 0.2610 | 0.0613 | 0.4597 | 0.6721 | 0.6021 | 0.4509 | 0.0 | 0.3820 | 0.0 | 0.3403 | 0.6051 | 0.2377 | 0.0604 | 0.4191 | 0.2155 | 0.4920 | 0.3467 |
1.1284 | 16.6667 | 100 | 1.3582 | 0.2754 | 0.4165 | 0.4699 | nan | 0.6502 | 0.0036 | 0.5064 | 0.8678 | 0.2266 | 0.1352 | 0.4677 | 0.2578 | 0.5931 | 0.4562 | 0.0 | 0.5197 | 0.0036 | 0.3552 | 0.3330 | 0.2138 | 0.1324 | 0.4201 | 0.2191 | 0.4907 | 0.3414 |
1.1223 | 20.0 | 120 | 1.2891 | 0.2862 | 0.4221 | 0.4689 | nan | 0.6084 | 0.1211 | 0.3576 | 0.8239 | 0.2694 | 0.1435 | 0.4589 | 0.4033 | 0.5980 | 0.4369 | 0.0 | 0.4424 | 0.1137 | 0.3147 | 0.3247 | 0.2529 | 0.1306 | 0.4291 | 0.3035 | 0.4722 | 0.3644 |
1.0746 | 23.3333 | 140 | 1.2653 | 0.3122 | 0.4564 | 0.5064 | nan | 0.5209 | 0.0778 | 0.5140 | 0.7894 | 0.3297 | 0.1814 | 0.4610 | 0.7193 | 0.5320 | 0.4385 | 0.0 | 0.4051 | 0.0749 | 0.4002 | 0.5456 | 0.2988 | 0.1715 | 0.4247 | 0.2543 | 0.4697 | 0.3895 |
1.1374 | 26.6667 | 160 | 1.2168 | 0.3412 | 0.5006 | 0.5445 | nan | 0.7152 | 0.1454 | 0.5457 | 0.8318 | 0.3455 | 0.2429 | 0.4675 | 0.6257 | 0.6441 | 0.4424 | 0.0 | 0.5520 | 0.1348 | 0.3895 | 0.5159 | 0.3081 | 0.2310 | 0.4254 | 0.2683 | 0.5242 | 0.4046 |
0.8899 | 30.0 | 180 | 1.1327 | 0.3517 | 0.4974 | 0.5497 | nan | 0.5466 | 0.1382 | 0.4970 | 0.7591 | 0.3350 | 0.2808 | 0.4335 | 0.7737 | 0.7053 | 0.5047 | 0.0 | 0.4196 | 0.1286 | 0.4157 | 0.6405 | 0.3087 | 0.2560 | 0.4186 | 0.2777 | 0.5777 | 0.4251 |
0.822 | 33.3333 | 200 | 1.1223 | 0.3765 | 0.5371 | 0.5878 | nan | 0.7811 | 0.1923 | 0.5470 | 0.8318 | 0.3197 | 0.2845 | 0.4424 | 0.7182 | 0.6918 | 0.5623 | 0.0 | 0.5922 | 0.1736 | 0.4012 | 0.6352 | 0.3021 | 0.2676 | 0.4264 | 0.2960 | 0.5843 | 0.4632 |
1.3568 | 36.6667 | 220 | 1.0941 | 0.4136 | 0.5818 | 0.6319 | nan | 0.8136 | 0.2270 | 0.5548 | 0.8444 | 0.3559 | 0.7737 | 0.4477 | 0.4848 | 0.7838 | 0.5318 | 0.0 | 0.6210 | 0.2027 | 0.4504 | 0.6852 | 0.3245 | 0.4370 | 0.4324 | 0.2808 | 0.6397 | 0.4756 |
0.9664 | 40.0 | 240 | 1.0685 | 0.3843 | 0.5490 | 0.5962 | nan | 0.6310 | 0.2199 | 0.5723 | 0.8041 | 0.3211 | 0.8415 | 0.4591 | 0.3971 | 0.7908 | 0.4532 | 0.0 | 0.4706 | 0.1978 | 0.4563 | 0.6778 | 0.3002 | 0.4135 | 0.4364 | 0.2563 | 0.6112 | 0.4073 |
0.824 | 43.3333 | 260 | 1.0399 | 0.3955 | 0.5580 | 0.6051 | nan | 0.7486 | 0.2461 | 0.5592 | 0.8658 | 0.3265 | 0.5614 | 0.4605 | 0.5438 | 0.8004 | 0.4678 | 0.0 | 0.5838 | 0.2191 | 0.4581 | 0.6516 | 0.3069 | 0.3773 | 0.4371 | 0.2778 | 0.6095 | 0.4297 |
1.126 | 46.6667 | 280 | 1.0414 | 0.4036 | 0.5741 | 0.6232 | nan | 0.7521 | 0.2361 | 0.5644 | 0.8473 | 0.3231 | 0.8344 | 0.4528 | 0.4500 | 0.8079 | 0.4731 | 0.0 | 0.5863 | 0.2112 | 0.4592 | 0.6156 | 0.3047 | 0.4545 | 0.4341 | 0.3238 | 0.6222 | 0.4280 |
0.8935 | 50.0 | 300 | 1.0561 | 0.3928 | 0.5550 | 0.6001 | nan | 0.7548 | 0.2361 | 0.5670 | 0.8443 | 0.3500 | 0.3216 | 0.4568 | 0.7339 | 0.8112 | 0.4748 | 0.0 | 0.5861 | 0.2110 | 0.4574 | 0.6560 | 0.3196 | 0.3020 | 0.4364 | 0.2970 | 0.6248 | 0.4303 |
### Framework versions
- Transformers 4.52.2
- Pytorch 2.6.0+cu124
- Datasets 2.16.1
- Tokenizers 0.21.1