# mobileViTV2-128-2
This model is a fine-tuned version of [apple/mobilevitv2-1.0-imagenet1k-256](https://huggingface.co/apple/mobilevitv2-1.0-imagenet1k-256) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1494
- Accuracy: 0.9480
- F1: 0.9484
- Precision: 0.9498
- Recall: 0.9480
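Note that accuracy equals recall here (0.9480), which is characteristic of weighted-average recall on multiclass data. Below is a minimal plain-Python sketch (with made-up toy labels, not the model's actual evaluation data) of how weighted precision, recall, and F1 relate, reimplementing the semantics of scikit-learn's `average='weighted'`:

```python
from collections import Counter

def weighted_metrics(y_true, y_pred):
    """Weighted-average precision/recall/F1, i.e. per-class metrics
    averaged with weights proportional to each class's support."""
    labels = sorted(set(y_true))
    support = Counter(y_true)
    n = len(y_true)
    prec = rec = f1 = 0.0
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        pred_c = sum(p == c for p in y_pred)
        p_c = tp / pred_c if pred_c else 0.0
        r_c = tp / support[c]
        f_c = 2 * p_c * r_c / (p_c + r_c) if (p_c + r_c) else 0.0
        w = support[c] / n  # class weight = support fraction
        prec += w * p_c
        rec += w * r_c
        f1 += w * f_c
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / n
    return acc, prec, rec, f1

# Toy 3-class example (NOT the model's eval set):
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
acc, prec, rec, f1 = weighted_metrics(y_true, y_pred)
# Weighted recall always equals accuracy, matching the numbers above.
assert acc == rec
```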
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
1.628 | 1.0 | 93 | 1.6205 | 0.1697 | 0.1607 | 0.1720 | 0.1697 |
1.6155 | 2.0 | 186 | 1.6120 | 0.2242 | 0.2119 | 0.2293 | 0.2242 |
1.6031 | 3.0 | 279 | 1.5941 | 0.2667 | 0.2650 | 0.2872 | 0.2667 |
1.5639 | 4.0 | 372 | 1.5756 | 0.3091 | 0.3022 | 0.3163 | 0.3091 |
1.525 | 5.0 | 465 | 1.5384 | 0.4000 | 0.3949 | 0.4097 | 0.4000 |
1.4892 | 6.0 | 558 | 1.4831 | 0.5091 | 0.4924 | 0.4945 | 0.5091 |
1.3605 | 7.0 | 651 | 1.3807 | 0.6 | 0.5801 | 0.5833 | 0.6 |
1.1423 | 8.0 | 744 | 1.2166 | 0.6242 | 0.5967 | 0.6139 | 0.6242 |
1.0801 | 9.0 | 837 | 1.0558 | 0.7030 | 0.6663 | 0.7190 | 0.7030 |
0.8504 | 10.0 | 930 | 0.8856 | 0.7697 | 0.7487 | 0.8081 | 0.7697 |
0.7659 | 11.0 | 1023 | 0.7253 | 0.8121 | 0.7954 | 0.8362 | 0.8121 |
0.547 | 12.0 | 1116 | 0.5812 | 0.8485 | 0.8418 | 0.8581 | 0.8485 |
0.5522 | 13.0 | 1209 | 0.4633 | 0.8970 | 0.8926 | 0.9078 | 0.8970 |
0.3583 | 14.0 | 1302 | 0.3797 | 0.9152 | 0.9119 | 0.9257 | 0.9152 |
0.3421 | 15.0 | 1395 | 0.3431 | 0.9273 | 0.9263 | 0.9379 | 0.9273 |
0.3624 | 16.0 | 1488 | 0.3010 | 0.9273 | 0.9265 | 0.9389 | 0.9273 |
0.2069 | 17.0 | 1581 | 0.2989 | 0.9152 | 0.9146 | 0.9260 | 0.9152 |
0.1639 | 18.0 | 1674 | 0.2797 | 0.9212 | 0.9205 | 0.9300 | 0.9212 |
0.2428 | 19.0 | 1767 | 0.2815 | 0.9273 | 0.9263 | 0.9379 | 0.9273 |
0.304 | 20.0 | 1860 | 0.2587 | 0.9333 | 0.9325 | 0.9432 | 0.9333 |
0.1349 | 21.0 | 1953 | 0.2617 | 0.9273 | 0.9266 | 0.9378 | 0.9273 |
0.2299 | 22.0 | 2046 | 0.2552 | 0.9333 | 0.9325 | 0.9432 | 0.9333 |
0.0894 | 23.0 | 2139 | 0.2560 | 0.9152 | 0.9149 | 0.9222 | 0.9152 |
0.1049 | 24.0 | 2232 | 0.2689 | 0.9152 | 0.9146 | 0.9226 | 0.9152 |
0.1201 | 25.0 | 2325 | 0.2921 | 0.9152 | 0.9144 | 0.9223 | 0.9152 |
0.1162 | 26.0 | 2418 | 0.3317 | 0.9212 | 0.9206 | 0.9283 | 0.9212 |
0.049 | 27.0 | 2511 | 0.2916 | 0.9273 | 0.9266 | 0.9346 | 0.9273 |
0.107 | 28.0 | 2604 | 0.2921 | 0.9273 | 0.9266 | 0.9346 | 0.9273 |
0.0521 | 29.0 | 2697 | 0.3267 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.1911 | 30.0 | 2790 | 0.3661 | 0.9091 | 0.9089 | 0.9147 | 0.9091 |
0.1636 | 31.0 | 2883 | 0.3444 | 0.9152 | 0.9147 | 0.9201 | 0.9152 |
0.0615 | 32.0 | 2976 | 0.3879 | 0.9212 | 0.9208 | 0.9277 | 0.9212 |
0.0581 | 33.0 | 3069 | 0.3606 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.1042 | 34.0 | 3162 | 0.3910 | 0.9333 | 0.9327 | 0.9404 | 0.9333 |
0.1468 | 35.0 | 3255 | 0.4503 | 0.9273 | 0.9266 | 0.9346 | 0.9273 |
0.0303 | 36.0 | 3348 | 0.4035 | 0.9152 | 0.9146 | 0.9207 | 0.9152 |
0.0512 | 37.0 | 3441 | 0.4157 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.0627 | 38.0 | 3534 | 0.4399 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.091 | 39.0 | 3627 | 0.4023 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.1877 | 40.0 | 3720 | 0.4463 | 0.9152 | 0.9148 | 0.9187 | 0.9152 |
0.072 | 41.0 | 3813 | 0.4729 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.0611 | 42.0 | 3906 | 0.5000 | 0.9152 | 0.9147 | 0.9201 | 0.9152 |
0.0308 | 43.0 | 3999 | 0.5051 | 0.9091 | 0.9087 | 0.9130 | 0.9091 |
0.0801 | 44.0 | 4092 | 0.5044 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0548 | 45.0 | 4185 | 0.5312 | 0.9212 | 0.9211 | 0.9259 | 0.9212 |
0.054 | 46.0 | 4278 | 0.5439 | 0.8970 | 0.8965 | 0.9031 | 0.8970 |
0.036 | 47.0 | 4371 | 0.5276 | 0.8970 | 0.8963 | 0.9027 | 0.8970 |
0.0172 | 48.0 | 4464 | 0.5379 | 0.8970 | 0.8966 | 0.9003 | 0.8970 |
0.0573 | 49.0 | 4557 | 0.5380 | 0.9152 | 0.9146 | 0.9207 | 0.9152 |
0.0593 | 50.0 | 4650 | 0.5323 | 0.9091 | 0.9089 | 0.9113 | 0.9091 |
0.073 | 51.0 | 4743 | 0.5931 | 0.9030 | 0.9029 | 0.9110 | 0.9030 |
0.0959 | 52.0 | 4836 | 0.5285 | 0.9152 | 0.9148 | 0.9187 | 0.9152 |
0.0251 | 53.0 | 4929 | 0.5081 | 0.9152 | 0.9150 | 0.9173 | 0.9152 |
0.0129 | 54.0 | 5022 | 0.5469 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.0073 | 55.0 | 5115 | 0.5533 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0922 | 56.0 | 5208 | 0.5499 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0468 | 57.0 | 5301 | 0.5510 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0217 | 58.0 | 5394 | 0.5798 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0949 | 59.0 | 5487 | 0.5748 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0569 | 60.0 | 5580 | 0.5744 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0187 | 61.0 | 5673 | 0.5989 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0333 | 62.0 | 5766 | 0.6353 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.002 | 63.0 | 5859 | 0.6033 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0169 | 64.0 | 5952 | 0.6128 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0303 | 65.0 | 6045 | 0.6143 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.0451 | 66.0 | 6138 | 0.6139 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0291 | 67.0 | 6231 | 0.6058 | 0.9152 | 0.9148 | 0.9187 | 0.9152 |
0.0163 | 68.0 | 6324 | 0.6154 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.0181 | 69.0 | 6417 | 0.5810 | 0.9091 | 0.9089 | 0.9114 | 0.9091 |
0.0441 | 70.0 | 6510 | 0.6019 | 0.9152 | 0.9147 | 0.9203 | 0.9152 |
0.0395 | 71.0 | 6603 | 0.6018 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.0229 | 72.0 | 6696 | 0.6280 | 0.9091 | 0.9084 | 0.9152 | 0.9091 |
0.0509 | 73.0 | 6789 | 0.6442 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0178 | 74.0 | 6882 | 0.6510 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0048 | 75.0 | 6975 | 0.6086 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0207 | 76.0 | 7068 | 0.6676 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.0354 | 77.0 | 7161 | 0.6055 | 0.9152 | 0.9148 | 0.9187 | 0.9152 |
0.0233 | 78.0 | 7254 | 0.6043 | 0.9152 | 0.9150 | 0.9173 | 0.9152 |
0.0522 | 79.0 | 7347 | 0.6388 | 0.9152 | 0.9148 | 0.9187 | 0.9152 |
0.0519 | 80.0 | 7440 | 0.6531 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0129 | 81.0 | 7533 | 0.6346 | 0.9212 | 0.9209 | 0.9246 | 0.9212 |
0.0092 | 82.0 | 7626 | 0.6650 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0289 | 83.0 | 7719 | 0.6390 | 0.9091 | 0.9089 | 0.9114 | 0.9091 |
0.0561 | 84.0 | 7812 | 0.6260 | 0.9091 | 0.9090 | 0.9103 | 0.9091 |
0.063 | 85.0 | 7905 | 0.6484 | 0.9212 | 0.9207 | 0.9263 | 0.9212 |
0.0372 | 86.0 | 7998 | 0.6375 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0042 | 87.0 | 8091 | 0.6384 | 0.9152 | 0.9150 | 0.9173 | 0.9152 |
0.0659 | 88.0 | 8184 | 0.6734 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.016 | 89.0 | 8277 | 0.6275 | 0.9091 | 0.9089 | 0.9114 | 0.9091 |
0.0567 | 90.0 | 8370 | 0.6611 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.0467 | 91.0 | 8463 | 0.6528 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0337 | 92.0 | 8556 | 0.6726 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0159 | 93.0 | 8649 | 0.6528 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.1206 | 94.0 | 8742 | 0.6997 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.051 | 95.0 | 8835 | 0.6729 | 0.9152 | 0.9150 | 0.9173 | 0.9152 |
0.0459 | 96.0 | 8928 | 0.6691 | 0.9212 | 0.9209 | 0.9246 | 0.9212 |
0.0338 | 97.0 | 9021 | 0.6332 | 0.9212 | 0.9207 | 0.9264 | 0.9212 |
0.0823 | 98.0 | 9114 | 0.6550 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
0.0259 | 99.0 | 9207 | 0.6553 | 0.9212 | 0.9209 | 0.9246 | 0.9212 |
0.2724 | 100.0 | 9300 | 0.6473 | 0.9273 | 0.9268 | 0.9323 | 0.9273 |
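Validation loss bottoms out around epoch 22 (0.2552) and drifts upward afterwards while training loss keeps shrinking, a typical overfitting pattern. A quick sketch of locating that point, using (epoch, validation loss) pairs excerpted from the table above:

```python
# Excerpt of (epoch -> validation loss) from the results table.
val_loss = {
    13: 0.4633, 15: 0.3431, 18: 0.2797, 20: 0.2587, 21: 0.2617,
    22: 0.2552, 23: 0.2560, 30: 0.3661, 50: 0.5323, 100: 0.6473,
}

# The checkpoint an early-stopping-on-validation-loss criterion would keep.
best_epoch = min(val_loss, key=val_loss.get)
print(best_epoch, val_loss[best_epoch])  # -> 22 0.2552
```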
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0