# mobileViTV2-64

This model is a fine-tuned version of [apple/mobilevitv2-1.0-imagenet1k-256](https://huggingface.co/apple/mobilevitv2-1.0-imagenet1k-256) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.3307
- Accuracy: 0.9106
- F1: 0.9093
- Precision: 0.9118
- Recall: 0.9106
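
No usage example ships with this card; the sketch below shows one plausible way to run inference, assuming the standard transformers image-classification pipeline and the Hub ID Master-Rapha7/mobileViTV2-64 (the image path is hypothetical):

```python
# Minimal inference sketch; "example.jpg" is a hypothetical input image.
from transformers import pipeline
from PIL import Image

classifier = pipeline("image-classification", model="Master-Rapha7/mobileViTV2-64")

image = Image.open("example.jpg")
predictions = classifier(image)  # list of {"label": ..., "score": ...} dicts
print(predictions)
```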

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
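
For orientation, here is a minimal sketch of how these settings map onto transformers TrainingArguments; the output_dir is hypothetical and the dataset/Trainer wiring is omitted, so this is not the exact training script:

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mobileViTV2-64",   # hypothetical output directory
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",           # AdamW; betas=(0.9, 0.999) and eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```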

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.6065 | 1.0 | 364 | 1.6103 | 0.1860 | 0.1866 | 0.1893 | 0.1860 |
| 1.5721 | 2.0 | 728 | 1.5804 | 0.2851 | 0.2857 | 0.2889 | 0.2851 |
| 1.4793 | 3.0 | 1092 | 1.5111 | 0.4229 | 0.4057 | 0.4006 | 0.4229 |
| 1.316 | 4.0 | 1456 | 1.2841 | 0.5028 | 0.4652 | 0.5157 | 0.5028 |
| 1.1397 | 5.0 | 1820 | 1.0520 | 0.5909 | 0.5474 | 0.6502 | 0.5909 |
| 0.8639 | 6.0 | 2184 | 0.8194 | 0.7163 | 0.6970 | 0.7325 | 0.7163 |
| 0.7371 | 7.0 | 2548 | 0.6773 | 0.7796 | 0.7711 | 0.7927 | 0.7796 |
| 0.6451 | 8.0 | 2912 | 0.5546 | 0.8292 | 0.8258 | 0.8334 | 0.8292 |
| 0.5299 | 9.0 | 3276 | 0.4800 | 0.8485 | 0.8461 | 0.8477 | 0.8485 |
| 0.387 | 10.0 | 3640 | 0.4091 | 0.8760 | 0.8737 | 0.8759 | 0.8760 |
| 0.3903 | 11.0 | 4004 | 0.3547 | 0.8884 | 0.8867 | 0.8887 | 0.8884 |
| 0.3513 | 12.0 | 4368 | 0.3207 | 0.9008 | 0.8983 | 0.9038 | 0.9008 |
| 0.3145 | 13.0 | 4732 | 0.3213 | 0.8967 | 0.8939 | 0.9009 | 0.8967 |
| 0.1838 | 14.0 | 5096 | 0.3013 | 0.8939 | 0.8924 | 0.8941 | 0.8939 |
| 0.3438 | 15.0 | 5460 | 0.3229 | 0.8857 | 0.8843 | 0.8850 | 0.8857 |
| 0.1913 | 16.0 | 5824 | 0.2568 | 0.9174 | 0.9159 | 0.9191 | 0.9174 |
| 0.2078 | 17.0 | 6188 | 0.2609 | 0.9187 | 0.9169 | 0.9206 | 0.9187 |
| 0.2061 | 18.0 | 6552 | 0.2811 | 0.9077 | 0.9061 | 0.9076 | 0.9077 |
| 0.2806 | 19.0 | 6916 | 0.2536 | 0.9242 | 0.9230 | 0.9260 | 0.9242 |
| 0.2495 | 20.0 | 7280 | 0.2881 | 0.9091 | 0.9076 | 0.9094 | 0.9091 |
| 0.0361 | 21.0 | 7644 | 0.2875 | 0.9311 | 0.9301 | 0.9331 | 0.9311 |
| 0.1811 | 22.0 | 8008 | 0.3067 | 0.9063 | 0.9050 | 0.9056 | 0.9063 |
| 0.1129 | 23.0 | 8372 | 0.2996 | 0.9050 | 0.9047 | 0.9053 | 0.9050 |
| 0.1138 | 24.0 | 8736 | 0.2970 | 0.9063 | 0.9060 | 0.9066 | 0.9063 |
| 0.3135 | 25.0 | 9100 | 0.3723 | 0.8967 | 0.8968 | 0.8972 | 0.8967 |
| 0.0828 | 26.0 | 9464 | 0.3574 | 0.9063 | 0.9060 | 0.9059 | 0.9063 |
| 0.0783 | 27.0 | 9828 | 0.4087 | 0.8939 | 0.8926 | 0.8926 | 0.8939 |
| 0.051 | 28.0 | 10192 | 0.3713 | 0.9063 | 0.9060 | 0.9068 | 0.9063 |
| 0.0744 | 29.0 | 10556 | 0.4470 | 0.8953 | 0.8951 | 0.8958 | 0.8953 |
| 0.0814 | 30.0 | 10920 | 0.4289 | 0.9077 | 0.9085 | 0.9099 | 0.9077 |
| 0.131 | 31.0 | 11284 | 0.4600 | 0.9008 | 0.8996 | 0.8997 | 0.9008 |
| 0.0245 | 32.0 | 11648 | 0.4818 | 0.8981 | 0.8978 | 0.8977 | 0.8981 |
| 0.0541 | 33.0 | 12012 | 0.4678 | 0.9050 | 0.9043 | 0.9040 | 0.9050 |
| 0.1011 | 34.0 | 12376 | 0.5298 | 0.8994 | 0.8985 | 0.8991 | 0.8994 |
| 0.17 | 35.0 | 12740 | 0.5093 | 0.9036 | 0.9026 | 0.9024 | 0.9036 |
| 0.0892 | 36.0 | 13104 | 0.5018 | 0.9063 | 0.9050 | 0.9050 | 0.9063 |
| 0.0246 | 37.0 | 13468 | 0.5520 | 0.9077 | 0.9058 | 0.9061 | 0.9077 |
| 0.0564 | 38.0 | 13832 | 0.5493 | 0.9077 | 0.9075 | 0.9077 | 0.9077 |
| 0.0817 | 39.0 | 14196 | 0.5607 | 0.9091 | 0.9084 | 0.9081 | 0.9091 |
| 0.0056 | 40.0 | 14560 | 0.5990 | 0.8939 | 0.8947 | 0.8961 | 0.8939 |
| 0.0653 | 41.0 | 14924 | 0.5870 | 0.9146 | 0.9136 | 0.9136 | 0.9146 |
| 0.1649 | 42.0 | 15288 | 0.5882 | 0.9050 | 0.9040 | 0.9039 | 0.9050 |
| 0.1057 | 43.0 | 15652 | 0.5924 | 0.9008 | 0.8999 | 0.9002 | 0.9008 |
| 0.0859 | 44.0 | 16016 | 0.5830 | 0.8994 | 0.8994 | 0.8999 | 0.8994 |
| 0.1809 | 45.0 | 16380 | 0.6357 | 0.8953 | 0.8939 | 0.8936 | 0.8953 |
| 0.1285 | 46.0 | 16744 | 0.6617 | 0.8967 | 0.8965 | 0.8975 | 0.8967 |
| 0.1018 | 47.0 | 17108 | 0.6006 | 0.9050 | 0.9044 | 0.9042 | 0.9050 |
| 0.0091 | 48.0 | 17472 | 0.5762 | 0.9091 | 0.9090 | 0.9094 | 0.9091 |
| 0.0368 | 49.0 | 17836 | 0.6097 | 0.9077 | 0.9067 | 0.9071 | 0.9077 |
| 0.0585 | 50.0 | 18200 | 0.6059 | 0.9063 | 0.9059 | 0.9061 | 0.9063 |
| 0.0373 | 51.0 | 18564 | 0.6621 | 0.8953 | 0.8953 | 0.8963 | 0.8953 |
| 0.1672 | 52.0 | 18928 | 0.6081 | 0.9022 | 0.9020 | 0.9019 | 0.9022 |
| 0.0344 | 53.0 | 19292 | 0.6145 | 0.8994 | 0.9002 | 0.9011 | 0.8994 |
| 0.0727 | 54.0 | 19656 | 0.6106 | 0.9036 | 0.9034 | 0.9034 | 0.9036 |
| 0.1997 | 55.0 | 20020 | 0.6037 | 0.9091 | 0.9082 | 0.9090 | 0.9091 |
| 0.0437 | 56.0 | 20384 | 0.5835 | 0.9105 | 0.9100 | 0.9105 | 0.9105 |
| 0.0263 | 57.0 | 20748 | 0.6032 | 0.9063 | 0.9062 | 0.9064 | 0.9063 |
| 0.056 | 58.0 | 21112 | 0.5828 | 0.9105 | 0.9101 | 0.9102 | 0.9105 |
| 0.0422 | 59.0 | 21476 | 0.6179 | 0.9105 | 0.9111 | 0.9129 | 0.9105 |
| 0.0377 | 60.0 | 21840 | 0.6400 | 0.8981 | 0.8997 | 0.9027 | 0.8981 |
| 0.1162 | 61.0 | 22204 | 0.5841 | 0.9105 | 0.9106 | 0.9108 | 0.9105 |
| 0.0407 | 62.0 | 22568 | 0.6017 | 0.9063 | 0.9064 | 0.9067 | 0.9063 |
| 0.0443 | 63.0 | 22932 | 0.6064 | 0.9036 | 0.9031 | 0.9029 | 0.9036 |
| 0.089 | 64.0 | 23296 | 0.6250 | 0.9008 | 0.9011 | 0.9018 | 0.9008 |
| 0.0971 | 65.0 | 23660 | 0.6729 | 0.9022 | 0.9011 | 0.9018 | 0.9022 |
| 0.046 | 66.0 | 24024 | 0.6445 | 0.9063 | 0.9060 | 0.9062 | 0.9063 |
| 0.0387 | 67.0 | 24388 | 0.6070 | 0.9036 | 0.9039 | 0.9046 | 0.9036 |
| 0.0709 | 68.0 | 24752 | 0.5890 | 0.9132 | 0.9131 | 0.9132 | 0.9132 |
| 0.0273 | 69.0 | 25116 | 0.6484 | 0.9008 | 0.9001 | 0.9007 | 0.9008 |
| 0.1951 | 70.0 | 25480 | 0.6336 | 0.9077 | 0.9075 | 0.9075 | 0.9077 |
| 0.0569 | 71.0 | 25844 | 0.6546 | 0.9105 | 0.9104 | 0.9105 | 0.9105 |
| 0.1145 | 72.0 | 26208 | 0.6964 | 0.9036 | 0.9026 | 0.9027 | 0.9036 |
| 0.0352 | 73.0 | 26572 | 0.6657 | 0.9118 | 0.9114 | 0.9115 | 0.9118 |
| 0.0375 | 74.0 | 26936 | 0.6417 | 0.9050 | 0.9054 | 0.9059 | 0.9050 |
| 0.0351 | 75.0 | 27300 | 0.6812 | 0.9091 | 0.9077 | 0.9081 | 0.9091 |
| 0.0675 | 76.0 | 27664 | 0.6445 | 0.9105 | 0.9100 | 0.9103 | 0.9105 |
| 0.0418 | 77.0 | 28028 | 0.7359 | 0.9091 | 0.9073 | 0.9084 | 0.9091 |
| 0.0353 | 78.0 | 28392 | 0.6701 | 0.9022 | 0.9021 | 0.9022 | 0.9022 |
| 0.0221 | 79.0 | 28756 | 0.6607 | 0.9036 | 0.9039 | 0.9051 | 0.9036 |
| 0.2435 | 80.0 | 29120 | 0.6487 | 0.9118 | 0.9115 | 0.9114 | 0.9118 |
| 0.0362 | 81.0 | 29484 | 0.7711 | 0.9077 | 0.9060 | 0.9064 | 0.9077 |
| 0.0116 | 82.0 | 29848 | 0.6276 | 0.9063 | 0.9067 | 0.9073 | 0.9063 |
| 0.001 | 83.0 | 30212 | 0.6564 | 0.9022 | 0.9020 | 0.9022 | 0.9022 |
| 0.013 | 84.0 | 30576 | 0.6576 | 0.9077 | 0.9071 | 0.9072 | 0.9077 |
| 0.0183 | 85.0 | 30940 | 0.7075 | 0.9036 | 0.9033 | 0.9038 | 0.9036 |
| 0.0367 | 86.0 | 31304 | 0.7168 | 0.9118 | 0.9100 | 0.9108 | 0.9118 |
| 0.027 | 87.0 | 31668 | 0.6892 | 0.9132 | 0.9133 | 0.9143 | 0.9132 |
| 0.015 | 88.0 | 32032 | 0.6886 | 0.9077 | 0.9069 | 0.9069 | 0.9077 |
| 0.0435 | 89.0 | 32396 | 0.6863 | 0.9008 | 0.9009 | 0.9012 | 0.9008 |
| 0.0049 | 90.0 | 32760 | 0.6883 | 0.9077 | 0.9072 | 0.9069 | 0.9077 |
| 0.1041 | 91.0 | 33124 | 0.7216 | 0.9008 | 0.9000 | 0.9002 | 0.9008 |
| 0.0465 | 92.0 | 33488 | 0.7032 | 0.9022 | 0.9021 | 0.9026 | 0.9022 |
| 0.0221 | 93.0 | 33852 | 0.7131 | 0.9036 | 0.9025 | 0.9023 | 0.9036 |
| 0.0091 | 94.0 | 34216 | 0.6886 | 0.8953 | 0.8963 | 0.8976 | 0.8953 |
| 0.0322 | 95.0 | 34580 | 0.7213 | 0.9022 | 0.9020 | 0.9024 | 0.9022 |
| 0.0348 | 96.0 | 34944 | 0.7005 | 0.9022 | 0.9016 | 0.9014 | 0.9022 |
| 0.0357 | 97.0 | 35308 | 0.7131 | 0.8967 | 0.8971 | 0.8980 | 0.8967 |
| 0.0363 | 98.0 | 35672 | 0.6947 | 0.9118 | 0.9112 | 0.9114 | 0.9118 |
| 0.0249 | 99.0 | 36036 | 0.6783 | 0.9132 | 0.9126 | 0.9126 | 0.9132 |
| 0.0179 | 100.0 | 36400 | 0.6614 | 0.9036 | 0.9037 | 0.9039 | 0.9036 |
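
Note that recall equals accuracy in every row, which is what weighted-average recall reduces to in single-label classification. A compute_metrics function consistent with these columns might look like the sketch below (an assumption; the exact metric code is not published):

```python
# Sketch of a metrics function matching the reported columns,
# assuming weighted averaging (which makes recall == accuracy).
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```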

### Framework versions

- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.4.0
- Tokenizers 0.21.0