---
library_name: transformers
license: other
base_model: apple/mobilevit-small
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: mobilevit-small_rice-leaf-disease-augmented-v4_fft
    results: []
---

mobilevit-small_rice-leaf-disease-augmented-v4_fft

This model is a fine-tuned version of apple/mobilevit-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4089
  • Accuracy: 0.9295
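For reference, a minimal inference sketch. The Hub repo id below is an assumption pieced together from the model name on this card; substitute the actual repo id and any rice-leaf image path:

```python
from transformers import pipeline

# Assumed Hub repo id -- adjust to the actual upload location.
REPO_ID = "SodaXII/mobilevit-small_rice-leaf-disease-augmented-v4_fft"


def load_classifier(repo_id: str = REPO_ID):
    """Build an image-classification pipeline for the fine-tuned checkpoint.

    The pipeline pulls the checkpoint's image processor from the Hub, so
    resizing/normalization match what was used during fine-tuning.
    """
    return pipeline("image-classification", model=repo_id)


# Usage (requires network access to the Hub):
# preds = load_classifier()("path/to/rice_leaf.jpg")
# preds is a list of {"label": ..., "score": ...} dicts, highest score first.
```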

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_warmup_steps: 256
  • num_epochs: 30
  • mixed_precision_training: Native AMP

Training results

Training Loss Epoch Step Validation Loss Accuracy
2.0561 0.5 64 2.0213 0.2886
1.9819 1.0 128 1.8788 0.5503
1.771 1.5 192 1.5291 0.6107
1.3911 2.0 256 1.0706 0.7349
1.0026 2.5 320 0.7560 0.8054
0.7657 3.0 384 0.6180 0.8356
0.6082 3.5 448 0.5422 0.8389
0.5313 4.0 512 0.4946 0.8523
0.4623 4.5 576 0.4512 0.8758
0.4212 5.0 640 0.4322 0.8792
0.4025 5.5 704 0.4259 0.8893
0.3892 6.0 768 0.4238 0.8859
0.3959 6.5 832 0.4083 0.8859
0.3279 7.0 896 0.3750 0.8826
0.2793 7.5 960 0.3350 0.8993
0.222 8.0 1024 0.3208 0.8960
0.1862 8.5 1088 0.3128 0.8993
0.1717 9.0 1152 0.3049 0.9027
0.1408 9.5 1216 0.3010 0.9027
0.1507 10.0 1280 0.3240 0.9161
0.1369 10.5 1344 0.3063 0.9060
0.1389 11.0 1408 0.3045 0.9060
0.1199 11.5 1472 0.3062 0.9094
0.1003 12.0 1536 0.3131 0.9128
0.0756 12.5 1600 0.3002 0.9228
0.0636 13.0 1664 0.3177 0.9128
0.058 13.5 1728 0.3143 0.9228
0.0566 14.0 1792 0.3136 0.9195
0.0516 14.5 1856 0.3447 0.9161
0.0426 15.0 1920 0.2911 0.9228
0.0513 15.5 1984 0.3028 0.9228
0.0447 16.0 2048 0.3328 0.9195
0.0332 16.5 2112 0.3193 0.9262
0.0358 17.0 2176 0.3385 0.9161
0.0343 17.5 2240 0.3297 0.9295
0.0291 18.0 2304 0.3518 0.9161
0.0287 18.5 2368 0.3224 0.9195
0.0197 19.0 2432 0.3099 0.9228
0.0223 19.5 2496 0.3305 0.9295
0.0282 20.0 2560 0.3378 0.9161
0.0231 20.5 2624 0.3077 0.9228
0.0251 21.0 2688 0.3520 0.9161
0.021 21.5 2752 0.3506 0.9228
0.0222 22.0 2816 0.3561 0.9128
0.016 22.5 2880 0.3482 0.9195
0.0163 23.0 2944 0.3429 0.9228
0.0114 23.5 3008 0.3839 0.9329
0.0106 24.0 3072 0.4066 0.9262
0.0111 24.5 3136 0.4003 0.9329
0.009 25.0 3200 0.4000 0.9262
0.0088 25.5 3264 0.3667 0.9228
0.0057 26.0 3328 0.3587 0.9195
0.0073 26.5 3392 0.3686 0.9329
0.0085 27.0 3456 0.3676 0.9195
0.0087 27.5 3520 0.4251 0.9262
0.0061 28.0 3584 0.3879 0.9195
0.0062 28.5 3648 0.3865 0.9195
0.0068 29.0 3712 0.3943 0.9262
0.0092 29.5 3776 0.4064 0.9228
0.0078 30.0 3840 0.4089 0.9295
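One detail worth noting in the log above: the final checkpoint (loss 0.4089, accuracy 0.9295) is neither the lowest-loss nor the highest-accuracy point, so checkpoint selection matters. A small sketch, using a few rows copied from the table, shows how to pick the best checkpoint by either criterion:

```python
# A few (epoch, validation_loss, accuracy) rows from the training log.
log = [
    (15.0, 0.2911, 0.9228),   # lowest validation loss in the run
    (23.5, 0.3839, 0.9329),   # highest accuracy in the run
    (30.0, 0.4089, 0.9295),   # final checkpoint reported above
]

# Best checkpoint by each criterion.
best_by_loss = min(log, key=lambda row: row[1])
best_by_acc = max(log, key=lambda row: row[2])
```

With `load_best_model_at_end` and an explicit `metric_for_best_model`, the Trainer can apply this selection automatically during training.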

Framework versions

  • Transformers 4.48.3
  • Pytorch 2.5.1+cu124
  • Datasets 3.3.2
  • Tokenizers 0.21.1