
hushem_1x_beit_base_adamax_001_fold3

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.8097
  • Accuracy: 0.5349
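
For quick inference, the sketch below loads this checkpoint with the Transformers image-classification pipeline. The repository id follows the model name above; the image path is a placeholder you would replace with your own file.

```python
from transformers import pipeline

# Load the fine-tuned BEiT checkpoint as an image-classification pipeline.
classifier = pipeline(
    "image-classification",
    model="hkivancoral/hushem_1x_beit_base_adamax_001_fold3",
)

# "example.png" is a placeholder path to a local image.
predictions = classifier("example.png")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```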

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
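
To reproduce this setup with the Hugging Face `Trainer`, the values above map onto `TrainingArguments` roughly as sketched below. The output directory is an assumed placeholder, per-epoch evaluation is inferred from the results table, and the optimizer settings simply mirror the listed betas and epsilon.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the reported hyperparameters; "./beit_hushem_fold3"
# is a placeholder output directory, not taken from the original run.
training_args = TrainingArguments(
    output_dir="./beit_hushem_fold3",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # inferred from the per-epoch validation log below
)
```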

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.4381          | 0.2326   |
| 2.0527        | 2.0   | 12   | 1.4022          | 0.2558   |
| 2.0527        | 3.0   | 18   | 1.3682          | 0.3256   |
| 1.3782        | 4.0   | 24   | 1.3387          | 0.3953   |
| 1.2679        | 5.0   | 30   | 1.3721          | 0.3256   |
| 1.2679        | 6.0   | 36   | 1.7451          | 0.3488   |
| 1.2756        | 7.0   | 42   | 1.3183          | 0.3953   |
| 1.2756        | 8.0   | 48   | 1.4225          | 0.3023   |
| 1.173         | 9.0   | 54   | 1.4215          | 0.3953   |
| 1.1959        | 10.0  | 60   | 1.4072          | 0.3721   |
| 1.1959        | 11.0  | 66   | 1.4852          | 0.4186   |
| 1.1344        | 12.0  | 72   | 1.4523          | 0.2791   |
| 1.1344        | 13.0  | 78   | 1.4043          | 0.4651   |
| 1.0854        | 14.0  | 84   | 1.3638          | 0.3953   |
| 1.1124        | 15.0  | 90   | 1.4323          | 0.3953   |
| 1.1124        | 16.0  | 96   | 1.4664          | 0.4884   |
| 1.0108        | 17.0  | 102  | 1.5473          | 0.3721   |
| 1.0108        | 18.0  | 108  | 1.2300          | 0.4651   |
| 0.9443        | 19.0  | 114  | 1.2523          | 0.4419   |
| 0.9125        | 20.0  | 120  | 1.4134          | 0.3721   |
| 0.9125        | 21.0  | 126  | 1.1280          | 0.4884   |
| 0.8328        | 22.0  | 132  | 1.1054          | 0.4884   |
| 0.8328        | 23.0  | 138  | 1.6081          | 0.4419   |
| 0.7565        | 24.0  | 144  | 1.0331          | 0.5349   |
| 0.7135        | 25.0  | 150  | 1.6384          | 0.5116   |
| 0.7135        | 26.0  | 156  | 1.9524          | 0.4651   |
| 0.7048        | 27.0  | 162  | 1.1399          | 0.5349   |
| 0.7048        | 28.0  | 168  | 1.0504          | 0.5581   |
| 0.7074        | 29.0  | 174  | 1.0452          | 0.5581   |
| 0.7008        | 30.0  | 180  | 1.4757          | 0.5581   |
| 0.7008        | 31.0  | 186  | 1.0663          | 0.4419   |
| 0.5976        | 32.0  | 192  | 1.0991          | 0.5349   |
| 0.5976        | 33.0  | 198  | 1.5330          | 0.5814   |
| 0.5565        | 34.0  | 204  | 1.1511          | 0.5349   |
| 0.458         | 35.0  | 210  | 1.5836          | 0.5349   |
| 0.458         | 36.0  | 216  | 1.4225          | 0.5581   |
| 0.5542        | 37.0  | 222  | 1.4182          | 0.6047   |
| 0.5542        | 38.0  | 228  | 1.3407          | 0.5581   |
| 0.3706        | 39.0  | 234  | 1.4368          | 0.5581   |
| 0.3087        | 40.0  | 240  | 1.6899          | 0.5814   |
| 0.3087        | 41.0  | 246  | 1.8110          | 0.5116   |
| 0.3001        | 42.0  | 252  | 1.8097          | 0.5349   |
| 0.3001        | 43.0  | 258  | 1.8097          | 0.5349   |
| 0.3061        | 44.0  | 264  | 1.8097          | 0.5349   |
| 0.2986        | 45.0  | 270  | 1.8097          | 0.5349   |
| 0.2986        | 46.0  | 276  | 1.8097          | 0.5349   |
| 0.2791        | 47.0  | 282  | 1.8097          | 0.5349   |
| 0.2791        | 48.0  | 288  | 1.8097          | 0.5349   |
| 0.2908        | 49.0  | 294  | 1.8097          | 0.5349   |
| 0.2986        | 50.0  | 300  | 1.8097          | 0.5349   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
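
As a quick sanity check, the snippet below prints the installed library versions so you can compare them against the list above; it assumes all four packages are importable in your environment.

```python
# Compare the local environment against the versions reported above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # reported: 4.35.2
print("PyTorch:", torch.__version__)              # reported: 2.1.0+cu118
print("Datasets:", datasets.__version__)          # reported: 2.15.0
print("Tokenizers:", tokenizers.__version__)      # reported: 0.15.0
```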

Model size

  • Safetensors: 85.8M params (F32)
