
hushem_5x_beit_base_sgd_00001_fold2

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.5367
  • Accuracy: 0.2667
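
A minimal inference sketch for this checkpoint is shown below, using the standard `transformers` Auto classes; the image path is a placeholder, and the printed class name comes from whatever `id2label` mapping was saved with the model.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "hkivancoral/hushem_5x_beit_base_sgd_00001_fold2"

# Load the image processor and the fine-tuned BEiT classifier from the Hub.
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

# "example.jpg" is a placeholder for an image from the target domain.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```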

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
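
As a rough sketch, the values above can be expressed as Hugging Face `TrainingArguments`; the `output_dir` and the per-epoch evaluation setting are assumptions rather than values taken from the original training script.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="hushem_5x_beit_base_sgd_00001_fold2",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed: the results table reports one evaluation per epoch
)
```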

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.5006 | 1.0 | 27 | 1.5552 | 0.2667 |
| 1.5759 | 2.0 | 54 | 1.5543 | 0.2667 |
| 1.5707 | 3.0 | 81 | 1.5535 | 0.2667 |
| 1.578 | 4.0 | 108 | 1.5527 | 0.2667 |
| 1.5119 | 5.0 | 135 | 1.5520 | 0.2667 |
| 1.5352 | 6.0 | 162 | 1.5512 | 0.2667 |
| 1.5348 | 7.0 | 189 | 1.5504 | 0.2667 |
| 1.5693 | 8.0 | 216 | 1.5497 | 0.2667 |
| 1.5386 | 9.0 | 243 | 1.5490 | 0.2667 |
| 1.5189 | 10.0 | 270 | 1.5483 | 0.2667 |
| 1.5597 | 11.0 | 297 | 1.5477 | 0.2667 |
| 1.5706 | 12.0 | 324 | 1.5471 | 0.2667 |
| 1.5157 | 13.0 | 351 | 1.5465 | 0.2667 |
| 1.5457 | 14.0 | 378 | 1.5458 | 0.2667 |
| 1.5087 | 15.0 | 405 | 1.5453 | 0.2667 |
| 1.5323 | 16.0 | 432 | 1.5447 | 0.2667 |
| 1.5363 | 17.0 | 459 | 1.5442 | 0.2667 |
| 1.5615 | 18.0 | 486 | 1.5437 | 0.2667 |
| 1.5236 | 19.0 | 513 | 1.5433 | 0.2667 |
| 1.566 | 20.0 | 540 | 1.5428 | 0.2667 |
| 1.5446 | 21.0 | 567 | 1.5424 | 0.2667 |
| 1.5289 | 22.0 | 594 | 1.5419 | 0.2667 |
| 1.4823 | 23.0 | 621 | 1.5415 | 0.2667 |
| 1.5025 | 24.0 | 648 | 1.5411 | 0.2667 |
| 1.5362 | 25.0 | 675 | 1.5407 | 0.2667 |
| 1.5593 | 26.0 | 702 | 1.5404 | 0.2667 |
| 1.5515 | 27.0 | 729 | 1.5401 | 0.2667 |
| 1.5275 | 28.0 | 756 | 1.5397 | 0.2667 |
| 1.5171 | 29.0 | 783 | 1.5394 | 0.2667 |
| 1.5816 | 30.0 | 810 | 1.5391 | 0.2667 |
| 1.5294 | 31.0 | 837 | 1.5389 | 0.2667 |
| 1.5276 | 32.0 | 864 | 1.5386 | 0.2667 |
| 1.5584 | 33.0 | 891 | 1.5384 | 0.2667 |
| 1.5549 | 34.0 | 918 | 1.5382 | 0.2667 |
| 1.4864 | 35.0 | 945 | 1.5380 | 0.2667 |
| 1.4851 | 36.0 | 972 | 1.5378 | 0.2667 |
| 1.4835 | 37.0 | 999 | 1.5376 | 0.2667 |
| 1.5708 | 38.0 | 1026 | 1.5374 | 0.2667 |
| 1.5448 | 39.0 | 1053 | 1.5373 | 0.2667 |
| 1.4945 | 40.0 | 1080 | 1.5372 | 0.2667 |
| 1.486 | 41.0 | 1107 | 1.5371 | 0.2667 |
| 1.5082 | 42.0 | 1134 | 1.5370 | 0.2667 |
| 1.5323 | 43.0 | 1161 | 1.5369 | 0.2667 |
| 1.4965 | 44.0 | 1188 | 1.5368 | 0.2667 |
| 1.5407 | 45.0 | 1215 | 1.5368 | 0.2667 |
| 1.5084 | 46.0 | 1242 | 1.5368 | 0.2667 |
| 1.5191 | 47.0 | 1269 | 1.5367 | 0.2667 |
| 1.5617 | 48.0 | 1296 | 1.5367 | 0.2667 |
| 1.4992 | 49.0 | 1323 | 1.5367 | 0.2667 |
| 1.4782 | 50.0 | 1350 | 1.5367 | 0.2667 |
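
To recompute the reported accuracy for this checkpoint, a simple sketch is given below; the local `imagefolder` path for the fold-2 validation split is an assumption, and the folder's class ordering must match the model's `id2label` mapping.

```python
import torch
from datasets import load_dataset
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "hkivancoral/hushem_5x_beit_base_sgd_00001_fold2"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id).eval()

# "data/fold2/val" is a placeholder path for the fold-2 validation images,
# arranged one subdirectory per class as expected by the imagefolder builder.
dataset = load_dataset("imagefolder", data_dir="data/fold2/val", split="train")

correct = 0
for example in dataset:
    inputs = processor(images=example["image"].convert("RGB"), return_tensors="pt")
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(-1).item()
    correct += int(pred == example["label"])

print(f"accuracy: {correct / len(dataset):.4f}")
```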

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0