rtdetr-r50-fruits-finetune

This model is a fine-tuned version of PekingU/rtdetr_v2_r50vd (RT-DETRv2 with a ResNet-50-vd backbone) on an unnamed fruit-detection dataset covering six classes: apple, banana, grapes, orange, pineapple, and watermelon. It achieves the following results on the evaluation set:

  • Loss: 9.3146
  • mAP: 0.5605
  • mAP@50: 0.6949
  • mAP@75: 0.5809
  • mAP (small): 0.185
  • mAP (medium): 0.5137
  • mAP (large): 0.7447
  • mAR@1: 0.2844
  • mAR@10: 0.6124
  • mAR@100: 0.6991
  • mAR (small): 0.3366
  • mAR (medium): 0.6682
  • mAR (large): 0.8694
  • mAP (apple): 0.5325
  • mAR@100 (apple): 0.6806
  • mAP (banana): 0.5674
  • mAR@100 (banana): 0.7202
  • mAP (grapes): 0.4649
  • mAR@100 (grapes): 0.5889
  • mAP (orange): 0.5262
  • mAR@100 (orange): 0.6318
  • mAP (pineapple): 0.6096
  • mAR@100 (pineapple): 0.7616
  • mAP (watermelon): 0.6627
  • mAR@100 (watermelon): 0.8116

Model description

RT-DETRv2 is a real-time, end-to-end detection transformer. This checkpoint pairs the ResNet-50-vd backbone variant with a detection head fine-tuned to localize and classify the six fruit classes listed above; a minimal inference sketch follows.
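The snippet below shows one way to run the model with the Transformers object-detection API. The checkpoint id matches this repository's name, but the image path and the 0.5 score threshold are illustrative assumptions.

```python
# A minimal inference sketch, assuming the checkpoint is published as
# hungnguyen2k4/rtdetr-r50-fruits-finetune; the image path and the 0.5
# score threshold are illustrative.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "hungnguyen2k4/rtdetr-r50-fruits-finetune"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("fruit_basket.jpg")  # placeholder test image

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (label, score, xyxy box) detections.
results = processor.post_process_object_detection(
    outputs,
    target_sizes=torch.tensor([image.size[::-1]]),  # (height, width)
    threshold=0.5,
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```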

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a sketch of the equivalent TrainingArguments configuration follows the list:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 300
  • num_epochs: 50
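
These values map directly onto transformers.TrainingArguments. The sketch below is an assumption about how the run was configured: the output_dir is a placeholder, and data loading, augmentation, and any other settings not listed above are not recorded in this card.

```python
# A minimal sketch, assuming a standard transformers Trainer setup;
# only the values listed above are taken from the actual run.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rtdetr-r50-fruits-finetune",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW, torch implementation
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=300,
    num_train_epochs=50,
)
```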

Training results

| Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (S) | mAP (M) | mAP (L) | mAR@1 | mAR@10 | mAR@100 | mAR (S) | mAR (M) | mAR (L) | mAP Apple | mAR@100 Apple | mAP Banana | mAR@100 Banana | mAP Grapes | mAR@100 Grapes | mAP Orange | mAR@100 Orange | mAP Pineapple | mAR@100 Pineapple | mAP Watermelon | mAR@100 Watermelon |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 38.0701 | 1.0 | 750 | 12.8953 | 0.4053 | 0.5286 | 0.4302 | 0.1257 | 0.3378 | 0.5956 | 0.2559 | 0.5675 | 0.695 | 0.3499 | 0.6719 | 0.8689 | 0.3628 | 0.6518 | 0.4598 | 0.722 | 0.3455 | 0.581 | 0.4154 | 0.6327 | 0.445 | 0.7562 | 0.4031 | 0.8261 |
| 16.2333 | 2.0 | 1500 | 12.0885 | 0.4426 | 0.5638 | 0.4611 | 0.1286 | 0.3657 | 0.6298 | 0.2564 | 0.5807 | 0.7077 | 0.3537 | 0.6873 | 0.8707 | 0.4255 | 0.6936 | 0.3953 | 0.7331 | 0.3237 | 0.5837 | 0.4223 | 0.6385 | 0.4791 | 0.7591 | 0.6095 | 0.8384 |
| 15.0436 | 3.0 | 2250 | 10.8016 | 0.4825 | 0.6092 | 0.5063 | 0.1513 | 0.4499 | 0.6519 | 0.268 | 0.6011 | 0.714 | 0.3576 | 0.6866 | 0.8803 | 0.4848 | 0.6977 | 0.4365 | 0.7482 | 0.4217 | 0.6146 | 0.4796 | 0.6396 | 0.5299 | 0.7649 | 0.5427 | 0.819 |
| 13.9852 | 4.0 | 3000 | 10.5803 | 0.4888 | 0.6038 | 0.5093 | 0.1676 | 0.4161 | 0.6879 | 0.2684 | 0.6047 | 0.7163 | 0.3522 | 0.6937 | 0.8884 | 0.4744 | 0.6924 | 0.4716 | 0.7464 | 0.3764 | 0.5846 | 0.4892 | 0.6364 | 0.4918 | 0.7862 | 0.6295 | 0.8517 |
| 13.2531 | 5.0 | 3750 | 10.5813 | 0.4892 | 0.6097 | 0.5087 | 0.1503 | 0.4267 | 0.67 | 0.2686 | 0.6033 | 0.7128 | 0.3465 | 0.6768 | 0.8856 | 0.4705 | 0.7004 | 0.4945 | 0.7459 | 0.4393 | 0.6265 | 0.4817 | 0.6404 | 0.5768 | 0.7844 | 0.4723 | 0.779 |
| 12.7843 | 6.0 | 4500 | 10.5538 | 0.5105 | 0.627 | 0.5346 | 0.1718 | 0.4531 | 0.6992 | 0.2778 | 0.614 | 0.7248 | 0.3732 | 0.6982 | 0.89 | 0.4892 | 0.7042 | 0.4914 | 0.757 | 0.4334 | 0.6284 | 0.4971 | 0.6523 | 0.559 | 0.7891 | 0.5926 | 0.8176 |
| 12.4813 | 7.0 | 5250 | 10.3528 | 0.5162 | 0.6353 | 0.5369 | 0.1816 | 0.4506 | 0.715 | 0.2766 | 0.6133 | 0.7247 | 0.3459 | 0.6949 | 0.8991 | 0.4882 | 0.6894 | 0.5163 | 0.7549 | 0.4434 | 0.6266 | 0.5155 | 0.654 | 0.5526 | 0.8036 | 0.5812 | 0.8196 |
| 12.1635 | 8.0 | 6000 | 10.2182 | 0.5184 | 0.6393 | 0.5381 | 0.1545 | 0.4505 | 0.7239 | 0.2764 | 0.6017 | 0.7074 | 0.3088 | 0.6712 | 0.8904 | 0.4885 | 0.684 | 0.5107 | 0.7269 | 0.4415 | 0.6026 | 0.5227 | 0.6526 | 0.5255 | 0.7533 | 0.6213 | 0.8253 |
| 11.8101 | 9.0 | 6750 | 10.8047 | 0.5482 | 0.6732 | 0.5733 | 0.2043 | 0.4989 | 0.7285 | 0.2798 | 0.6223 | 0.7331 | 0.3938 | 0.715 | 0.8891 | 0.5186 | 0.711 | 0.5382 | 0.7544 | 0.4407 | 0.6226 | 0.5137 | 0.6566 | 0.6273 | 0.8011 | 0.651 | 0.8528 |
| 11.6156 | 10.0 | 7500 | 9.8514 | 0.547 | 0.6712 | 0.5727 | 0.2163 | 0.4783 | 0.7354 | 0.2818 | 0.6241 | 0.7362 | 0.3767 | 0.7134 | 0.8982 | 0.5216 | 0.7077 | 0.5341 | 0.7676 | 0.477 | 0.651 | 0.5268 | 0.6556 | 0.5995 | 0.8 | 0.6232 | 0.8352 |
| 11.2539 | 11.0 | 8250 | 10.2972 | 0.5279 | 0.6517 | 0.5506 | 0.1912 | 0.4905 | 0.7027 | 0.2797 | 0.617 | 0.7281 | 0.3743 | 0.7145 | 0.8858 | 0.4932 | 0.6932 | 0.5143 | 0.7243 | 0.4653 | 0.6433 | 0.5098 | 0.659 | 0.5449 | 0.7931 | 0.64 | 0.856 |
| 10.9801 | 12.0 | 9000 | 9.8289 | 0.5277 | 0.6477 | 0.5495 | 0.1891 | 0.457 | 0.7191 | 0.282 | 0.6186 | 0.7327 | 0.3811 | 0.7078 | 0.8919 | 0.5019 | 0.721 | 0.5441 | 0.7602 | 0.461 | 0.6302 | 0.5134 | 0.6537 | 0.5592 | 0.7866 | 0.5866 | 0.8443 |
| 10.792 | 13.0 | 9750 | 9.7453 | 0.5611 | 0.6865 | 0.5872 | 0.2055 | 0.5278 | 0.7372 | 0.2818 | 0.6263 | 0.7384 | 0.3926 | 0.7247 | 0.8958 | 0.5114 | 0.7146 | 0.5696 | 0.7477 | 0.4692 | 0.6436 | 0.5258 | 0.6599 | 0.619 | 0.7993 | 0.6718 | 0.8653 |
| 10.6075 | 14.0 | 10500 | 10.0674 | 0.5327 | 0.6547 | 0.5597 | 0.2109 | 0.4782 | 0.7103 | 0.2778 | 0.6115 | 0.7187 | 0.381 | 0.6837 | 0.8833 | 0.5085 | 0.7109 | 0.5584 | 0.7635 | 0.4573 | 0.6342 | 0.5217 | 0.6541 | 0.5923 | 0.771 | 0.5578 | 0.7784 |
| 10.3727 | 15.0 | 11250 | 9.7489 | 0.5564 | 0.6828 | 0.5788 | 0.2262 | 0.5108 | 0.7317 | 0.2852 | 0.626 | 0.7343 | 0.4145 | 0.7091 | 0.8897 | 0.5386 | 0.7263 | 0.5431 | 0.7562 | 0.4739 | 0.6311 | 0.5251 | 0.655 | 0.5935 | 0.7877 | 0.6644 | 0.8494 |
| 10.3465 | 16.0 | 12000 | 9.7020 | 0.5486 | 0.6735 | 0.5708 | 0.1933 | 0.4986 | 0.7405 | 0.2815 | 0.6224 | 0.7253 | 0.3667 | 0.6965 | 0.8909 | 0.5168 | 0.6999 | 0.5453 | 0.7435 | 0.4554 | 0.6134 | 0.5307 | 0.6549 | 0.6012 | 0.7913 | 0.6422 | 0.8491 |
| 10.1046 | 17.0 | 12750 | 9.5060 | 0.5466 | 0.6692 | 0.5693 | 0.2042 | 0.5009 | 0.7297 | 0.2852 | 0.6176 | 0.7236 | 0.3737 | 0.6935 | 0.8876 | 0.5131 | 0.7063 | 0.5389 | 0.7313 | 0.4666 | 0.6179 | 0.5311 | 0.6554 | 0.5802 | 0.7822 | 0.6498 | 0.8483 |
| 10.0658 | 18.0 | 13500 | 9.6207 | 0.5528 | 0.6738 | 0.5764 | 0.1977 | 0.4942 | 0.738 | 0.2856 | 0.6211 | 0.7295 | 0.3757 | 0.6988 | 0.8924 | 0.5104 | 0.703 | 0.5788 | 0.7672 | 0.4659 | 0.6329 | 0.5241 | 0.6555 | 0.6281 | 0.8004 | 0.6096 | 0.8179 |
| 9.8429 | 19.0 | 14250 | 9.6473 | 0.5442 | 0.6686 | 0.568 | 0.1592 | 0.5016 | 0.7304 | 0.2784 | 0.6154 | 0.7188 | 0.335 | 0.6993 | 0.8834 | 0.5197 | 0.7043 | 0.5429 | 0.7223 | 0.4758 | 0.6234 | 0.5134 | 0.6428 | 0.6053 | 0.7819 | 0.6081 | 0.8384 |
| 9.6844 | 20.0 | 15000 | 9.7822 | 0.5298 | 0.6493 | 0.5526 | 0.1638 | 0.4825 | 0.716 | 0.2835 | 0.6183 | 0.7215 | 0.354 | 0.7028 | 0.8842 | 0.4863 | 0.6993 | 0.5345 | 0.7523 | 0.457 | 0.6034 | 0.5231 | 0.6505 | 0.5736 | 0.7895 | 0.6044 | 0.8338 |
| 9.5344 | 21.0 | 15750 | 9.6779 | 0.5413 | 0.6699 | 0.563 | 0.1829 | 0.504 | 0.7287 | 0.2825 | 0.6104 | 0.7167 | 0.3543 | 0.6859 | 0.8857 | 0.5071 | 0.7046 | 0.5517 | 0.7304 | 0.4368 | 0.5952 | 0.5276 | 0.6528 | 0.6004 | 0.7786 | 0.6243 | 0.8384 |
| 9.5283 | 22.0 | 16500 | 9.4513 | 0.5527 | 0.6798 | 0.5752 | 0.1922 | 0.4983 | 0.7374 | 0.2856 | 0.6197 | 0.7241 | 0.3399 | 0.6986 | 0.8921 | 0.5164 | 0.7096 | 0.5481 | 0.7342 | 0.4783 | 0.6187 | 0.5352 | 0.6536 | 0.6259 | 0.7833 | 0.6122 | 0.8455 |
| 9.3026 | 23.0 | 17250 | 9.8924 | 0.5438 | 0.6727 | 0.566 | 0.1809 | 0.4847 | 0.7355 | 0.2844 | 0.612 | 0.7137 | 0.3566 | 0.6846 | 0.8833 | 0.5074 | 0.6965 | 0.5374 | 0.7176 | 0.4386 | 0.601 | 0.5245 | 0.6441 | 0.6282 | 0.7851 | 0.6265 | 0.8378 |
| 9.1685 | 24.0 | 18000 | 9.8565 | 0.542 | 0.6669 | 0.5661 | 0.1746 | 0.4906 | 0.7318 | 0.2808 | 0.6129 | 0.7152 | 0.3435 | 0.6809 | 0.8881 | 0.5073 | 0.701 | 0.5359 | 0.7153 | 0.4654 | 0.6131 | 0.5157 | 0.6418 | 0.6125 | 0.7768 | 0.6154 | 0.8432 |
| 9.1399 | 25.0 | 18750 | 9.6471 | 0.5451 | 0.6675 | 0.5711 | 0.1688 | 0.5026 | 0.7315 | 0.284 | 0.612 | 0.7127 | 0.3224 | 0.6884 | 0.8819 | 0.5259 | 0.7087 | 0.5395 | 0.715 | 0.4587 | 0.6059 | 0.5098 | 0.6428 | 0.6023 | 0.7768 | 0.6347 | 0.8267 |
| 9.0289 | 26.0 | 19500 | 9.5724 | 0.557 | 0.6841 | 0.5766 | 0.2086 | 0.5107 | 0.7367 | 0.2843 | 0.6211 | 0.7247 | 0.3792 | 0.7023 | 0.8823 | 0.5254 | 0.6991 | 0.5637 | 0.7459 | 0.4627 | 0.6197 | 0.5237 | 0.6474 | 0.6209 | 0.7862 | 0.6456 | 0.8497 |
| 8.9333 | 27.0 | 20250 | 9.5939 | 0.5613 | 0.693 | 0.5848 | 0.2057 | 0.5249 | 0.7396 | 0.281 | 0.6218 | 0.7246 | 0.3898 | 0.7032 | 0.8797 | 0.5296 | 0.7126 | 0.5669 | 0.7448 | 0.466 | 0.6061 | 0.5311 | 0.6542 | 0.608 | 0.7678 | 0.6664 | 0.8619 |
| 8.8533 | 28.0 | 21000 | 9.3918 | 0.5675 | 0.6971 | 0.5937 | 0.1952 | 0.5295 | 0.7464 | 0.2871 | 0.62 | 0.7207 | 0.3738 | 0.6952 | 0.8803 | 0.5417 | 0.7062 | 0.564 | 0.7391 | 0.4679 | 0.6101 | 0.5283 | 0.6456 | 0.6364 | 0.7786 | 0.6668 | 0.8446 |
| 8.8027 | 29.0 | 21750 | 9.7049 | 0.5481 | 0.6712 | 0.5717 | 0.1715 | 0.504 | 0.7358 | 0.2841 | 0.6162 | 0.7133 | 0.3474 | 0.685 | 0.8844 | 0.4978 | 0.6834 | 0.5561 | 0.7328 | 0.4639 | 0.5924 | 0.5154 | 0.6462 | 0.6141 | 0.7815 | 0.6414 | 0.8438 |
| 8.6288 | 30.0 | 22500 | 9.7506 | 0.5499 | 0.6741 | 0.5737 | 0.1703 | 0.4933 | 0.7392 | 0.2852 | 0.617 | 0.7096 | 0.3378 | 0.6841 | 0.8771 | 0.5164 | 0.688 | 0.5545 | 0.7331 | 0.4666 | 0.5878 | 0.5032 | 0.6347 | 0.6209 | 0.7764 | 0.638 | 0.8375 |
| 8.5941 | 31.0 | 23250 | 9.4781 | 0.545 | 0.6737 | 0.5698 | 0.1708 | 0.4956 | 0.7317 | 0.2854 | 0.6144 | 0.7094 | 0.3638 | 0.6792 | 0.8739 | 0.5007 | 0.6855 | 0.5571 | 0.7305 | 0.4773 | 0.6097 | 0.5136 | 0.6361 | 0.5776 | 0.7641 | 0.6435 | 0.8307 |
| 8.4333 | 32.0 | 24000 | 9.4673 | 0.5469 | 0.676 | 0.57 | 0.19 | 0.4888 | 0.7357 | 0.2828 | 0.6136 | 0.7049 | 0.3491 | 0.6717 | 0.8742 | 0.5352 | 0.6931 | 0.5523 | 0.7277 | 0.465 | 0.5871 | 0.509 | 0.6309 | 0.596 | 0.7696 | 0.6239 | 0.8213 |
| 8.3901 | 33.0 | 24750 | 9.4711 | 0.5478 | 0.6767 | 0.5743 | 0.1655 | 0.4894 | 0.7399 | 0.2827 | 0.6153 | 0.7086 | 0.342 | 0.6823 | 0.8744 | 0.5215 | 0.6981 | 0.5565 | 0.7333 | 0.4664 | 0.5968 | 0.5204 | 0.6405 | 0.59 | 0.7587 | 0.6319 | 0.8244 |
| 8.2694 | 34.0 | 25500 | 9.4160 | 0.5591 | 0.6874 | 0.5817 | 0.1666 | 0.5147 | 0.7451 | 0.2859 | 0.6192 | 0.7136 | 0.3636 | 0.6854 | 0.8785 | 0.5305 | 0.6947 | 0.5576 | 0.7286 | 0.4737 | 0.6068 | 0.526 | 0.6412 | 0.6089 | 0.7764 | 0.658 | 0.8341 |
| 8.2259 | 35.0 | 26250 | 9.4956 | 0.5547 | 0.6819 | 0.5749 | 0.1617 | 0.4974 | 0.7442 | 0.286 | 0.6173 | 0.7117 | 0.3353 | 0.6845 | 0.8788 | 0.5271 | 0.7009 | 0.5654 | 0.7309 | 0.469 | 0.6033 | 0.5269 | 0.6415 | 0.6051 | 0.7696 | 0.6344 | 0.8239 |
| 8.0849 | 36.0 | 27000 | 9.4359 | 0.549 | 0.6764 | 0.568 | 0.1642 | 0.4918 | 0.739 | 0.2842 | 0.6134 | 0.6999 | 0.3197 | 0.6663 | 0.8753 | 0.5115 | 0.6845 | 0.5707 | 0.7357 | 0.4628 | 0.587 | 0.5196 | 0.6279 | 0.6033 | 0.7587 | 0.6259 | 0.8057 |
| 8.121 | 37.0 | 27750 | 9.4808 | 0.5499 | 0.6809 | 0.5709 | 0.1673 | 0.4902 | 0.7379 | 0.2825 | 0.6117 | 0.7036 | 0.3445 | 0.668 | 0.8735 | 0.5216 | 0.6906 | 0.5583 | 0.726 | 0.4643 | 0.598 | 0.5218 | 0.6352 | 0.6002 | 0.7667 | 0.6335 | 0.8054 |
| 8.0253 | 38.0 | 28500 | 9.4835 | 0.5485 | 0.6818 | 0.5699 | 0.1751 | 0.4982 | 0.7364 | 0.2815 | 0.6075 | 0.6957 | 0.3254 | 0.66 | 0.8701 | 0.5278 | 0.6809 | 0.5436 | 0.7141 | 0.4618 | 0.5864 | 0.5242 | 0.6315 | 0.588 | 0.7525 | 0.6454 | 0.8088 |
| 7.9279 | 39.0 | 29250 | 9.2876 | 0.5605 | 0.6937 | 0.5804 | 0.1972 | 0.5068 | 0.7463 | 0.2838 | 0.6173 | 0.7057 | 0.3511 | 0.6749 | 0.8742 | 0.5394 | 0.6951 | 0.5655 | 0.726 | 0.47 | 0.5986 | 0.5225 | 0.6313 | 0.612 | 0.7638 | 0.6533 | 0.8196 |
| 7.949 | 40.0 | 30000 | 9.4416 | 0.5555 | 0.6861 | 0.5779 | 0.1839 | 0.5045 | 0.7419 | 0.2845 | 0.6097 | 0.7002 | 0.3419 | 0.6691 | 0.8715 | 0.5195 | 0.6796 | 0.5593 | 0.7295 | 0.4681 | 0.5889 | 0.5226 | 0.6325 | 0.6166 | 0.7514 | 0.6471 | 0.819 |
| 7.9158 | 41.0 | 30750 | 9.2813 | 0.5639 | 0.6968 | 0.5863 | 0.195 | 0.5181 | 0.7464 | 0.2854 | 0.6163 | 0.7078 | 0.36 | 0.6808 | 0.8722 | 0.537 | 0.6955 | 0.576 | 0.7354 | 0.4683 | 0.5953 | 0.5287 | 0.6381 | 0.6163 | 0.7591 | 0.6572 | 0.8236 |
| 7.822 | 42.0 | 31500 | 9.3406 | 0.5631 | 0.6945 | 0.5849 | 0.1913 | 0.5141 | 0.7472 | 0.2846 | 0.6172 | 0.7068 | 0.3523 | 0.6772 | 0.8764 | 0.538 | 0.6911 | 0.5648 | 0.7336 | 0.4648 | 0.5882 | 0.528 | 0.6356 | 0.6237 | 0.7707 | 0.6592 | 0.8219 |
| 7.7186 | 43.0 | 32250 | 9.3332 | 0.5594 | 0.691 | 0.583 | 0.1926 | 0.5096 | 0.7445 | 0.2851 | 0.6133 | 0.7023 | 0.3454 | 0.6705 | 0.871 | 0.5336 | 0.685 | 0.5676 | 0.7331 | 0.4605 | 0.5908 | 0.5287 | 0.6345 | 0.6083 | 0.7594 | 0.6579 | 0.8108 |
| 7.6081 | 44.0 | 33000 | 9.4212 | 0.5639 | 0.6941 | 0.5854 | 0.1879 | 0.5156 | 0.7484 | 0.2856 | 0.6148 | 0.7001 | 0.3367 | 0.6708 | 0.869 | 0.5348 | 0.6873 | 0.571 | 0.7292 | 0.4661 | 0.5886 | 0.5231 | 0.6323 | 0.6257 | 0.7587 | 0.6628 | 0.8045 |
| 7.648 | 45.0 | 33750 | 9.3147 | 0.5614 | 0.6932 | 0.5833 | 0.1819 | 0.5142 | 0.7456 | 0.2853 | 0.6133 | 0.7011 | 0.3368 | 0.6708 | 0.8704 | 0.5375 | 0.6939 | 0.5686 | 0.7246 | 0.4645 | 0.5893 | 0.5271 | 0.633 | 0.6097 | 0.7634 | 0.6608 | 0.8023 |
| 7.6019 | 46.0 | 34500 | 9.2919 | 0.5638 | 0.6967 | 0.5843 | 0.1959 | 0.5135 | 0.7485 | 0.2851 | 0.6162 | 0.7025 | 0.35 | 0.6722 | 0.869 | 0.5388 | 0.6865 | 0.5762 | 0.7325 | 0.4681 | 0.5912 | 0.5276 | 0.6334 | 0.6139 | 0.7572 | 0.6583 | 0.8139 |
| 7.502 | 47.0 | 35250 | 9.3337 | 0.5604 | 0.6944 | 0.5822 | 0.1825 | 0.5094 | 0.7468 | 0.2849 | 0.613 | 0.6975 | 0.339 | 0.6644 | 0.8677 | 0.5336 | 0.6804 | 0.5705 | 0.7261 | 0.4618 | 0.5829 | 0.5261 | 0.633 | 0.6169 | 0.7594 | 0.6535 | 0.8031 |
| 7.4597 | 48.0 | 36000 | 9.3463 | 0.5602 | 0.6946 | 0.5809 | 0.1891 | 0.5127 | 0.7442 | 0.2839 | 0.6129 | 0.6971 | 0.3426 | 0.6644 | 0.8668 | 0.5312 | 0.6777 | 0.5718 | 0.7191 | 0.4634 | 0.5843 | 0.5271 | 0.6326 | 0.6055 | 0.7576 | 0.662 | 0.8111 |
| 7.5524 | 49.0 | 36750 | 9.3212 | 0.561 | 0.6948 | 0.5808 | 0.1859 | 0.5141 | 0.7447 | 0.2841 | 0.6111 | 0.6992 | 0.3404 | 0.6666 | 0.8691 | 0.5317 | 0.6761 | 0.5663 | 0.7225 | 0.4663 | 0.5878 | 0.5273 | 0.6324 | 0.612 | 0.7656 | 0.6627 | 0.8111 |
| 7.5098 | 50.0 | 37500 | 9.3146 | 0.5605 | 0.6949 | 0.5809 | 0.185 | 0.5137 | 0.7447 | 0.2844 | 0.6124 | 0.6991 | 0.3366 | 0.6682 | 0.8694 | 0.5325 | 0.6806 | 0.5674 | 0.7202 | 0.4649 | 0.5889 | 0.5262 | 0.6318 | 0.6096 | 0.7616 | 0.6627 | 0.8116 |
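
The mAP and mAR columns follow the COCO evaluation protocol: mAP is averaged over IoU thresholds 0.50:0.95, the @50/@75 variants fix a single IoU threshold, the (S)/(M)/(L) buckets split objects by area, and mAR@k caps detections per image at k. As an illustration only, the snippet below computes the same family of metrics with torchmetrics on toy values; the library choice and all numbers in the snippet are assumptions, since the card does not say how the table was produced.

```python
# Illustrative only: COCO-style mAP/mAR via torchmetrics on toy data.
# torchmetrics is an assumed stand-in; the card does not name the
# evaluation library that produced the table above.
import torch
from torchmetrics.detection import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# One image: a single predicted box vs. one ground-truth box.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),  # xyxy, pixels
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([0]),  # 0 = apple (assumed class id)
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 215.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
# Keys mirror the table columns: map, map_50, map_75, map_small,
# map_medium, map_large, mar_1, mar_10, mar_100, plus per-class values.
print({k: v for k, v in results.items() if k.startswith(("map", "mar"))})
```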

Framework versions

  • Transformers 4.53.0.dev0
  • Pytorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1