# train_multirc_1745950268
This model is a fine-tuned version of mistralai/Mistral-7B-Instruct-v0.3 on the multirc dataset. It achieves the following results on the evaluation set:
- Loss: 0.3137
- Num Input Tokens Seen: 83543088
## Model description
More information needed
## Intended uses & limitations
More information needed
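
In the absence of documented usage, here is a minimal loading sketch. It assumes the adapter repo id `rbelanec/train_multirc_1745950268` (from the model tree below) and that the checkpoint is a PEFT adapter on top of the instruct base model, per the framework versions listed further down; adjust dtype and device placement to your hardware.

```python
# Minimal sketch (assumptions noted above), not an official usage example from the card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.3"
adapter_id = "rbelanec/train_multirc_1745950268"  # assumed adapter repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)

# MultiRC-style prompt: passage + question + candidate answer, expecting True/False.
prompt = "Read the passage and answer True or False: ..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```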
## Training and evaluation data
More information needed
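
The data preparation is not documented. Assuming the MultiRC task from SuperGLUE, a minimal sketch for inspecting the raw data with 🤗 Datasets (the `super_glue` dataset id is an assumption, not stated in the card):

```python
# Sketch only: load the SuperGLUE MultiRC task and look at one example.
from datasets import load_dataset

multirc = load_dataset("super_glue", "multirc")
print(multirc)  # train / validation / test splits

example = multirc["train"][0]
print(example["paragraph"][:200])  # context passage
print(example["question"], example["answer"], example["label"])
```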
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch mirroring them follows the list):
- learning_rate: 0.3
- train_batch_size: 2
- eval_batch_size: 2
- seed: 123
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- training_steps: 40000
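
The training script itself is not included in the card; as a rough reference, the listed hyperparameters might map onto `TrainingArguments` as follows (the output directory and anything not listed above are placeholders):

```python
# Sketch only: mapping the listed hyperparameters onto transformers' TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="train_multirc_1745950268",  # placeholder
    learning_rate=0.3,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=2,  # total train batch size 4
    seed=123,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    max_steps=40000,
)
```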
### Training results
| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|---|---|---|---|---|
| 0.3351 | 0.0326 | 200 | 0.3450 | 418192 |
| 0.342 | 0.0653 | 400 | 0.3405 | 836224 |
| 0.3445 | 0.0979 | 600 | 0.3395 | 1258320 |
| 0.3385 | 0.1305 | 800 | 0.3819 | 1673984 |
| 0.3856 | 0.1631 | 1000 | 0.3446 | 2097344 |
| 0.3948 | 0.1958 | 1200 | 0.3347 | 2515056 |
| 0.3097 | 0.2284 | 1400 | 0.3262 | 2943280 |
| 0.2916 | 0.2610 | 1600 | 0.3242 | 3360448 |
| 0.2844 | 0.2937 | 1800 | 0.3232 | 3782768 |
| 0.3879 | 0.3263 | 2000 | 0.3216 | 4205680 |
| 0.3318 | 0.3589 | 2200 | 0.3303 | 4620944 |
| 0.3499 | 0.3915 | 2400 | 0.3219 | 5037232 |
| 0.354 | 0.4242 | 2600 | 0.3206 | 5452736 |
| 0.5992 | 0.4568 | 2800 | 0.3763 | 5872752 |
| 0.3572 | 0.4894 | 3000 | 0.3608 | 6285232 |
| 0.3438 | 0.5221 | 3200 | 0.3525 | 6699264 |
| 0.3398 | 0.5547 | 3400 | 0.3392 | 7118336 |
| 0.3375 | 0.5873 | 3600 | 0.3408 | 7533408 |
| 0.3466 | 0.6200 | 3800 | 0.3404 | 7950560 |
| 0.3448 | 0.6526 | 4000 | 0.3365 | 8372672 |
| 0.3478 | 0.6852 | 4200 | 0.3410 | 8796048 |
| 0.3384 | 0.7178 | 4400 | 0.3222 | 9210416 |
| 0.3026 | 0.7505 | 4600 | 0.3228 | 9628832 |
| 0.3646 | 0.7831 | 4800 | 0.3193 | 10048144 |
| 0.3149 | 0.8157 | 5000 | 0.3247 | 10460144 |
| 0.3029 | 0.8484 | 5200 | 0.3176 | 10871296 |
| 0.2841 | 0.8810 | 5400 | 0.3503 | 11287600 |
| 0.3657 | 0.9136 | 5600 | 0.3254 | 11707328 |
| 0.3327 | 0.9462 | 5800 | 0.3238 | 12120624 |
| 0.3498 | 0.9789 | 6000 | 0.3214 | 12542416 |
| 0.3557 | 1.0114 | 6200 | 0.3179 | 12963008 |
| 0.3641 | 1.0440 | 6400 | 0.3230 | 13388336 |
| 0.3471 | 1.0767 | 6600 | 0.3301 | 13816224 |
| 0.3218 | 1.1093 | 6800 | 0.3167 | 14228240 |
| 0.3316 | 1.1419 | 7000 | 0.3166 | 14637984 |
| 0.2862 | 1.1746 | 7200 | 0.3166 | 15049216 |
| 0.2954 | 1.2072 | 7400 | 0.3434 | 15471984 |
| 0.3416 | 1.2398 | 7600 | 0.3423 | 15891152 |
| 0.3428 | 1.2725 | 7800 | 0.3388 | 16309376 |
| 0.3515 | 1.3051 | 8000 | 0.3350 | 16729632 |
| 0.4186 | 1.3377 | 8200 | 0.3455 | 17139952 |
| 0.3117 | 1.3703 | 8400 | 0.3229 | 17557136 |
| 0.3217 | 1.4030 | 8600 | 0.3417 | 17974864 |
| 0.289 | 1.4356 | 8800 | 0.3266 | 18394320 |
| 0.3238 | 1.4682 | 9000 | 0.3216 | 18820208 |
| 0.4018 | 1.5009 | 9200 | 0.3191 | 19244192 |
| 0.3232 | 1.5335 | 9400 | 0.3441 | 19654192 |
| 0.2839 | 1.5661 | 9600 | 0.3351 | 20077520 |
| 0.2611 | 1.5987 | 9800 | 0.3176 | 20493344 |
| 0.3499 | 1.6314 | 10000 | 0.3219 | 20912896 |
| 0.3144 | 1.6640 | 10200 | 0.3200 | 21328976 |
| 0.3449 | 1.6966 | 10400 | 0.3174 | 21752192 |
| 0.2893 | 1.7293 | 10600 | 0.3174 | 22164912 |
| 0.2869 | 1.7619 | 10800 | 0.3182 | 22585216 |
| 0.3774 | 1.7945 | 11000 | 0.3323 | 23005600 |
| 0.3332 | 1.8271 | 11200 | 0.3212 | 23413712 |
| 0.3697 | 1.8598 | 11400 | 0.3198 | 23827536 |
| 0.351 | 1.8924 | 11600 | 0.3186 | 24242256 |
| 0.3605 | 1.9250 | 11800 | 0.3276 | 24655408 |
| 0.2571 | 1.9577 | 12000 | 0.3231 | 25074096 |
| 0.3465 | 1.9903 | 12200 | 0.3186 | 25489056 |
| 0.4011 | 2.0228 | 12400 | 0.3195 | 25898496 |
| 0.3619 | 2.0555 | 12600 | 0.3205 | 26319696 |
| 0.3846 | 2.0881 | 12800 | 0.3186 | 26744896 |
| 0.3307 | 2.1207 | 13000 | 0.3362 | 27166064 |
| 0.3568 | 2.1534 | 13200 | 0.3260 | 27581264 |
| 0.3308 | 2.1860 | 13400 | 0.3247 | 27988880 |
| 0.3383 | 2.2186 | 13600 | 0.3243 | 28397472 |
| 0.3414 | 2.2512 | 13800 | 0.3222 | 28812800 |
| 0.3195 | 2.2839 | 14000 | 0.3187 | 29222656 |
| 0.3708 | 2.3165 | 14200 | 0.3186 | 29642224 |
| 0.3423 | 2.3491 | 14400 | 0.3270 | 30064704 |
| 0.3308 | 2.3818 | 14600 | 0.3241 | 30481488 |
| 0.3559 | 2.4144 | 14800 | 0.3218 | 30900976 |
| 0.308 | 2.4470 | 15000 | 0.3212 | 31321184 |
| 0.3841 | 2.4796 | 15200 | 0.3172 | 31730928 |
| 0.3464 | 2.5123 | 15400 | 0.3176 | 32146304 |
| 0.3327 | 2.5449 | 15600 | 0.3167 | 32566096 |
| 0.2851 | 2.5775 | 15800 | 0.3196 | 32981664 |
| 0.2903 | 2.6102 | 16000 | 0.3185 | 33403328 |
| 0.3954 | 2.6428 | 16200 | 0.3187 | 33827808 |
| 0.2707 | 2.6754 | 16400 | 0.3170 | 34245456 |
| 0.3526 | 2.7081 | 16600 | 0.3249 | 34673616 |
| 0.3808 | 2.7407 | 16800 | 0.3165 | 35089872 |
| 0.343 | 2.7733 | 17000 | 0.3178 | 35508944 |
| 0.3599 | 2.8059 | 17200 | 0.3174 | 35922144 |
| 0.3173 | 2.8386 | 17400 | 0.3168 | 36345856 |
| 0.3271 | 2.8712 | 17600 | 0.3186 | 36770688 |
| 0.3495 | 2.9038 | 17800 | 0.3218 | 37194864 |
| 0.2759 | 2.9365 | 18000 | 0.3172 | 37615344 |
| 0.3311 | 2.9691 | 18200 | 0.3163 | 38030400 |
| 0.2723 | 3.0016 | 18400 | 0.3174 | 38435312 |
| 0.2748 | 3.0343 | 18600 | 0.3171 | 38869040 |
| 0.3487 | 3.0669 | 18800 | 0.3223 | 39294832 |
| 0.3459 | 3.0995 | 19000 | 0.3184 | 39706928 |
| 0.3048 | 3.1321 | 19200 | 0.3172 | 40121472 |
| 0.373 | 3.1648 | 19400 | 0.3201 | 40537792 |
| 0.3033 | 3.1974 | 19600 | 0.3187 | 40958928 |
| 0.3176 | 3.2300 | 19800 | 0.3198 | 41378128 |
| 0.3015 | 3.2627 | 20000 | 0.3172 | 41794480 |
| 0.3611 | 3.2953 | 20200 | 0.3212 | 42208144 |
| 0.3039 | 3.3279 | 20400 | 0.3169 | 42625520 |
| 0.3327 | 3.3606 | 20600 | 0.3159 | 43054848 |
| 0.338 | 3.3932 | 20800 | 0.3215 | 43472928 |
| 0.3392 | 3.4258 | 21000 | 0.3171 | 43892704 |
| 0.3119 | 3.4584 | 21200 | 0.3145 | 44309408 |
| 0.3679 | 3.4911 | 21400 | 0.3164 | 44724144 |
| 0.3277 | 3.5237 | 21600 | 0.3268 | 45143632 |
| 0.2847 | 3.5563 | 21800 | 0.3167 | 45567152 |
| 0.3947 | 3.5890 | 22000 | 0.3164 | 45983168 |
| 0.3045 | 3.6216 | 22200 | 0.3189 | 46401184 |
| 0.32 | 3.6542 | 22400 | 0.3202 | 46813008 |
| 0.3964 | 3.6868 | 22600 | 0.3141 | 47233968 |
| 0.2685 | 3.7195 | 22800 | 0.3139 | 47650016 |
| 0.313 | 3.7521 | 23000 | 0.3161 | 48064160 |
| 0.3128 | 3.7847 | 23200 | 0.3145 | 48484384 |
| 0.3507 | 3.8174 | 23400 | 0.3210 | 48897744 |
| 0.3455 | 3.8500 | 23600 | 0.3147 | 49308304 |
| 0.3713 | 3.8826 | 23800 | 0.3209 | 49728368 |
| 0.3627 | 3.9152 | 24000 | 0.3199 | 50140272 |
| 0.2905 | 3.9479 | 24200 | 0.3160 | 50557680 |
| 0.3251 | 3.9805 | 24400 | 0.3162 | 50978512 |
| 0.3528 | 4.0131 | 24600 | 0.3200 | 51394160 |
| 0.2819 | 4.0457 | 24800 | 0.3155 | 51821712 |
| 0.2867 | 4.0783 | 25000 | 0.3199 | 52244608 |
| 0.3124 | 4.1109 | 25200 | 0.3161 | 52659888 |
| 0.2975 | 4.1436 | 25400 | 0.3185 | 53073648 |
| 0.4165 | 4.1762 | 25600 | 0.3148 | 53493696 |
| 0.3362 | 4.2088 | 25800 | 0.3158 | 53907648 |
| 0.258 | 4.2415 | 26000 | 0.3151 | 54327568 |
| 0.3329 | 4.2741 | 26200 | 0.3233 | 54743840 |
| 0.3172 | 4.3067 | 26400 | 0.3192 | 55158912 |
| 0.3416 | 4.3393 | 26600 | 0.3192 | 55575232 |
| 0.4005 | 4.3720 | 26800 | 0.3155 | 55994160 |
| 0.3505 | 4.4046 | 27000 | 0.3155 | 56410736 |
| 0.3187 | 4.4372 | 27200 | 0.3175 | 56838864 |
| 0.339 | 4.4699 | 27400 | 0.3153 | 57245776 |
| 0.3405 | 4.5025 | 27600 | 0.3173 | 57651824 |
| 0.2684 | 4.5351 | 27800 | 0.3176 | 58060672 |
| 0.3248 | 4.5677 | 28000 | 0.3164 | 58475488 |
| 0.3213 | 4.6004 | 28200 | 0.3177 | 58898896 |
| 0.3105 | 4.6330 | 28400 | 0.3156 | 59318976 |
| 0.3098 | 4.6656 | 28600 | 0.3160 | 59739680 |
| 0.4193 | 4.6983 | 28800 | 0.3144 | 60159984 |
| 0.3198 | 4.7309 | 29000 | 0.3170 | 60579344 |
| 0.2599 | 4.7635 | 29200 | 0.3177 | 60992736 |
| 0.2971 | 4.7961 | 29400 | 0.3152 | 61414080 |
| 0.2985 | 4.8288 | 29600 | 0.3153 | 61829776 |
| 0.3175 | 4.8614 | 29800 | 0.3161 | 62250704 |
| 0.3234 | 4.8940 | 30000 | 0.3151 | 62662656 |
| 0.3708 | 4.9267 | 30200 | 0.3170 | 63088352 |
| 0.2391 | 4.9593 | 30400 | 0.3180 | 63504960 |
| 0.3714 | 4.9919 | 30600 | 0.3148 | 63926432 |
| 0.32 | 5.0245 | 30800 | 0.3143 | 64346032 |
| 0.3243 | 5.0571 | 31000 | 0.3148 | 64764608 |
| 0.3121 | 5.0897 | 31200 | 0.3152 | 65180560 |
| 0.3838 | 5.1224 | 31400 | 0.3143 | 65600032 |
| 0.2727 | 5.1550 | 31600 | 0.3137 | 66007440 |
| 0.3052 | 5.1876 | 31800 | 0.3138 | 66416480 |
| 0.3068 | 5.2202 | 32000 | 0.3147 | 66829712 |
| 0.294 | 5.2529 | 32200 | 0.3163 | 67253936 |
| 0.3119 | 5.2855 | 32400 | 0.3145 | 67674048 |
| 0.3581 | 5.3181 | 32600 | 0.3155 | 68096656 |
| 0.3607 | 5.3508 | 32800 | 0.3137 | 68521600 |
| 0.2925 | 5.3834 | 33000 | 0.3154 | 68948064 |
| 0.2777 | 5.4160 | 33200 | 0.3145 | 69357008 |
| 0.2747 | 5.4486 | 33400 | 0.3146 | 69771824 |
| 0.3595 | 5.4813 | 33600 | 0.3149 | 70189824 |
| 0.2622 | 5.5139 | 33800 | 0.3145 | 70602704 |
| 0.3138 | 5.5465 | 34000 | 0.3137 | 71032768 |
| 0.2645 | 5.5792 | 34200 | 0.3148 | 71445488 |
| 0.283 | 5.6118 | 34400 | 0.3143 | 71858096 |
| 0.3269 | 5.6444 | 34600 | 0.3151 | 72276272 |
| 0.3394 | 5.6771 | 34800 | 0.3149 | 72694032 |
| 0.3687 | 5.7097 | 35000 | 0.3139 | 73119856 |
| 0.33 | 5.7423 | 35200 | 0.3146 | 73537984 |
| 0.314 | 5.7749 | 35400 | 0.3152 | 73955216 |
| 0.2967 | 5.8076 | 35600 | 0.3164 | 74371040 |
| 0.2874 | 5.8402 | 35800 | 0.3169 | 74795680 |
| 0.2811 | 5.8728 | 36000 | 0.3159 | 75209824 |
| 0.2885 | 5.9055 | 36200 | 0.3155 | 75634096 |
| 0.3083 | 5.9381 | 36400 | 0.3161 | 76046144 |
| 0.2806 | 5.9707 | 36600 | 0.3157 | 76453936 |
| 0.2728 | 6.0033 | 36800 | 0.3154 | 76873152 |
| 0.357 | 6.0359 | 37000 | 0.3162 | 77290000 |
| 0.32 | 6.0685 | 37200 | 0.3154 | 77708416 |
| 0.3433 | 6.1012 | 37400 | 0.3155 | 78124432 |
| 0.252 | 6.1338 | 37600 | 0.3155 | 78542400 |
| 0.3092 | 6.1664 | 37800 | 0.3150 | 78968368 |
| 0.2739 | 6.1990 | 38000 | 0.3149 | 79378528 |
| 0.3555 | 6.2317 | 38200 | 0.3149 | 79802112 |
| 0.2865 | 6.2643 | 38400 | 0.3151 | 80229344 |
| 0.3473 | 6.2969 | 38600 | 0.3154 | 80643632 |
| 0.2548 | 6.3296 | 38800 | 0.3150 | 81051936 |
| 0.2403 | 6.3622 | 39000 | 0.3153 | 81475504 |
| 0.3694 | 6.3948 | 39200 | 0.3151 | 81889856 |
| 0.3268 | 6.4274 | 39400 | 0.3150 | 82305408 |
| 0.2828 | 6.4601 | 39600 | 0.3150 | 82712800 |
| 0.3382 | 6.4927 | 39800 | 0.3152 | 83128704 |
| 0.3072 | 6.5253 | 40000 | 0.3154 | 83543088 |
### Framework versions
- PEFT 0.15.2.dev0
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
## Model tree for rbelanec/train_multirc_1745950268
- Base model: mistralai/Mistral-7B-v0.3
- Fine-tuned from: mistralai/Mistral-7B-Instruct-v0.3