aborcs committed on

Commit ae54255 · verified · 1 Parent(s): 4d20436

Model save
README.md CHANGED
@@ -18,7 +18,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [meta-llama/Llama-3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.9587
+ - Loss: 0.5224
 
 ## Model description
 
@@ -38,465 +38,70 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 0.0002
- - train_batch_size: 8
+ - train_batch_size: 6
 - eval_batch_size: 8
 - seed: 42
- - gradient_accumulation_steps: 8
- - total_train_batch_size: 64
+ - gradient_accumulation_steps: 6
+ - total_train_batch_size: 36
 - optimizer: Use paged_adamw_32bit with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: cosine
- - training_steps: 500
+ - training_steps: 100
 - mixed_precision_training: Native AMP
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss |
- |:-------------:|:--------:|:----:|:---------------:|
- | No log | 0.8889 | 1 | 1.1457 |
- | No log | 1.7778 | 2 | 1.1356 |
- | No log | 2.6667 | 3 | 1.1222 |
- | No log | 3.5556 | 4 | 1.1062 |
- | No log | 4.4444 | 5 | 1.0881 |
- | No log | 5.3333 | 6 | 1.0701 |
- | No log | 6.2222 | 7 | 1.0519 |
- | No log | 8.0 | 9 | 1.0126 |
- | 2.4728 | 8.8889 | 10 | 0.9951 |
- | 2.4728 | 9.7778 | 11 | 0.9722 |
- | 2.4728 | 10.6667 | 12 | 0.9471 |
- | 2.4728 | 11.5556 | 13 | 0.9254 |
- | 2.4728 | 12.4444 | 14 | 0.9064 |
- | 2.4728 | 13.3333 | 15 | 0.8876 |
- | 2.4728 | 14.2222 | 16 | 0.8741 |
- | 2.4728 | 16.0 | 18 | 0.8464 |
- | 2.4728 | 16.8889 | 19 | 0.8320 |
- | 1.8448 | 17.7778 | 20 | 0.8162 |
- | 1.8448 | 18.6667 | 21 | 0.8009 |
- | 1.8448 | 19.5556 | 22 | 0.7870 |
- | 1.8448 | 20.4444 | 23 | 0.7739 |
- | 1.8448 | 21.3333 | 24 | 0.7632 |
- | 1.8448 | 22.2222 | 25 | 0.7529 |
- | 1.8448 | 24.0 | 27 | 0.7297 |
- | 1.8448 | 24.8889 | 28 | 0.7181 |
- | 1.8448 | 25.7778 | 29 | 0.7054 |
- | 1.5787 | 26.6667 | 30 | 0.6937 |
- | 1.5787 | 27.5556 | 31 | 0.6830 |
- | 1.5787 | 28.4444 | 32 | 0.6731 |
- | 1.5787 | 29.3333 | 33 | 0.6672 |
- | 1.5787 | 30.2222 | 34 | 0.6591 |
- | 1.5787 | 32.0 | 36 | 0.6414 |
- | 1.5787 | 32.8889 | 37 | 0.6354 |
- | 1.5787 | 33.7778 | 38 | 0.6281 |
- | 1.5787 | 34.6667 | 39 | 0.6197 |
- | 1.2897 | 35.5556 | 40 | 0.6116 |
- | 1.2897 | 36.4444 | 41 | 0.6046 |
- | 1.2897 | 37.3333 | 42 | 0.5934 |
- | 1.2897 | 38.2222 | 43 | 0.5837 |
- | 1.2897 | 40.0 | 45 | 0.5710 |
- | 1.2897 | 40.8889 | 46 | 0.5662 |
- | 1.2897 | 41.7778 | 47 | 0.5622 |
- | 1.2897 | 42.6667 | 48 | 0.5590 |
- | 1.2897 | 43.5556 | 49 | 0.5571 |
- | 1.1896 | 44.4444 | 50 | 0.5500 |
- | 1.1896 | 45.3333 | 51 | 0.5449 |
- | 1.1896 | 46.2222 | 52 | 0.5424 |
- | 1.1896 | 48.0 | 54 | 0.5455 |
- | 1.1896 | 48.8889 | 55 | 0.5328 |
- | 1.1896 | 49.7778 | 56 | 0.5244 |
- | 1.1896 | 50.6667 | 57 | 0.5209 |
- | 1.1896 | 51.5556 | 58 | 0.5228 |
- | 1.1896 | 52.4444 | 59 | 0.5291 |
- | 0.9397 | 53.3333 | 60 | 0.5245 |
- | 0.9397 | 54.2222 | 61 | 0.5074 |
- | 0.9397 | 56.0 | 63 | 0.4977 |
- | 0.9397 | 56.8889 | 64 | 0.5034 |
- | 0.9397 | 57.7778 | 65 | 0.5238 |
- | 0.9397 | 58.6667 | 66 | 0.5195 |
- | 0.9397 | 59.5556 | 67 | 0.4984 |
- | 0.9397 | 60.4444 | 68 | 0.4892 |
- | 0.9397 | 61.3333 | 69 | 0.4901 |
- | 0.7877 | 62.2222 | 70 | 0.5069 |
- | 0.7877 | 64.0 | 72 | 0.5106 |
- | 0.7877 | 64.8889 | 73 | 0.4894 |
- | 0.7877 | 65.7778 | 74 | 0.4801 |
- | 0.7877 | 66.6667 | 75 | 0.4827 |
- | 0.7877 | 67.5556 | 76 | 0.4974 |
- | 0.7877 | 68.4444 | 77 | 0.5117 |
- | 0.7877 | 69.3333 | 78 | 0.4973 |
- | 0.7877 | 70.2222 | 79 | 0.4794 |
- | 0.5853 | 72.0 | 81 | 0.4952 |
- | 0.5853 | 72.8889 | 82 | 0.5233 |
- | 0.5853 | 73.7778 | 83 | 0.5292 |
- | 0.5853 | 74.6667 | 84 | 0.5096 |
- | 0.5853 | 75.5556 | 85 | 0.4991 |
- | 0.5853 | 76.4444 | 86 | 0.5015 |
- | 0.5853 | 77.3333 | 87 | 0.5226 |
- | 0.5853 | 78.2222 | 88 | 0.5266 |
- | 0.5103 | 80.0 | 90 | 0.4979 |
- | 0.5103 | 80.8889 | 91 | 0.5042 |
- | 0.5103 | 81.7778 | 92 | 0.5195 |
- | 0.5103 | 82.6667 | 93 | 0.5365 |
- | 0.5103 | 83.5556 | 94 | 0.5199 |
- | 0.5103 | 84.4444 | 95 | 0.5004 |
- | 0.5103 | 85.3333 | 96 | 0.5122 |
- | 0.5103 | 86.2222 | 97 | 0.5463 |
- | 0.5103 | 88.0 | 99 | 0.5554 |
- | 0.3942 | 88.8889 | 100 | 0.5315 |
- | 0.3942 | 89.7778 | 101 | 0.5231 |
- | 0.3942 | 90.6667 | 102 | 0.5432 |
- | 0.3942 | 91.5556 | 103 | 0.5688 |
- | 0.3942 | 92.4444 | 104 | 0.5763 |
- | 0.3942 | 93.3333 | 105 | 0.5723 |
- | 0.3942 | 94.2222 | 106 | 0.5726 |
- | 0.3942 | 96.0 | 108 | 0.5975 |
- | 0.3942 | 96.8889 | 109 | 0.5834 |
- | 0.3182 | 97.7778 | 110 | 0.5887 |
- | 0.3182 | 98.6667 | 111 | 0.6124 |
- | 0.3182 | 99.5556 | 112 | 0.6188 |
- | 0.3182 | 100.4444 | 113 | 0.6067 |
- | 0.3182 | 101.3333 | 114 | 0.5717 |
- | 0.3182 | 102.2222 | 115 | 0.5682 |
- | 0.3182 | 104.0 | 117 | 0.6115 |
- | 0.3182 | 104.8889 | 118 | 0.6088 |
- | 0.3182 | 105.7778 | 119 | 0.5980 |
- | 0.2617 | 106.6667 | 120 | 0.5982 |
- | 0.2617 | 107.5556 | 121 | 0.6232 |
- | 0.2617 | 108.4444 | 122 | 0.6556 |
- | 0.2617 | 109.3333 | 123 | 0.6477 |
- | 0.2617 | 110.2222 | 124 | 0.6125 |
- | 0.2617 | 112.0 | 126 | 0.6101 |
- | 0.2617 | 112.8889 | 127 | 0.6476 |
- | 0.2617 | 113.7778 | 128 | 0.6484 |
- | 0.2617 | 114.6667 | 129 | 0.6408 |
- | 0.2295 | 115.5556 | 130 | 0.6301 |
- | 0.2295 | 116.4444 | 131 | 0.6441 |
- | 0.2295 | 117.3333 | 132 | 0.6659 |
- | 0.2295 | 118.2222 | 133 | 0.6769 |
- | 0.2295 | 120.0 | 135 | 0.6932 |
- | 0.2295 | 120.8889 | 136 | 0.6739 |
- | 0.2295 | 121.7778 | 137 | 0.6569 |
- | 0.2295 | 122.6667 | 138 | 0.6650 |
- | 0.2295 | 123.5556 | 139 | 0.6787 |
- | 0.185 | 124.4444 | 140 | 0.7024 |
- | 0.185 | 125.3333 | 141 | 0.7004 |
- | 0.185 | 126.2222 | 142 | 0.6752 |
- | 0.185 | 128.0 | 144 | 0.7133 |
- | 0.185 | 128.8889 | 145 | 0.7318 |
- | 0.185 | 129.7778 | 146 | 0.7230 |
- | 0.185 | 130.6667 | 147 | 0.7056 |
- | 0.185 | 131.5556 | 148 | 0.7121 |
- | 0.185 | 132.4444 | 149 | 0.7213 |
- | 0.1528 | 133.3333 | 150 | 0.7434 |
- | 0.1528 | 134.2222 | 151 | 0.7403 |
- | 0.1528 | 136.0 | 153 | 0.7100 |
- | 0.1528 | 136.8889 | 154 | 0.7204 |
- | 0.1528 | 137.7778 | 155 | 0.7487 |
- | 0.1528 | 138.6667 | 156 | 0.7436 |
- | 0.1528 | 139.5556 | 157 | 0.7147 |
- | 0.1528 | 140.4444 | 158 | 0.7007 |
- | 0.1528 | 141.3333 | 159 | 0.7200 |
- | 0.1445 | 142.2222 | 160 | 0.7436 |
- | 0.1445 | 144.0 | 162 | 0.7388 |
- | 0.1445 | 144.8889 | 163 | 0.7253 |
- | 0.1445 | 145.7778 | 164 | 0.7192 |
- | 0.1445 | 146.6667 | 165 | 0.7497 |
- | 0.1445 | 147.5556 | 166 | 0.7578 |
- | 0.1445 | 148.4444 | 167 | 0.7397 |
- | 0.1445 | 149.3333 | 168 | 0.7271 |
- | 0.1445 | 150.2222 | 169 | 0.7408 |
- | 0.1098 | 152.0 | 171 | 0.7728 |
- | 0.1098 | 152.8889 | 172 | 0.7810 |
- | 0.1098 | 153.7778 | 173 | 0.7824 |
- | 0.1098 | 154.6667 | 174 | 0.7713 |
- | 0.1098 | 155.5556 | 175 | 0.7721 |
- | 0.1098 | 156.4444 | 176 | 0.7786 |
- | 0.1098 | 157.3333 | 177 | 0.7790 |
- | 0.1098 | 158.2222 | 178 | 0.7894 |
- | 0.0947 | 160.0 | 180 | 0.7661 |
- | 0.0947 | 160.8889 | 181 | 0.7653 |
- | 0.0947 | 161.7778 | 182 | 0.7622 |
- | 0.0947 | 162.6667 | 183 | 0.7854 |
- | 0.0947 | 163.5556 | 184 | 0.8107 |
- | 0.0947 | 164.4444 | 185 | 0.7961 |
- | 0.0947 | 165.3333 | 186 | 0.7705 |
- | 0.0947 | 166.2222 | 187 | 0.7788 |
- | 0.0947 | 168.0 | 189 | 0.8065 |
- | 0.0804 | 168.8889 | 190 | 0.8074 |
- | 0.0804 | 169.7778 | 191 | 0.8307 |
- | 0.0804 | 170.6667 | 192 | 0.8483 |
- | 0.0804 | 171.5556 | 193 | 0.8407 |
- | 0.0804 | 172.4444 | 194 | 0.8203 |
- | 0.0804 | 173.3333 | 195 | 0.8132 |
- | 0.0804 | 174.2222 | 196 | 0.8333 |
- | 0.0804 | 176.0 | 198 | 0.8480 |
- | 0.0804 | 176.8889 | 199 | 0.8256 |
- | 0.0713 | 177.7778 | 200 | 0.7958 |
- | 0.0713 | 178.6667 | 201 | 0.7900 |
- | 0.0713 | 179.5556 | 202 | 0.8001 |
- | 0.0713 | 180.4444 | 203 | 0.8213 |
- | 0.0713 | 181.3333 | 204 | 0.8232 |
- | 0.0713 | 182.2222 | 205 | 0.8202 |
- | 0.0713 | 184.0 | 207 | 0.8162 |
- | 0.0713 | 184.8889 | 208 | 0.8214 |
- | 0.0713 | 185.7778 | 209 | 0.8244 |
- | 0.0598 | 186.6667 | 210 | 0.8298 |
- | 0.0598 | 187.5556 | 211 | 0.8315 |
- | 0.0598 | 188.4444 | 212 | 0.8231 |
- | 0.0598 | 189.3333 | 213 | 0.8215 |
- | 0.0598 | 190.2222 | 214 | 0.8178 |
- | 0.0598 | 192.0 | 216 | 0.8490 |
- | 0.0598 | 192.8889 | 217 | 0.8594 |
- | 0.0598 | 193.7778 | 218 | 0.8632 |
- | 0.0598 | 194.6667 | 219 | 0.8533 |
- | 0.0576 | 195.5556 | 220 | 0.8335 |
- | 0.0576 | 196.4444 | 221 | 0.8203 |
- | 0.0576 | 197.3333 | 222 | 0.8244 |
- | 0.0576 | 198.2222 | 223 | 0.8365 |
- | 0.0576 | 200.0 | 225 | 0.8829 |
- | 0.0576 | 200.8889 | 226 | 0.8803 |
- | 0.0576 | 201.7778 | 227 | 0.8677 |
- | 0.0576 | 202.6667 | 228 | 0.8452 |
- | 0.0576 | 203.5556 | 229 | 0.8291 |
- | 0.0547 | 204.4444 | 230 | 0.8296 |
- | 0.0547 | 205.3333 | 231 | 0.8411 |
- | 0.0547 | 206.2222 | 232 | 0.8607 |
- | 0.0547 | 208.0 | 234 | 0.8605 |
- | 0.0547 | 208.8889 | 235 | 0.8448 |
- | 0.0547 | 209.7778 | 236 | 0.8358 |
- | 0.0547 | 210.6667 | 237 | 0.8286 |
- | 0.0547 | 211.5556 | 238 | 0.8317 |
- | 0.0547 | 212.4444 | 239 | 0.8359 |
- | 0.0541 | 213.3333 | 240 | 0.8483 |
- | 0.0541 | 214.2222 | 241 | 0.8562 |
- | 0.0541 | 216.0 | 243 | 0.8583 |
- | 0.0541 | 216.8889 | 244 | 0.8513 |
- | 0.0541 | 217.7778 | 245 | 0.8408 |
- | 0.0541 | 218.6667 | 246 | 0.8386 |
- | 0.0541 | 219.5556 | 247 | 0.8467 |
- | 0.0541 | 220.4444 | 248 | 0.8554 |
- | 0.0541 | 221.3333 | 249 | 0.8669 |
- | 0.0433 | 222.2222 | 250 | 0.8775 |
- | 0.0433 | 224.0 | 252 | 0.8921 |
- | 0.0433 | 224.8889 | 253 | 0.8921 |
- | 0.0433 | 225.7778 | 254 | 0.8863 |
- | 0.0433 | 226.6667 | 255 | 0.8807 |
- | 0.0433 | 227.5556 | 256 | 0.8755 |
- | 0.0433 | 228.4444 | 257 | 0.8676 |
- | 0.0433 | 229.3333 | 258 | 0.8691 |
- | 0.0433 | 230.2222 | 259 | 0.8750 |
- | 0.036 | 232.0 | 261 | 0.8816 |
- | 0.036 | 232.8889 | 262 | 0.8840 |
- | 0.036 | 233.7778 | 263 | 0.8840 |
- | 0.036 | 234.6667 | 264 | 0.8863 |
- | 0.036 | 235.5556 | 265 | 0.8866 |
- | 0.036 | 236.4444 | 266 | 0.8830 |
- | 0.036 | 237.3333 | 267 | 0.8803 |
- | 0.036 | 238.2222 | 268 | 0.8742 |
- | 0.0412 | 240.0 | 270 | 0.8606 |
- | 0.0412 | 240.8889 | 271 | 0.8563 |
- | 0.0412 | 241.7778 | 272 | 0.8519 |
- | 0.0412 | 242.6667 | 273 | 0.8497 |
- | 0.0412 | 243.5556 | 274 | 0.8556 |
- | 0.0412 | 244.4444 | 275 | 0.8649 |
- | 0.0412 | 245.3333 | 276 | 0.8743 |
- | 0.0412 | 246.2222 | 277 | 0.8818 |
- | 0.0412 | 248.0 | 279 | 0.8993 |
- | 0.0328 | 248.8889 | 280 | 0.9036 |
- | 0.0328 | 249.7778 | 281 | 0.9028 |
- | 0.0328 | 250.6667 | 282 | 0.9015 |
- | 0.0328 | 251.5556 | 283 | 0.8978 |
- | 0.0328 | 252.4444 | 284 | 0.8949 |
- | 0.0328 | 253.3333 | 285 | 0.8934 |
- | 0.0328 | 254.2222 | 286 | 0.8971 |
- | 0.0328 | 256.0 | 288 | 0.9101 |
- | 0.0328 | 256.8889 | 289 | 0.9150 |
- | 0.032 | 257.7778 | 290 | 0.9129 |
- | 0.032 | 258.6667 | 291 | 0.9011 |
- | 0.032 | 259.5556 | 292 | 0.8880 |
- | 0.032 | 260.4444 | 293 | 0.8809 |
- | 0.032 | 261.3333 | 294 | 0.8785 |
- | 0.032 | 262.2222 | 295 | 0.8831 |
- | 0.032 | 264.0 | 297 | 0.9098 |
- | 0.032 | 264.8889 | 298 | 0.9231 |
- | 0.032 | 265.7778 | 299 | 0.9319 |
- | 0.0355 | 266.6667 | 300 | 0.9332 |
- | 0.0355 | 267.5556 | 301 | 0.9220 |
- | 0.0355 | 268.4444 | 302 | 0.9096 |
- | 0.0355 | 269.3333 | 303 | 0.8954 |
- | 0.0355 | 270.2222 | 304 | 0.8855 |
- | 0.0355 | 272.0 | 306 | 0.8827 |
- | 0.0355 | 272.8889 | 307 | 0.8854 |
- | 0.0355 | 273.7778 | 308 | 0.8918 |
- | 0.0355 | 274.6667 | 309 | 0.8986 |
- | 0.028 | 275.5556 | 310 | 0.9039 |
- | 0.028 | 276.4444 | 311 | 0.9092 |
- | 0.028 | 277.3333 | 312 | 0.9120 |
- | 0.028 | 278.2222 | 313 | 0.9134 |
- | 0.028 | 280.0 | 315 | 0.9133 |
- | 0.028 | 280.8889 | 316 | 0.9126 |
- | 0.028 | 281.7778 | 317 | 0.9129 |
- | 0.028 | 282.6667 | 318 | 0.9117 |
- | 0.028 | 283.5556 | 319 | 0.9118 |
- | 0.0246 | 284.4444 | 320 | 0.9119 |
- | 0.0246 | 285.3333 | 321 | 0.9125 |
- | 0.0246 | 286.2222 | 322 | 0.9148 |
- | 0.0246 | 288.0 | 324 | 0.9197 |
- | 0.0246 | 288.8889 | 325 | 0.9226 |
- | 0.0246 | 289.7778 | 326 | 0.9253 |
- | 0.0246 | 290.6667 | 327 | 0.9283 |
- | 0.0246 | 291.5556 | 328 | 0.9317 |
- | 0.0246 | 292.4444 | 329 | 0.9320 |
- | 0.0224 | 293.3333 | 330 | 0.9296 |
- | 0.0224 | 294.2222 | 331 | 0.9253 |
- | 0.0224 | 296.0 | 333 | 0.9176 |
- | 0.0224 | 296.8889 | 334 | 0.9151 |
- | 0.0224 | 297.7778 | 335 | 0.9135 |
- | 0.0224 | 298.6667 | 336 | 0.9127 |
- | 0.0224 | 299.5556 | 337 | 0.9118 |
- | 0.0224 | 300.4444 | 338 | 0.9130 |
- | 0.0224 | 301.3333 | 339 | 0.9156 |
- | 0.0219 | 302.2222 | 340 | 0.9185 |
- | 0.0219 | 304.0 | 342 | 0.9268 |
- | 0.0219 | 304.8889 | 343 | 0.9301 |
- | 0.0219 | 305.7778 | 344 | 0.9331 |
- | 0.0219 | 306.6667 | 345 | 0.9337 |
- | 0.0219 | 307.5556 | 346 | 0.9340 |
- | 0.0219 | 308.4444 | 347 | 0.9340 |
- | 0.0219 | 309.3333 | 348 | 0.9337 |
- | 0.0219 | 310.2222 | 349 | 0.9341 |
- | 0.0233 | 312.0 | 351 | 0.9320 |
- | 0.0233 | 312.8889 | 352 | 0.9305 |
- | 0.0233 | 313.7778 | 353 | 0.9270 |
- | 0.0233 | 314.6667 | 354 | 0.9241 |
- | 0.0233 | 315.5556 | 355 | 0.9229 |
- | 0.0233 | 316.4444 | 356 | 0.9230 |
- | 0.0233 | 317.3333 | 357 | 0.9241 |
- | 0.0233 | 318.2222 | 358 | 0.9257 |
- | 0.0239 | 320.0 | 360 | 0.9313 |
- | 0.0239 | 320.8889 | 361 | 0.9336 |
- | 0.0239 | 321.7778 | 362 | 0.9359 |
- | 0.0239 | 322.6667 | 363 | 0.9380 |
- | 0.0239 | 323.5556 | 364 | 0.9395 |
- | 0.0239 | 324.4444 | 365 | 0.9405 |
- | 0.0239 | 325.3333 | 366 | 0.9423 |
- | 0.0239 | 326.2222 | 367 | 0.9438 |
- | 0.0239 | 328.0 | 369 | 0.9453 |
- | 0.0208 | 328.8889 | 370 | 0.9458 |
- | 0.0208 | 329.7778 | 371 | 0.9451 |
- | 0.0208 | 330.6667 | 372 | 0.9441 |
- | 0.0208 | 331.5556 | 373 | 0.9430 |
- | 0.0208 | 332.4444 | 374 | 0.9420 |
- | 0.0208 | 333.3333 | 375 | 0.9410 |
- | 0.0208 | 334.2222 | 376 | 0.9404 |
- | 0.0208 | 336.0 | 378 | 0.9422 |
- | 0.0208 | 336.8889 | 379 | 0.9430 |
- | 0.0219 | 337.7778 | 380 | 0.9443 |
- | 0.0219 | 338.6667 | 381 | 0.9449 |
- | 0.0219 | 339.5556 | 382 | 0.9456 |
- | 0.0219 | 340.4444 | 383 | 0.9465 |
- | 0.0219 | 341.3333 | 384 | 0.9467 |
- | 0.0219 | 342.2222 | 385 | 0.9476 |
- | 0.0219 | 344.0 | 387 | 0.9504 |
- | 0.0219 | 344.8889 | 388 | 0.9509 |
- | 0.0219 | 345.7778 | 389 | 0.9520 |
- | 0.0222 | 346.6667 | 390 | 0.9518 |
- | 0.0222 | 347.5556 | 391 | 0.9519 |
- | 0.0222 | 348.4444 | 392 | 0.9527 |
- | 0.0222 | 349.3333 | 393 | 0.9523 |
- | 0.0222 | 350.2222 | 394 | 0.9522 |
- | 0.0222 | 352.0 | 396 | 0.9508 |
- | 0.0222 | 352.8889 | 397 | 0.9498 |
- | 0.0222 | 353.7778 | 398 | 0.9495 |
- | 0.0222 | 354.6667 | 399 | 0.9487 |
- | 0.0211 | 355.5556 | 400 | 0.9481 |
- | 0.0211 | 356.4444 | 401 | 0.9470 |
- | 0.0211 | 357.3333 | 402 | 0.9456 |
- | 0.0211 | 358.2222 | 403 | 0.9444 |
- | 0.0211 | 360.0 | 405 | 0.9424 |
- | 0.0211 | 360.8889 | 406 | 0.9419 |
- | 0.0211 | 361.7778 | 407 | 0.9408 |
- | 0.0211 | 362.6667 | 408 | 0.9406 |
- | 0.0211 | 363.5556 | 409 | 0.9403 |
- | 0.0213 | 364.4444 | 410 | 0.9398 |
- | 0.0213 | 365.3333 | 411 | 0.9401 |
- | 0.0213 | 366.2222 | 412 | 0.9400 |
- | 0.0213 | 368.0 | 414 | 0.9410 |
- | 0.0213 | 368.8889 | 415 | 0.9403 |
- | 0.0213 | 369.7778 | 416 | 0.9413 |
- | 0.0213 | 370.6667 | 417 | 0.9418 |
- | 0.0213 | 371.5556 | 418 | 0.9428 |
- | 0.0213 | 372.4444 | 419 | 0.9440 |
- | 0.0195 | 373.3333 | 420 | 0.9448 |
- | 0.0195 | 374.2222 | 421 | 0.9456 |
- | 0.0195 | 376.0 | 423 | 0.9476 |
- | 0.0195 | 376.8889 | 424 | 0.9489 |
- | 0.0195 | 377.7778 | 425 | 0.9499 |
- | 0.0195 | 378.6667 | 426 | 0.9511 |
- | 0.0195 | 379.5556 | 427 | 0.9520 |
- | 0.0195 | 380.4444 | 428 | 0.9532 |
- | 0.0195 | 381.3333 | 429 | 0.9541 |
- | 0.02 | 382.2222 | 430 | 0.9553 |
- | 0.02 | 384.0 | 432 | 0.9567 |
- | 0.02 | 384.8889 | 433 | 0.9575 |
- | 0.02 | 385.7778 | 434 | 0.9587 |
- | 0.02 | 386.6667 | 435 | 0.9590 |
- | 0.02 | 387.5556 | 436 | 0.9592 |
- | 0.02 | 388.4444 | 437 | 0.9596 |
- | 0.02 | 389.3333 | 438 | 0.9600 |
- | 0.02 | 390.2222 | 439 | 0.9607 |
- | 0.0183 | 392.0 | 441 | 0.9616 |
- | 0.0183 | 392.8889 | 442 | 0.9620 |
- | 0.0183 | 393.7778 | 443 | 0.9616 |
- | 0.0183 | 394.6667 | 444 | 0.9615 |
- | 0.0183 | 395.5556 | 445 | 0.9615 |
- | 0.0183 | 396.4444 | 446 | 0.9615 |
- | 0.0183 | 397.3333 | 447 | 0.9613 |
- | 0.0183 | 398.2222 | 448 | 0.9609 |
- | 0.0193 | 400.0 | 450 | 0.9613 |
- | 0.0193 | 400.8889 | 451 | 0.9607 |
- | 0.0193 | 401.7778 | 452 | 0.9604 |
- | 0.0193 | 402.6667 | 453 | 0.9598 |
- | 0.0193 | 403.5556 | 454 | 0.9594 |
- | 0.0193 | 404.4444 | 455 | 0.9593 |
- | 0.0193 | 405.3333 | 456 | 0.9593 |
- | 0.0193 | 406.2222 | 457 | 0.9588 |
- | 0.0193 | 408.0 | 459 | 0.9587 |
- | 0.0177 | 408.8889 | 460 | 0.9584 |
- | 0.0177 | 409.7778 | 461 | 0.9590 |
- | 0.0177 | 410.6667 | 462 | 0.9584 |
- | 0.0177 | 411.5556 | 463 | 0.9582 |
- | 0.0177 | 412.4444 | 464 | 0.9586 |
- | 0.0177 | 413.3333 | 465 | 0.9586 |
- | 0.0177 | 414.2222 | 466 | 0.9585 |
- | 0.0177 | 416.0 | 468 | 0.9585 |
- | 0.0177 | 416.8889 | 469 | 0.9588 |
- | 0.0184 | 417.7778 | 470 | 0.9582 |
- | 0.0184 | 418.6667 | 471 | 0.9586 |
- | 0.0184 | 419.5556 | 472 | 0.9590 |
- | 0.0184 | 420.4444 | 473 | 0.9588 |
- | 0.0184 | 421.3333 | 474 | 0.9585 |
- | 0.0184 | 422.2222 | 475 | 0.9586 |
- | 0.0184 | 424.0 | 477 | 0.9591 |
- | 0.0184 | 424.8889 | 478 | 0.9586 |
- | 0.0184 | 425.7778 | 479 | 0.9590 |
- | 0.0182 | 426.6667 | 480 | 0.9591 |
- | 0.0182 | 427.5556 | 481 | 0.9587 |
- | 0.0182 | 428.4444 | 482 | 0.9587 |
- | 0.0182 | 429.3333 | 483 | 0.9583 |
- | 0.0182 | 430.2222 | 484 | 0.9588 |
- | 0.0182 | 432.0 | 486 | 0.9585 |
- | 0.0182 | 432.8889 | 487 | 0.9586 |
- | 0.0182 | 433.7778 | 488 | 0.9586 |
- | 0.0182 | 434.6667 | 489 | 0.9585 |
- | 0.0193 | 435.5556 | 490 | 0.9583 |
- | 0.0193 | 436.4444 | 491 | 0.9590 |
- | 0.0193 | 437.3333 | 492 | 0.9584 |
- | 0.0193 | 438.2222 | 493 | 0.9583 |
- | 0.0193 | 440.0 | 495 | 0.9585 |
- | 0.0193 | 440.8889 | 496 | 0.9585 |
- | 0.0193 | 441.7778 | 497 | 0.9588 |
- | 0.0193 | 442.6667 | 498 | 0.9584 |
- | 0.0193 | 443.5556 | 499 | 0.9586 |
- | 0.0185 | 444.4444 | 500 | 0.9587 |
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:----:|:---------------:|
+ | No log | 1.0 | 2 | 1.1349 |
+ | No log | 2.0 | 4 | 1.1026 |
+ | No log | 3.0 | 6 | 1.0619 |
+ | No log | 4.0 | 8 | 1.0198 |
+ | 1.3117 | 5.0 | 10 | 0.9727 |
+ | 1.3117 | 6.0 | 12 | 0.9251 |
+ | 1.3117 | 7.0 | 14 | 0.8814 |
+ | 1.3117 | 8.0 | 16 | 0.8433 |
+ | 1.3117 | 9.0 | 18 | 0.8134 |
+ | 1.0107 | 10.0 | 20 | 0.7857 |
+ | 1.0107 | 11.0 | 22 | 0.7571 |
+ | 1.0107 | 12.0 | 24 | 0.7359 |
+ | 1.0107 | 13.0 | 26 | 0.7097 |
+ | 1.0107 | 14.0 | 28 | 0.6873 |
+ | 0.8097 | 15.0 | 30 | 0.6693 |
+ | 0.8097 | 16.0 | 32 | 0.6487 |
+ | 0.8097 | 17.0 | 34 | 0.6325 |
+ | 0.8097 | 18.0 | 36 | 0.6193 |
+ | 0.8097 | 19.0 | 38 | 0.6013 |
+ | 0.6699 | 20.0 | 40 | 0.5934 |
+ | 0.6699 | 21.0 | 42 | 0.5781 |
+ | 0.6699 | 22.0 | 44 | 0.5656 |
+ | 0.6699 | 23.0 | 46 | 0.5601 |
+ | 0.6699 | 24.0 | 48 | 0.5424 |
+ | 0.5545 | 25.0 | 50 | 0.5361 |
+ | 0.5545 | 26.0 | 52 | 0.5246 |
+ | 0.5545 | 27.0 | 54 | 0.5220 |
+ | 0.5545 | 28.0 | 56 | 0.5176 |
+ | 0.5545 | 29.0 | 58 | 0.5143 |
+ | 0.4559 | 30.0 | 60 | 0.5167 |
+ | 0.4559 | 31.0 | 62 | 0.5079 |
+ | 0.4559 | 32.0 | 64 | 0.5148 |
+ | 0.4559 | 33.0 | 66 | 0.5128 |
+ | 0.4559 | 34.0 | 68 | 0.5095 |
+ | 0.3905 | 35.0 | 70 | 0.5110 |
+ | 0.3905 | 36.0 | 72 | 0.5153 |
+ | 0.3905 | 37.0 | 74 | 0.5135 |
+ | 0.3905 | 38.0 | 76 | 0.5136 |
+ | 0.3905 | 39.0 | 78 | 0.5196 |
+ | 0.3472 | 40.0 | 80 | 0.5182 |
+ | 0.3472 | 41.0 | 82 | 0.5164 |
+ | 0.3472 | 42.0 | 84 | 0.5180 |
+ | 0.3472 | 43.0 | 86 | 0.5214 |
+ | 0.3472 | 44.0 | 88 | 0.5235 |
+ | 0.328 | 45.0 | 90 | 0.5234 |
+ | 0.328 | 46.0 | 92 | 0.5228 |
+ | 0.328 | 47.0 | 94 | 0.5224 |
+ | 0.328 | 48.0 | 96 | 0.5225 |
+ | 0.328 | 49.0 | 98 | 0.5225 |
+ | 0.3182 | 50.0 | 100 | 0.5224 |
 
 
 ### Framework versions
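The hyperparameter changes above are related: under gradient accumulation, the card's `total_train_batch_size` is the per-device `train_batch_size` times `gradient_accumulation_steps` (times the number of devices, here apparently 1), which reproduces both the old value (8 × 8 = 64) and the new one (6 × 6 = 36). A minimal sketch of that arithmetic; the helper names are ours, and the steps-per-epoch figure is read off the new results table rather than stated in the card:

```python
def effective_batch_size(per_device: int, accum_steps: int, num_devices: int = 1) -> int:
    # Gradient accumulation raises the batch size seen by each optimizer
    # step without raising per-device memory use.
    return per_device * accum_steps * num_devices

# Values from the two versions of the model card.
assert effective_batch_size(8, 8) == 64   # old run
assert effective_batch_size(6, 6) == 36   # new run

# The new table advances 2 optimizer steps per epoch (epoch 1.0 at step 2),
# so the 100-step budget amounts to about 50 epochs, matching the final row.
steps_per_epoch = 2
training_steps = 100
assert training_steps / steps_per_epoch == 50.0
```

With 2 steps of 36 examples per epoch, the training set is evidently on the order of 72 examples, which is consistent with such a small per-epoch step count.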
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:90a2dea0208abfe136f0391f54bc4777fe30e9c0a6c4579f62c2cc46d80a686e
+ oid sha256:79cafcafb9ad0d4be9399770923dfac2b1aa3db203d956847af13edffcd80636
 size 13648432
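The binary files in this commit are stored as Git LFS pointers: what the repository versions is a small `key value` text stub, and the actual blob is fetched from LFS storage by its sha256 oid (which is why only the oid line changes when the weights are retrained). A minimal parser sketch for that stub format, assuming the simple one-space `key value` layout shown above (the function name is ours):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer stub into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new pointer for adapter_model.safetensors from this commit.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:79cafcafb9ad0d4be9399770923dfac2b1aa3db203d956847af13edffcd80636
size 13648432
"""

info = parse_lfs_pointer(pointer)
assert info["version"] == "https://git-lfs.github.com/spec/v1"
assert info["oid"].startswith("sha256:")
assert int(info["size"]) == 13648432  # adapter size in bytes, unchanged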
runs/Oct29_09-41-35_pest/events.out.tfevents.1730194896.pest.1.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:06d1dc35f2d3624e7470ee420229b3a674b4beb002e6f88682cbad789325614c
- size 21040
+ oid sha256:59f6200355543d0e768f976d7e2513a777c0004bbcc51a47cd266762b6d8f4af
+ size 21861