augustocsc committed on
Commit 1637bbd · verified · 1 Parent(s): 98e3af0

Training in progress, epoch 1
README.md ADDED
@@ -0,0 +1,427 @@
+ ---
+ library_name: peft
+ license: mit
+ base_model: gpt2
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: Se124M100KInfPrompt_NT
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # Se124M100KInfPrompt_NT
+
+ This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.3899
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
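+ Since the card leaves usage unspecified, here is a minimal inference sketch. The repo id `augustocsc/Se124M100KInfPrompt_NT` is inferred from the committer and model name and may need adjusting; the `<startofex>` prompt follows the special tokens registered in this repo's tokenizer files.
+
+ ```python
+ from peft import PeftModel
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ # Hypothetical repo id, inferred from the card's model name; adjust as needed.
+ ADAPTER_ID = "augustocsc/Se124M100KInfPrompt_NT"
+
+ tokenizer = AutoTokenizer.from_pretrained(ADAPTER_ID)
+ base = AutoModelForCausalLM.from_pretrained("gpt2")
+ # The tokenizer adds <pad>, <startofex> and <endofex> on top of GPT-2's vocab,
+ # so the base embeddings must be resized before loading the adapter.
+ base.resize_token_embeddings(len(tokenizer))
+ model = PeftModel.from_pretrained(base, ADAPTER_ID)
+ model.eval()
+
+ inputs = tokenizer("<startofex>", return_tensors="pt")
+ out = model.generate(**inputs, max_new_tokens=32,
+                      pad_token_id=tokenizer.pad_token_id)
+ text = tokenizer.decode(out[0], skip_special_tokens=True)
+ print(text)
+ ```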
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - gradient_accumulation_steps: 2
+ - total_train_batch_size: 32
+ - optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
+ - lr_scheduler_type: cosine
+ - lr_scheduler_warmup_ratio: 0.03
+ - num_epochs: 3
+
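+ The derived values above fit together as follows; this is a sanity check on the reported numbers, not part of the training script. The 2440 steps per epoch are read off the results table below.
+
+ ```python
+ # total_train_batch_size is derived, not set independently.
+ train_batch_size = 16
+ gradient_accumulation_steps = 2
+ total_train_batch_size = train_batch_size * gradient_accumulation_steps
+ print(total_train_batch_size)  # 32, matching the value reported above
+
+ # With ~2440 optimizer steps per epoch and 3 epochs, a warmup ratio of
+ # 0.03 corresponds to roughly this many warmup steps:
+ total_steps = 2440 * 3
+ warmup_steps = int(total_steps * 0.03)
+ print(warmup_steps)  # 219
+ ```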
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:------:|:----:|:---------------:|
+ | 2.9983 | 0.0082 | 20 | 2.6302 |
+ | 2.9256 | 0.0164 | 40 | 2.6331 |
+ | 2.9534 | 0.0246 | 60 | 2.6305 |
+ | 2.9277 | 0.0327 | 80 | 2.6052 |
+ | 2.8694 | 0.0409 | 100 | 2.5836 |
+ | 2.879 | 0.0491 | 120 | 2.5278 |
+ | 2.7972 | 0.0573 | 140 | 2.4722 |
+ | 2.7112 | 0.0655 | 160 | 2.4048 |
+ | 2.5739 | 0.0737 | 180 | 2.3244 |
+ | 2.4522 | 0.0819 | 200 | 2.2167 |
+ | 2.3121 | 0.0901 | 220 | 2.0842 |
+ | 2.1652 | 0.0982 | 240 | 1.9278 |
+ | 2.0135 | 0.1064 | 260 | 1.7658 |
+ | 1.8352 | 0.1146 | 280 | 1.5877 |
+ | 1.6331 | 0.1228 | 300 | 1.3988 |
+ | 1.4721 | 0.1310 | 320 | 1.2257 |
+ | 1.3347 | 0.1392 | 340 | 1.0901 |
+ | 1.202 | 0.1474 | 360 | 0.9639 |
+ | 1.125 | 0.1555 | 380 | 0.8691 |
+ | 1.002 | 0.1637 | 400 | 0.8003 |
+ | 0.9698 | 0.1719 | 420 | 0.7525 |
+ | 0.8963 | 0.1801 | 440 | 0.7148 |
+ | 0.8571 | 0.1883 | 460 | 0.6803 |
+ | 0.7983 | 0.1965 | 480 | 0.6542 |
+ | 0.7838 | 0.2047 | 500 | 0.6332 |
+ | 0.7689 | 0.2129 | 520 | 0.6118 |
+ | 0.7256 | 0.2210 | 540 | 0.5931 |
+ | 0.7146 | 0.2292 | 560 | 0.5799 |
+ | 0.686 | 0.2374 | 580 | 0.5673 |
+ | 0.6729 | 0.2456 | 600 | 0.5565 |
+ | 0.6628 | 0.2538 | 620 | 0.5445 |
+ | 0.6525 | 0.2620 | 640 | 0.5406 |
+ | 0.6298 | 0.2702 | 660 | 0.5328 |
+ | 0.6345 | 0.2783 | 680 | 0.5237 |
+ | 0.6171 | 0.2865 | 700 | 0.5169 |
+ | 0.6052 | 0.2947 | 720 | 0.5113 |
+ | 0.5862 | 0.3029 | 740 | 0.5066 |
+ | 0.5767 | 0.3111 | 760 | 0.5021 |
+ | 0.5777 | 0.3193 | 780 | 0.4966 |
+ | 0.5689 | 0.3275 | 800 | 0.4939 |
+ | 0.5677 | 0.3357 | 820 | 0.4894 |
+ | 0.5567 | 0.3438 | 840 | 0.4878 |
+ | 0.5547 | 0.3520 | 860 | 0.4817 |
+ | 0.5516 | 0.3602 | 880 | 0.4808 |
+ | 0.5577 | 0.3684 | 900 | 0.4787 |
+ | 0.5461 | 0.3766 | 920 | 0.4740 |
+ | 0.5449 | 0.3848 | 940 | 0.4712 |
+ | 0.5301 | 0.3930 | 960 | 0.4711 |
+ | 0.5313 | 0.4011 | 980 | 0.4682 |
+ | 0.5278 | 0.4093 | 1000 | 0.4676 |
+ | 0.518 | 0.4175 | 1020 | 0.4643 |
+ | 0.531 | 0.4257 | 1040 | 0.4621 |
+ | 0.5302 | 0.4339 | 1060 | 0.4624 |
+ | 0.5238 | 0.4421 | 1080 | 0.4581 |
+ | 0.5179 | 0.4503 | 1100 | 0.4572 |
+ | 0.5167 | 0.4585 | 1120 | 0.4577 |
+ | 0.5181 | 0.4666 | 1140 | 0.4534 |
+ | 0.5207 | 0.4748 | 1160 | 0.4536 |
+ | 0.5037 | 0.4830 | 1180 | 0.4533 |
+ | 0.5117 | 0.4912 | 1200 | 0.4517 |
+ | 0.5066 | 0.4994 | 1220 | 0.4500 |
+ | 0.5023 | 0.5076 | 1240 | 0.4487 |
+ | 0.4903 | 0.5158 | 1260 | 0.4470 |
+ | 0.4916 | 0.5239 | 1280 | 0.4462 |
+ | 0.4908 | 0.5321 | 1300 | 0.4460 |
+ | 0.4956 | 0.5403 | 1320 | 0.4443 |
+ | 0.5059 | 0.5485 | 1340 | 0.4438 |
+ | 0.4908 | 0.5567 | 1360 | 0.4427 |
+ | 0.4978 | 0.5649 | 1380 | 0.4416 |
+ | 0.4861 | 0.5731 | 1400 | 0.4410 |
+ | 0.4865 | 0.5813 | 1420 | 0.4404 |
+ | 0.4916 | 0.5894 | 1440 | 0.4381 |
+ | 0.4832 | 0.5976 | 1460 | 0.4352 |
+ | 0.4811 | 0.6058 | 1480 | 0.4381 |
+ | 0.4779 | 0.6140 | 1500 | 0.4364 |
+ | 0.4792 | 0.6222 | 1520 | 0.4381 |
+ | 0.4755 | 0.6304 | 1540 | 0.4346 |
+ | 0.4797 | 0.6386 | 1560 | 0.4358 |
+ | 0.4769 | 0.6467 | 1580 | 0.4321 |
+ | 0.4682 | 0.6549 | 1600 | 0.4323 |
+ | 0.4797 | 0.6631 | 1620 | 0.4338 |
+ | 0.4754 | 0.6713 | 1640 | 0.4332 |
+ | 0.4687 | 0.6795 | 1660 | 0.4325 |
+ | 0.4629 | 0.6877 | 1680 | 0.4330 |
+ | 0.478 | 0.6959 | 1700 | 0.4312 |
+ | 0.4693 | 0.7041 | 1720 | 0.4291 |
+ | 0.4746 | 0.7122 | 1740 | 0.4305 |
+ | 0.4626 | 0.7204 | 1760 | 0.4300 |
+ | 0.4641 | 0.7286 | 1780 | 0.4317 |
+ | 0.4606 | 0.7368 | 1800 | 0.4287 |
+ | 0.4678 | 0.7450 | 1820 | 0.4278 |
+ | 0.4736 | 0.7532 | 1840 | 0.4267 |
+ | 0.4739 | 0.7614 | 1860 | 0.4270 |
+ | 0.4627 | 0.7695 | 1880 | 0.4269 |
+ | 0.4596 | 0.7777 | 1900 | 0.4247 |
+ | 0.4617 | 0.7859 | 1920 | 0.4245 |
+ | 0.4663 | 0.7941 | 1940 | 0.4238 |
+ | 0.4569 | 0.8023 | 1960 | 0.4243 |
+ | 0.4683 | 0.8105 | 1980 | 0.4229 |
+ | 0.4664 | 0.8187 | 2000 | 0.4231 |
+ | 0.4711 | 0.8269 | 2020 | 0.4203 |
+ | 0.4712 | 0.8350 | 2040 | 0.4201 |
+ | 0.4579 | 0.8432 | 2060 | 0.4186 |
+ | 0.4688 | 0.8514 | 2080 | 0.4221 |
+ | 0.4566 | 0.8596 | 2100 | 0.4222 |
+ | 0.4573 | 0.8678 | 2120 | 0.4179 |
+ | 0.4606 | 0.8760 | 2140 | 0.4183 |
+ | 0.456 | 0.8842 | 2160 | 0.4189 |
+ | 0.4684 | 0.8923 | 2180 | 0.4180 |
+ | 0.4522 | 0.9005 | 2200 | 0.4183 |
+ | 0.4591 | 0.9087 | 2220 | 0.4171 |
+ | 0.457 | 0.9169 | 2240 | 0.4194 |
+ | 0.4714 | 0.9251 | 2260 | 0.4160 |
+ | 0.4637 | 0.9333 | 2280 | 0.4173 |
+ | 0.4454 | 0.9415 | 2300 | 0.4190 |
+ | 0.4579 | 0.9497 | 2320 | 0.4133 |
+ | 0.4567 | 0.9578 | 2340 | 0.4153 |
+ | 0.4479 | 0.9660 | 2360 | 0.4152 |
+ | 0.4523 | 0.9742 | 2380 | 0.4138 |
+ | 0.4559 | 0.9824 | 2400 | 0.4147 |
+ | 0.4493 | 0.9906 | 2420 | 0.4131 |
+ | 0.4568 | 0.9988 | 2440 | 0.4145 |
+ | 0.4494 | 1.0070 | 2460 | 0.4120 |
+ | 0.4549 | 1.0151 | 2480 | 0.4120 |
+ | 0.4491 | 1.0233 | 2500 | 0.4130 |
+ | 0.454 | 1.0315 | 2520 | 0.4143 |
+ | 0.4474 | 1.0397 | 2540 | 0.4134 |
+ | 0.4541 | 1.0479 | 2560 | 0.4134 |
+ | 0.4458 | 1.0561 | 2580 | 0.4117 |
+ | 0.4469 | 1.0643 | 2600 | 0.4108 |
+ | 0.4502 | 1.0725 | 2620 | 0.4120 |
+ | 0.4447 | 1.0806 | 2640 | 0.4102 |
+ | 0.445 | 1.0888 | 2660 | 0.4107 |
+ | 0.4496 | 1.0970 | 2680 | 0.4080 |
+ | 0.445 | 1.1052 | 2700 | 0.4097 |
+ | 0.4549 | 1.1134 | 2720 | 0.4071 |
+ | 0.4476 | 1.1216 | 2740 | 0.4095 |
+ | 0.4427 | 1.1298 | 2760 | 0.4111 |
+ | 0.4412 | 1.1379 | 2780 | 0.4091 |
+ | 0.441 | 1.1461 | 2800 | 0.4111 |
+ | 0.4465 | 1.1543 | 2820 | 0.4080 |
+ | 0.4427 | 1.1625 | 2840 | 0.4076 |
+ | 0.4417 | 1.1707 | 2860 | 0.4080 |
+ | 0.4409 | 1.1789 | 2880 | 0.4080 |
+ | 0.4573 | 1.1871 | 2900 | 0.4078 |
+ | 0.443 | 1.1953 | 2920 | 0.4067 |
+ | 0.4412 | 1.2034 | 2940 | 0.4079 |
+ | 0.4384 | 1.2116 | 2960 | 0.4079 |
+ | 0.4426 | 1.2198 | 2980 | 0.4083 |
+ | 0.4407 | 1.2280 | 3000 | 0.4056 |
+ | 0.4487 | 1.2362 | 3020 | 0.4059 |
+ | 0.4421 | 1.2444 | 3040 | 0.4064 |
+ | 0.4412 | 1.2526 | 3060 | 0.4057 |
+ | 0.4354 | 1.2607 | 3080 | 0.4073 |
+ | 0.4454 | 1.2689 | 3100 | 0.4056 |
+ | 0.4376 | 1.2771 | 3120 | 0.4064 |
+ | 0.4469 | 1.2853 | 3140 | 0.4043 |
+ | 0.4437 | 1.2935 | 3160 | 0.4038 |
+ | 0.4412 | 1.3017 | 3180 | 0.4031 |
+ | 0.4354 | 1.3099 | 3200 | 0.4053 |
+ | 0.4413 | 1.3181 | 3220 | 0.4050 |
+ | 0.4344 | 1.3262 | 3240 | 0.4048 |
+ | 0.4471 | 1.3344 | 3260 | 0.4022 |
+ | 0.4347 | 1.3426 | 3280 | 0.4049 |
+ | 0.4367 | 1.3508 | 3300 | 0.4019 |
+ | 0.4391 | 1.3590 | 3320 | 0.4033 |
+ | 0.4424 | 1.3672 | 3340 | 0.4019 |
+ | 0.4391 | 1.3754 | 3360 | 0.4009 |
+ | 0.4377 | 1.3835 | 3380 | 0.4014 |
+ | 0.4413 | 1.3917 | 3400 | 0.4015 |
+ | 0.4382 | 1.3999 | 3420 | 0.4006 |
+ | 0.4298 | 1.4081 | 3440 | 0.4015 |
+ | 0.4503 | 1.4163 | 3460 | 0.4019 |
+ | 0.4413 | 1.4245 | 3480 | 0.4015 |
+ | 0.4343 | 1.4327 | 3500 | 0.3996 |
+ | 0.4373 | 1.4409 | 3520 | 0.4002 |
+ | 0.4338 | 1.4490 | 3540 | 0.4016 |
+ | 0.4292 | 1.4572 | 3560 | 0.4000 |
+ | 0.4444 | 1.4654 | 3580 | 0.4004 |
+ | 0.4342 | 1.4736 | 3600 | 0.3996 |
+ | 0.4339 | 1.4818 | 3620 | 0.4004 |
+ | 0.4291 | 1.4900 | 3640 | 0.4006 |
+ | 0.435 | 1.4982 | 3660 | 0.3993 |
+ | 0.445 | 1.5063 | 3680 | 0.3999 |
+ | 0.4389 | 1.5145 | 3700 | 0.4009 |
+ | 0.4316 | 1.5227 | 3720 | 0.3988 |
+ | 0.4363 | 1.5309 | 3740 | 0.3994 |
+ | 0.4384 | 1.5391 | 3760 | 0.3995 |
+ | 0.4355 | 1.5473 | 3780 | 0.4006 |
+ | 0.436 | 1.5555 | 3800 | 0.3983 |
+ | 0.4384 | 1.5637 | 3820 | 0.3981 |
+ | 0.4394 | 1.5718 | 3840 | 0.3985 |
+ | 0.4392 | 1.5800 | 3860 | 0.3978 |
+ | 0.4456 | 1.5882 | 3880 | 0.3991 |
+ | 0.4359 | 1.5964 | 3900 | 0.3984 |
+ | 0.4328 | 1.6046 | 3920 | 0.4004 |
+ | 0.4272 | 1.6128 | 3940 | 0.3992 |
+ | 0.4352 | 1.6210 | 3960 | 0.3993 |
+ | 0.4262 | 1.6291 | 3980 | 0.3994 |
+ | 0.4406 | 1.6373 | 4000 | 0.3979 |
+ | 0.4291 | 1.6455 | 4020 | 0.3991 |
+ | 0.4262 | 1.6537 | 4040 | 0.3975 |
+ | 0.4337 | 1.6619 | 4060 | 0.3978 |
+ | 0.4404 | 1.6701 | 4080 | 0.3964 |
+ | 0.4408 | 1.6783 | 4100 | 0.3983 |
+ | 0.4378 | 1.6865 | 4120 | 0.3977 |
+ | 0.4322 | 1.6946 | 4140 | 0.3973 |
+ | 0.4343 | 1.7028 | 4160 | 0.3970 |
+ | 0.43 | 1.7110 | 4180 | 0.3961 |
+ | 0.4343 | 1.7192 | 4200 | 0.3958 |
+ | 0.4308 | 1.7274 | 4220 | 0.3965 |
+ | 0.4355 | 1.7356 | 4240 | 0.3952 |
+ | 0.4371 | 1.7438 | 4260 | 0.3966 |
+ | 0.4342 | 1.7519 | 4280 | 0.3956 |
+ | 0.4364 | 1.7601 | 4300 | 0.3962 |
+ | 0.434 | 1.7683 | 4320 | 0.3953 |
+ | 0.4335 | 1.7765 | 4340 | 0.3965 |
+ | 0.4317 | 1.7847 | 4360 | 0.3953 |
+ | 0.4298 | 1.7929 | 4380 | 0.3954 |
+ | 0.4307 | 1.8011 | 4400 | 0.3942 |
+ | 0.4345 | 1.8093 | 4420 | 0.3952 |
+ | 0.433 | 1.8174 | 4440 | 0.3943 |
+ | 0.4261 | 1.8256 | 4460 | 0.3955 |
+ | 0.4338 | 1.8338 | 4480 | 0.3950 |
+ | 0.4263 | 1.8420 | 4500 | 0.3944 |
+ | 0.4263 | 1.8502 | 4520 | 0.3939 |
+ | 0.436 | 1.8584 | 4540 | 0.3943 |
+ | 0.432 | 1.8666 | 4560 | 0.3946 |
+ | 0.4302 | 1.8747 | 4580 | 0.3942 |
+ | 0.4333 | 1.8829 | 4600 | 0.3936 |
+ | 0.4316 | 1.8911 | 4620 | 0.3936 |
+ | 0.4294 | 1.8993 | 4640 | 0.3938 |
+ | 0.4265 | 1.9075 | 4660 | 0.3936 |
+ | 0.4294 | 1.9157 | 4680 | 0.3943 |
+ | 0.4319 | 1.9239 | 4700 | 0.3942 |
+ | 0.4391 | 1.9321 | 4720 | 0.3933 |
+ | 0.4243 | 1.9402 | 4740 | 0.3944 |
+ | 0.4325 | 1.9484 | 4760 | 0.3930 |
+ | 0.4343 | 1.9566 | 4780 | 0.3924 |
+ | 0.4287 | 1.9648 | 4800 | 0.3938 |
+ | 0.4322 | 1.9730 | 4820 | 0.3933 |
+ | 0.4283 | 1.9812 | 4840 | 0.3926 |
+ | 0.4309 | 1.9894 | 4860 | 0.3935 |
+ | 0.4238 | 1.9975 | 4880 | 0.3922 |
+ | 0.4217 | 2.0057 | 4900 | 0.3925 |
+ | 0.425 | 2.0139 | 4920 | 0.3926 |
+ | 0.4389 | 2.0221 | 4940 | 0.3925 |
+ | 0.4346 | 2.0303 | 4960 | 0.3920 |
+ | 0.4254 | 2.0385 | 4980 | 0.3931 |
+ | 0.4223 | 2.0467 | 5000 | 0.3919 |
+ | 0.4268 | 2.0549 | 5020 | 0.3930 |
+ | 0.4228 | 2.0630 | 5040 | 0.3929 |
+ | 0.4325 | 2.0712 | 5060 | 0.3928 |
+ | 0.4255 | 2.0794 | 5080 | 0.3928 |
+ | 0.4305 | 2.0876 | 5100 | 0.3922 |
+ | 0.4333 | 2.0958 | 5120 | 0.3919 |
+ | 0.4332 | 2.1040 | 5140 | 0.3927 |
+ | 0.4261 | 2.1122 | 5160 | 0.3929 |
+ | 0.429 | 2.1203 | 5180 | 0.3916 |
+ | 0.4274 | 2.1285 | 5200 | 0.3921 |
+ | 0.4277 | 2.1367 | 5220 | 0.3928 |
+ | 0.4356 | 2.1449 | 5240 | 0.3913 |
+ | 0.4268 | 2.1531 | 5260 | 0.3921 |
+ | 0.4297 | 2.1613 | 5280 | 0.3921 |
+ | 0.4272 | 2.1695 | 5300 | 0.3915 |
+ | 0.4337 | 2.1777 | 5320 | 0.3922 |
+ | 0.4312 | 2.1858 | 5340 | 0.3911 |
+ | 0.426 | 2.1940 | 5360 | 0.3917 |
+ | 0.4305 | 2.2022 | 5380 | 0.3925 |
+ | 0.4373 | 2.2104 | 5400 | 0.3919 |
+ | 0.4319 | 2.2186 | 5420 | 0.3914 |
+ | 0.43 | 2.2268 | 5440 | 0.3921 |
+ | 0.4307 | 2.2350 | 5460 | 0.3910 |
+ | 0.4352 | 2.2431 | 5480 | 0.3912 |
+ | 0.4323 | 2.2513 | 5500 | 0.3907 |
+ | 0.4255 | 2.2595 | 5520 | 0.3905 |
+ | 0.4286 | 2.2677 | 5540 | 0.3913 |
+ | 0.4271 | 2.2759 | 5560 | 0.3916 |
+ | 0.4319 | 2.2841 | 5580 | 0.3915 |
+ | 0.4175 | 2.2923 | 5600 | 0.3911 |
+ | 0.424 | 2.3005 | 5620 | 0.3914 |
+ | 0.4365 | 2.3086 | 5640 | 0.3907 |
+ | 0.4322 | 2.3168 | 5660 | 0.3906 |
+ | 0.4227 | 2.3250 | 5680 | 0.3910 |
+ | 0.4308 | 2.3332 | 5700 | 0.3909 |
+ | 0.4268 | 2.3414 | 5720 | 0.3910 |
+ | 0.4352 | 2.3496 | 5740 | 0.3911 |
+ | 0.4274 | 2.3578 | 5760 | 0.3898 |
+ | 0.4255 | 2.3659 | 5780 | 0.3901 |
+ | 0.4277 | 2.3741 | 5800 | 0.3903 |
+ | 0.4209 | 2.3823 | 5820 | 0.3905 |
+ | 0.4221 | 2.3905 | 5840 | 0.3911 |
+ | 0.4247 | 2.3987 | 5860 | 0.3911 |
+ | 0.4263 | 2.4069 | 5880 | 0.3910 |
+ | 0.4284 | 2.4151 | 5900 | 0.3912 |
+ | 0.4251 | 2.4233 | 5920 | 0.3910 |
+ | 0.4275 | 2.4314 | 5940 | 0.3908 |
+ | 0.4271 | 2.4396 | 5960 | 0.3904 |
+ | 0.4333 | 2.4478 | 5980 | 0.3904 |
+ | 0.4237 | 2.4560 | 6000 | 0.3903 |
+ | 0.4351 | 2.4642 | 6020 | 0.3903 |
+ | 0.4313 | 2.4724 | 6040 | 0.3902 |
+ | 0.4243 | 2.4806 | 6060 | 0.3910 |
+ | 0.4289 | 2.4887 | 6080 | 0.3907 |
+ | 0.4299 | 2.4969 | 6100 | 0.3909 |
+ | 0.428 | 2.5051 | 6120 | 0.3903 |
+ | 0.4202 | 2.5133 | 6140 | 0.3902 |
+ | 0.4291 | 2.5215 | 6160 | 0.3899 |
+ | 0.4344 | 2.5297 | 6180 | 0.3899 |
+ | 0.4256 | 2.5379 | 6200 | 0.3902 |
+ | 0.4227 | 2.5460 | 6220 | 0.3904 |
+ | 0.43 | 2.5542 | 6240 | 0.3907 |
+ | 0.4252 | 2.5624 | 6260 | 0.3900 |
+ | 0.4224 | 2.5706 | 6280 | 0.3909 |
+ | 0.4207 | 2.5788 | 6300 | 0.3909 |
+ | 0.4265 | 2.5870 | 6320 | 0.3906 |
+ | 0.4341 | 2.5952 | 6340 | 0.3907 |
+ | 0.4228 | 2.6034 | 6360 | 0.3903 |
+ | 0.4196 | 2.6115 | 6380 | 0.3904 |
+ | 0.4216 | 2.6197 | 6400 | 0.3897 |
+ | 0.4339 | 2.6279 | 6420 | 0.3904 |
+ | 0.4255 | 2.6361 | 6440 | 0.3903 |
+ | 0.4261 | 2.6443 | 6460 | 0.3905 |
+ | 0.43 | 2.6525 | 6480 | 0.3906 |
+ | 0.4265 | 2.6607 | 6500 | 0.3907 |
+ | 0.4279 | 2.6688 | 6520 | 0.3904 |
+ | 0.4298 | 2.6770 | 6540 | 0.3901 |
+ | 0.4312 | 2.6852 | 6560 | 0.3901 |
+ | 0.4199 | 2.6934 | 6580 | 0.3898 |
+ | 0.4288 | 2.7016 | 6600 | 0.3902 |
+ | 0.4325 | 2.7098 | 6620 | 0.3905 |
+ | 0.4246 | 2.7180 | 6640 | 0.3903 |
+ | 0.4281 | 2.7262 | 6660 | 0.3899 |
+ | 0.4296 | 2.7343 | 6680 | 0.3903 |
+ | 0.4247 | 2.7425 | 6700 | 0.3898 |
+ | 0.4252 | 2.7507 | 6720 | 0.3905 |
+ | 0.4255 | 2.7589 | 6740 | 0.3904 |
+ | 0.4282 | 2.7671 | 6760 | 0.3902 |
+ | 0.4225 | 2.7753 | 6780 | 0.3900 |
+ | 0.4251 | 2.7835 | 6800 | 0.3900 |
+ | 0.4201 | 2.7916 | 6820 | 0.3903 |
+ | 0.4252 | 2.7998 | 6840 | 0.3905 |
+ | 0.427 | 2.8080 | 6860 | 0.3907 |
+ | 0.428 | 2.8162 | 6880 | 0.3907 |
+ | 0.437 | 2.8244 | 6900 | 0.3900 |
+ | 0.4257 | 2.8326 | 6920 | 0.3901 |
+ | 0.4239 | 2.8408 | 6940 | 0.3905 |
+ | 0.4276 | 2.8490 | 6960 | 0.3902 |
+ | 0.4274 | 2.8571 | 6980 | 0.3897 |
+ | 0.4327 | 2.8653 | 7000 | 0.3902 |
+ | 0.4313 | 2.8735 | 7020 | 0.3896 |
+ | 0.4277 | 2.8817 | 7040 | 0.3904 |
+ | 0.4289 | 2.8899 | 7060 | 0.3904 |
+ | 0.4321 | 2.8981 | 7080 | 0.3900 |
+ | 0.4232 | 2.9063 | 7100 | 0.3902 |
+ | 0.4274 | 2.9144 | 7120 | 0.3901 |
+ | 0.4339 | 2.9226 | 7140 | 0.3901 |
+ | 0.4226 | 2.9308 | 7160 | 0.3904 |
+ | 0.4184 | 2.9390 | 7180 | 0.3902 |
+ | 0.4242 | 2.9472 | 7200 | 0.3901 |
+ | 0.4259 | 2.9554 | 7220 | 0.3902 |
+ | 0.4297 | 2.9636 | 7240 | 0.3897 |
+ | 0.4268 | 2.9718 | 7260 | 0.3900 |
+ | 0.4281 | 2.9799 | 7280 | 0.3900 |
+ | 0.4234 | 2.9881 | 7300 | 0.3901 |
+ | 0.4196 | 2.9963 | 7320 | 0.3900 |
+
+
+ ### Framework versions
+
+ - PEFT 0.15.1
+ - Transformers 4.51.3
+ - PyTorch 2.6.0+cu118
+ - Datasets 3.5.0
+ - Tokenizers 0.21.1
adapter_config.json ADDED
@@ -0,0 +1,33 @@
+ {
+   "alpha_pattern": {},
+   "auto_mapping": null,
+   "base_model_name_or_path": "gpt2",
+   "bias": "none",
+   "corda_config": null,
+   "eva_config": null,
+   "exclude_modules": null,
+   "fan_in_fan_out": true,
+   "inference_mode": true,
+   "init_lora_weights": true,
+   "layer_replication": null,
+   "layers_pattern": null,
+   "layers_to_transform": null,
+   "loftq_config": {},
+   "lora_alpha": 32,
+   "lora_bias": false,
+   "lora_dropout": 0.05,
+   "megatron_config": null,
+   "megatron_core": "megatron.core",
+   "modules_to_save": null,
+   "peft_type": "LORA",
+   "r": 8,
+   "rank_pattern": {},
+   "revision": null,
+   "target_modules": [
+     "c_attn"
+   ],
+   "task_type": "CAUSAL_LM",
+   "trainable_token_indices": null,
+   "use_dora": false,
+   "use_rslora": false
+ }
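+ A rough, back-of-envelope estimate of the trainable parameters this config implies: r=8 LoRA adapters on every `c_attn` projection of gpt2 (12 layers, hidden size 768, fused QKV output 3×768), each contributing an A matrix (in × r) and a B matrix (r × out). This is a sketch from the config values, not a number reported in the repo.
+
+ ```python
+ # LoRA adapter size implied by r=8 on gpt2's c_attn layers.
+ n_layers, d_model, r = 12, 768, 8
+ c_attn_out = 3 * d_model          # fused query/key/value projection output
+ per_layer = r * (d_model + c_attn_out)  # A: (d_model x r) + B: (r x c_attn_out)
+ total_trainable = n_layers * per_layer
+ print(total_trainable)  # 294912
+ ```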
adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8aaf332fb2b5a94ea406fe126d993b06ee1d77486027655733f14fa1dab70ad6
+ size 309980480
added_tokens.json ADDED
@@ -0,0 +1,5 @@
+ {
+   "<endofex>": 50259,
+   "<pad>": 50257,
+   "<startofex>": 50258
+ }
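+ These ids sit directly on top of GPT-2's base vocabulary, which ends at id 50256 (`<|endoftext|>`), so the extended vocabulary has 50260 entries:
+
+ ```python
+ # The three added tokens extend GPT-2's base vocab (ids 0..50256).
+ added_tokens = {"<pad>": 50257, "<startofex>": 50258, "<endofex>": 50259}
+ base_vocab_size = 50257
+ assert all(i >= base_vocab_size for i in added_tokens.values())
+ extended_vocab_size = base_vocab_size + len(added_tokens)
+ print(extended_vocab_size)  # 50260
+ ```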
all_results.json ADDED
@@ -0,0 +1,13 @@
+ {
+   "epoch": 3.0,
+   "eval_loss": 0.38987866044044495,
+   "eval_runtime": 27.8251,
+   "eval_samples_per_second": 602.01,
+   "eval_steps_per_second": 37.628,
+   "perplexity": 1.4768015885562091,
+   "total_flos": 1.4983411323027456e+16,
+   "train_loss": 0.5561708558514981,
+   "train_runtime": 11243.154,
+   "train_samples_per_second": 20.858,
+   "train_steps_per_second": 0.652
+ }
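+ The reported perplexity is simply the exponential of the evaluation loss, which is easy to verify from the numbers above:
+
+ ```python
+ import math
+
+ # perplexity = exp(eval_loss) for a causal LM evaluated with cross-entropy.
+ eval_loss = 0.38987866044044495
+ perplexity = math.exp(eval_loss)
+ print(perplexity)  # ~1.4768, matching the "perplexity" field above
+ ```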
eval_results.json ADDED
@@ -0,0 +1,8 @@
+ {
+   "epoch": 3.0,
+   "eval_loss": 0.38987866044044495,
+   "eval_runtime": 27.8251,
+   "eval_samples_per_second": 602.01,
+   "eval_steps_per_second": 37.628,
+   "perplexity": 1.4768015885562091
+ }
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
special_tokens_map.json ADDED
@@ -0,0 +1,28 @@
+ {
+   "additional_special_tokens": [
+     {
+       "content": "<startofex>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false
+     },
+     {
+       "content": "<endofex>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false
+     }
+   ],
+   "bos_token": "<|endoftext|>",
+   "eos_token": "<|endoftext|>",
+   "pad_token": {
+     "content": "<pad>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": "<|endoftext|>"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,49 @@
+ {
+   "add_prefix_space": false,
+   "added_tokens_decoder": {
+     "50256": {
+       "content": "<|endoftext|>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "50257": {
+       "content": "<pad>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "50258": {
+       "content": "<startofex>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "50259": {
+       "content": "<endofex>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "additional_special_tokens": [
+     "<startofex>",
+     "<endofex>"
+   ],
+   "bos_token": "<|endoftext|>",
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "<|endoftext|>",
+   "extra_special_tokens": {},
+   "model_max_length": 1024,
+   "pad_token": "<pad>",
+   "tokenizer_class": "GPT2Tokenizer",
+   "unk_token": "<|endoftext|>"
+ }
train_results.json ADDED
@@ -0,0 +1,8 @@
+ {
+   "epoch": 3.0,
+   "total_flos": 1.4983411323027456e+16,
+   "train_loss": 0.5561708558514981,
+   "train_runtime": 11243.154,
+   "train_samples_per_second": 20.858,
+   "train_steps_per_second": 0.652
+ }
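+ The throughput figures above are internally consistent with the training log: runtime times steps-per-second should land near the 7320 optimizer steps the results table records over 3 epochs.
+
+ ```python
+ # Cross-check of the reported throughput against the logged step count.
+ train_runtime = 11243.154          # seconds
+ train_steps_per_second = 0.652
+ approx_steps = train_runtime * train_steps_per_second
+ # ~7330.5, consistent with 7320 steps given rounding in the reported rate.
+ assert abs(approx_steps - 7320) < 20
+ print(round(approx_steps))
+ ```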
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:53564b62c1a8e9108e0cbed2d27783c81779e3b6b17ff3068341bc2e47576db5
+ size 5432
vocab.json ADDED
The diff for this file is too large to render. See raw diff