Xibanya committed
Commit 53201f2 · 1 Parent(s): 4c39b3e

Upload README.md

Files changed (1):
  1. README.md +43 -3

README.md CHANGED
@@ -14,7 +14,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 3.4600
+- Loss: 3.4079
 
 ## Model description
 
@@ -36,11 +36,11 @@ The following hyperparameters were used during training:
 - learning_rate: 1.372e-07
 - train_batch_size: 1
 - eval_batch_size: 1
-- seed: 1523398255
+- seed: 3138344630
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 10
-- num_epochs: 30
+- num_epochs: 100
 - mixed_precision_training: Native AMP
 
 ### Training results
@@ -65,6 +65,46 @@ The following hyperparameters were used during training:
 | 1.1578 | 28.0 | 18564 | 3.4939 |
 | 1.0987 | 29.0 | 19227 | 3.4947 |
 | 1.0779 | 30.0 | 19890 | 3.4972 |
+| 1.3567 | 61.0 | 20191 | 3.4576 |
+| 1.3278 | 62.0 | 20522 | 3.4528 |
+| 1.3292 | 63.0 | 20853 | 3.4468 |
+| 1.3285 | 64.0 | 21184 | 3.4431 |
+| 1.3032 | 65.0 | 21515 | 3.4370 |
+| 1.318  | 66.0 | 21846 | 3.4345 |
+| 1.3003 | 67.0 | 22177 | 3.4289 |
+| 1.3202 | 68.0 | 22508 | 3.4274 |
+| 1.2643 | 69.0 | 22839 | 3.4232 |
+| 1.2862 | 70.0 | 23170 | 3.4223 |
+| 1.2597 | 71.0 | 23501 | 3.4186 |
+| 1.2426 | 72.0 | 23832 | 3.4176 |
+| 1.2539 | 73.0 | 24163 | 3.4152 |
+| 1.2604 | 74.0 | 24494 | 3.4147 |
+| 1.263  | 75.0 | 24825 | 3.4128 |
+| 1.2642 | 76.0 | 25156 | 3.4127 |
+| 1.2694 | 77.0 | 25487 | 3.4109 |
+| 1.2251 | 78.0 | 25818 | 3.4106 |
+| 1.2673 | 79.0 | 26149 | 3.4097 |
+| 1.233  | 80.0 | 26480 | 3.4096 |
+| 1.2408 | 81.0 | 26811 | 3.4087 |
+| 1.2579 | 82.0 | 27142 | 3.4088 |
+| 1.2346 | 83.0 | 27473 | 3.4081 |
+| 1.2298 | 84.0 | 27804 | 3.4082 |
+| 1.219  | 85.0 | 28135 | 3.4079 |
+| 1.2515 | 86.0 | 28466 | 3.4080 |
+| 1.2316 | 87.0 | 28797 | 3.4084 |
+| 1.2085 | 88.0 | 29128 | 3.4085 |
+| 1.2334 | 89.0 | 29459 | 3.4085 |
+| 1.2263 | 90.0 | 29790 | 3.4084 |
+| 1.2312 | 91.0 | 30121 | 3.4084 |
+| 1.2584 | 92.0 | 30452 | 3.4086 |
+| 1.2106 | 93.0 | 30783 | 3.4089 |
+| 1.2078 | 94.0 | 31114 | 3.4091 |
+| 1.2329 | 95.0 | 31445 | 3.4090 |
+| 1.1836 | 96.0 | 31776 | 3.4097 |
+| 1.2135 | 97.0 | 32107 | 3.4097 |
+| 1.2372 | 98.0 | 32438 | 3.4099 |
+| 1.2163 | 99.0 | 32769 | 3.4107 |
+| 1.1937 | 100.0 | 33100 | 3.4110 |
 
 
 ### Framework versions
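
The updated hyperparameter hunk reads like a standard Hugging Face Trainer configuration. The sketch below maps the card's fields onto the usual `transformers.TrainingArguments` parameter names; this is an assumption for illustration (the training script itself is not part of this commit), so it is kept as a plain dict rather than a real `TrainingArguments` instance:

```python
# Hypothetical reconstruction of the updated training configuration.
# Parameter names follow transformers.TrainingArguments conventions;
# kept as a plain dict so the sketch runs without the transformers package.
training_config = dict(
    learning_rate=1.372e-07,
    per_device_train_batch_size=1,   # train_batch_size: 1
    per_device_eval_batch_size=1,    # eval_batch_size: 1
    seed=3138344630,                 # new seed in this commit
    lr_scheduler_type="linear",
    warmup_steps=10,                 # lr_scheduler_warmup_steps: 10
    num_train_epochs=100,            # raised from 30 in this commit
    fp16=True,                       # "Native AMP" mixed-precision training
)

# The results table implies 331 optimizer steps per epoch in the extended run:
# e.g. (33100 - 20191) steps over epochs 61..100.
steps_per_epoch = (33100 - 20191) // (100 - 61)
```

Note that the evaluation loss in the table bottoms out at 3.4079 around epoch 85 and drifts upward afterward, which matches the updated headline loss of 3.4079.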