amixh committed (verified) · Commit e6d40bd · 1 Parent(s): de7ea8a

Upload folder using huggingface_hub

README.md ADDED
@@ -0,0 +1,202 @@
+ ---
+ base_model: google-t5/t5-base
+ library_name: peft
+ ---
+
+ # Model Card for a LoRA Adapter of google-t5/t5-base
+
+ <!-- Provide a quick summary of what the model is/does. -->
+
+ This repository contains a LoRA adapter, trained with the PEFT library, for the google-t5/t5-base model.
+
+ ## Model Details
+
+ ### Model Description
+
+ <!-- Provide a longer summary of what this model is. -->
+
+
+
+ - **Developed by:** [More Information Needed]
+ - **Funded by [optional]:** [More Information Needed]
+ - **Shared by [optional]:** [More Information Needed]
+ - **Model type:** PEFT LoRA adapter for sequence-to-sequence language modeling
+ - **Language(s) (NLP):** [More Information Needed]
+ - **License:** [More Information Needed]
+ - **Finetuned from model [optional]:** google-t5/t5-base
+
+ ### Model Sources [optional]
+
+ <!-- Provide the basic links for the model. -->
+
+ - **Repository:** [More Information Needed]
+ - **Paper [optional]:** [More Information Needed]
+ - **Demo [optional]:** [More Information Needed]
+
+ ## Uses
+
+ <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
+
+ ### Direct Use
+
+ <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
+
+ [More Information Needed]
+
+ ### Downstream Use [optional]
+
+ <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
+
+ [More Information Needed]
+
+ ### Out-of-Scope Use
+
+ <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
+
+ [More Information Needed]
+
+ ## Bias, Risks, and Limitations
+
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
+
+ [More Information Needed]
+
+ ### Recommendations
+
+ <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
+
+ Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
+
+ ## How to Get Started with the Model
+
+ Use the code below to get started with the model.
+
+ [More Information Needed]
+
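A minimal loading sketch (not part of the uploaded card): the repository id and the example prompt are placeholders, while the base model and adapter settings come from the YAML header above and from `adapter_config.json` in this upload.

```python
# Sketch only: adapter_id and the prompt below are hypothetical placeholders.
from peft import PeftConfig, PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

adapter_id = "your-username/t5-base-lora-adapter"  # replace with the actual Hub id or a local path

# adapter_config.json records the base model (google-t5/t5-base), so it can be recovered here.
peft_config = PeftConfig.from_pretrained(adapter_id)
base_model = AutoModelForSeq2SeqLM.from_pretrained(peft_config.base_model_name_or_path)
tokenizer = AutoTokenizer.from_pretrained(peft_config.base_model_name_or_path)

# Attach the LoRA weights stored in adapter_model.safetensors to the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# The training task is not documented above, so this input is purely illustrative.
inputs = tokenizer("summarize: PEFT adapters keep the base model frozen.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```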
+ ## Training Details
+
+ ### Training Data
+
+ <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
+
+ [More Information Needed]
+
+ ### Training Procedure
+
+ <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
+
+ #### Preprocessing [optional]
+
+ [More Information Needed]
+
+
+ #### Training Hyperparameters
+
+ - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
+
+ #### Speeds, Sizes, Times [optional]
+
+ <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
+
+ [More Information Needed]
+
+ ## Evaluation
+
+ <!-- This section describes the evaluation protocols and provides the results. -->
+
+ ### Testing Data, Factors & Metrics
+
+ #### Testing Data
+
+ <!-- This should link to a Dataset Card if possible. -->
+
+ [More Information Needed]
+
+ #### Factors
+
+ <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
+
+ [More Information Needed]
+
+ #### Metrics
+
+ <!-- These are the evaluation metrics being used, ideally with a description of why. -->
+
+ [More Information Needed]
+
+ ### Results
+
+ [More Information Needed]
+
+ #### Summary
+
+
+
+ ## Model Examination [optional]
+
+ <!-- Relevant interpretability work for the model goes here -->
+
+ [More Information Needed]
+
+ ## Environmental Impact
+
+ <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
+
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
+
+ - **Hardware Type:** [More Information Needed]
+ - **Hours used:** [More Information Needed]
+ - **Cloud Provider:** [More Information Needed]
+ - **Compute Region:** [More Information Needed]
+ - **Carbon Emitted:** [More Information Needed]
+
+ ## Technical Specifications [optional]
+
+ ### Model Architecture and Objective
+
+ [More Information Needed]
+
+ ### Compute Infrastructure
+
+ [More Information Needed]
+
+ #### Hardware
+
+ [More Information Needed]
+
+ #### Software
+
+ [More Information Needed]
+
+ ## Citation [optional]
+
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
+
+ **BibTeX:**
+
+ [More Information Needed]
+
+ **APA:**
+
+ [More Information Needed]
+
+ ## Glossary [optional]
+
+ <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
+
+ [More Information Needed]
+
+ ## More Information [optional]
+
+ [More Information Needed]
+
+ ## Model Card Authors [optional]
+
+ [More Information Needed]
+
+ ## Model Card Contact
+
+ [More Information Needed]
+
+ ### Framework versions
+
+ - PEFT 0.14.0
adapter_config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "alpha_pattern": {},
+   "auto_mapping": null,
+   "base_model_name_or_path": "google-t5/t5-base",
+   "bias": "none",
+   "eva_config": null,
+   "exclude_modules": null,
+   "fan_in_fan_out": false,
+   "inference_mode": true,
+   "init_lora_weights": true,
+   "layer_replication": null,
+   "layers_pattern": null,
+   "layers_to_transform": null,
+   "loftq_config": {},
+   "lora_alpha": 16,
+   "lora_bias": false,
+   "lora_dropout": 0.1,
+   "megatron_config": null,
+   "megatron_core": "megatron.core",
+   "modules_to_save": null,
+   "peft_type": "LORA",
+   "r": 8,
+   "rank_pattern": {},
+   "revision": null,
+   "target_modules": [
+     "v",
+     "q"
+   ],
+   "task_type": "SEQ_2_SEQ_LM",
+   "use_dora": false,
+   "use_rslora": false
+ }
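The adapter configuration above maps directly onto a PEFT `LoraConfig`. The sketch below reconstructs it from the explicitly set fields (everything else stays at PEFT defaults); it is not the author's training script, and the parameter-count comment is an estimate.

```python
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForSeq2SeqLM

# Field values copied from adapter_config.json above.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=8,                        # rank of the LoRA update matrices
    lora_alpha=16,              # scaling factor (alpha / r = 2 effective scale)
    lora_dropout=0.1,
    target_modules=["q", "v"],  # T5 attention query and value projections
    bias="none",
)

base_model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-base")
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # roughly 0.9M trainable LoRA parameters (~0.4% of t5-base)
```

The ~3.6 MB `adapter_model.safetensors` below is consistent with that estimate (about 0.9M float32 parameters).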
adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:51ff91d695cf029a9217fb18256652def386a924c8fd1b7393047c9ed586776a
+ size 3558888
optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:54b97daf1e26716177b294e2f9b94f7922f720f786e99a76f6bdde036af89f21
+ size 7198906
rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:03b4ac13c134dc5a909adf4032def310416b110ab850dfd4cfe65649b32f74aa
+ size 14244
scaler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9ddb0b226bbe8b1106efb06bd57e7a2d7916d4707d8a9b0065b0e67a9a15851c
+ size 988
scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9e18d0d4dc766d8bcafad9e36cb03b367541123dc8acb79977bde099252eee66
+ size 1064
special_tokens_map.json ADDED
@@ -0,0 +1,107 @@
+ {
+   "additional_special_tokens": [
+     "<extra_id_0>",
+     "<extra_id_1>",
+     "<extra_id_2>",
+     "<extra_id_3>",
+     "<extra_id_4>",
+     "<extra_id_5>",
+     "<extra_id_6>",
+     "<extra_id_7>",
+     "<extra_id_8>",
+     "<extra_id_9>",
+     "<extra_id_10>",
+     "<extra_id_11>",
+     "<extra_id_12>",
+     "<extra_id_13>",
+     "<extra_id_14>",
+     "<extra_id_15>",
+     "<extra_id_16>",
+     "<extra_id_17>",
+     "<extra_id_18>",
+     "<extra_id_19>",
+     "<extra_id_20>",
+     "<extra_id_21>",
+     "<extra_id_22>",
+     "<extra_id_23>",
+     "<extra_id_24>",
+     "<extra_id_25>",
+     "<extra_id_26>",
+     "<extra_id_27>",
+     "<extra_id_28>",
+     "<extra_id_29>",
+     "<extra_id_30>",
+     "<extra_id_31>",
+     "<extra_id_32>",
+     "<extra_id_33>",
+     "<extra_id_34>",
+     "<extra_id_35>",
+     "<extra_id_36>",
+     "<extra_id_37>",
+     "<extra_id_38>",
+     "<extra_id_39>",
+     "<extra_id_40>",
+     "<extra_id_41>",
+     "<extra_id_42>",
+     "<extra_id_43>",
+     "<extra_id_44>",
+     "<extra_id_45>",
+     "<extra_id_46>",
+     "<extra_id_47>",
+     "<extra_id_48>",
+     "<extra_id_49>",
+     "<extra_id_50>",
+     "<extra_id_51>",
+     "<extra_id_52>",
+     "<extra_id_53>",
+     "<extra_id_54>",
+     "<extra_id_55>",
+     "<extra_id_56>",
+     "<extra_id_57>",
+     "<extra_id_58>",
+     "<extra_id_59>",
+     "<extra_id_60>",
+     "<extra_id_61>",
+     "<extra_id_62>",
+     "<extra_id_63>",
+     "<extra_id_64>",
+     "<extra_id_65>",
+     "<extra_id_66>",
+     "<extra_id_67>",
+     "<extra_id_68>",
+     "<extra_id_69>",
+     "<extra_id_70>",
+     "<extra_id_71>",
+     "<extra_id_72>",
+     "<extra_id_73>",
+     "<extra_id_74>",
+     "<extra_id_75>",
+     "<extra_id_76>",
+     "<extra_id_77>",
+     "<extra_id_78>",
+     "<extra_id_79>",
+     "<extra_id_80>",
+     "<extra_id_81>",
+     "<extra_id_82>",
+     "<extra_id_83>",
+     "<extra_id_84>",
+     "<extra_id_85>",
+     "<extra_id_86>",
+     "<extra_id_87>",
+     "<extra_id_88>",
+     "<extra_id_89>",
+     "<extra_id_90>",
+     "<extra_id_91>",
+     "<extra_id_92>",
+     "<extra_id_93>",
+     "<extra_id_94>",
+     "<extra_id_95>",
+     "<extra_id_96>",
+     "<extra_id_97>",
+     "<extra_id_98>",
+     "<extra_id_99>"
+   ],
+   "eos_token": "</s>",
+   "pad_token": "<pad>",
+   "unk_token": "<unk>"
+ }
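The `<extra_id_*>` entries above are T5's 100 standard sentinel tokens (`extra_ids: 100` in `tokenizer_config.json` further below), not task-specific additions; they occupy the top of the vocabulary. A small check, assuming the tokenizer is loaded from a local copy of this folder:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./")  # hypothetical local path to this upload

# Sentinel tokens take the highest ids: <extra_id_0> -> 32099 down to <extra_id_99> -> 32000.
print(tokenizer.convert_tokens_to_ids("<extra_id_0>"))   # 32099
print(tokenizer.convert_tokens_to_ids("<extra_id_99>"))  # 32000
print(tokenizer.eos_token, tokenizer.pad_token, tokenizer.unk_token)  # </s> <pad> <unk>
```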
spiece.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d60acb128cf7b7f2536e8f38a5b18a05535c9e14c7a355904270e15b0945ea86
+ size 791656
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,939 @@
1
+ {
2
+ "add_prefix_space": null,
3
+ "added_tokens_decoder": {
4
+ "0": {
5
+ "content": "<pad>",
6
+ "lstrip": false,
7
+ "normalized": false,
8
+ "rstrip": false,
9
+ "single_word": false,
10
+ "special": true
11
+ },
12
+ "1": {
13
+ "content": "</s>",
14
+ "lstrip": false,
15
+ "normalized": false,
16
+ "rstrip": false,
17
+ "single_word": false,
18
+ "special": true
19
+ },
20
+ "2": {
21
+ "content": "<unk>",
22
+ "lstrip": false,
23
+ "normalized": false,
24
+ "rstrip": false,
25
+ "single_word": false,
26
+ "special": true
27
+ },
28
+ "32000": {
29
+ "content": "<extra_id_99>",
30
+ "lstrip": false,
31
+ "normalized": false,
32
+ "rstrip": false,
33
+ "single_word": false,
34
+ "special": true
35
+ },
36
+ "32001": {
37
+ "content": "<extra_id_98>",
38
+ "lstrip": false,
39
+ "normalized": false,
40
+ "rstrip": false,
41
+ "single_word": false,
42
+ "special": true
43
+ },
44
+ "32002": {
45
+ "content": "<extra_id_97>",
46
+ "lstrip": false,
47
+ "normalized": false,
48
+ "rstrip": false,
49
+ "single_word": false,
50
+ "special": true
51
+ },
52
+ "32003": {
53
+ "content": "<extra_id_96>",
54
+ "lstrip": false,
55
+ "normalized": false,
56
+ "rstrip": false,
57
+ "single_word": false,
58
+ "special": true
59
+ },
60
+ "32004": {
61
+ "content": "<extra_id_95>",
62
+ "lstrip": false,
63
+ "normalized": false,
64
+ "rstrip": false,
65
+ "single_word": false,
66
+ "special": true
67
+ },
68
+ "32005": {
69
+ "content": "<extra_id_94>",
70
+ "lstrip": false,
71
+ "normalized": false,
72
+ "rstrip": false,
73
+ "single_word": false,
74
+ "special": true
75
+ },
76
+ "32006": {
77
+ "content": "<extra_id_93>",
78
+ "lstrip": false,
79
+ "normalized": false,
80
+ "rstrip": false,
81
+ "single_word": false,
82
+ "special": true
83
+ },
84
+ "32007": {
85
+ "content": "<extra_id_92>",
86
+ "lstrip": false,
87
+ "normalized": false,
88
+ "rstrip": false,
89
+ "single_word": false,
90
+ "special": true
91
+ },
92
+ "32008": {
93
+ "content": "<extra_id_91>",
94
+ "lstrip": false,
95
+ "normalized": false,
96
+ "rstrip": false,
97
+ "single_word": false,
98
+ "special": true
99
+ },
100
+ "32009": {
101
+ "content": "<extra_id_90>",
102
+ "lstrip": false,
103
+ "normalized": false,
104
+ "rstrip": false,
105
+ "single_word": false,
106
+ "special": true
107
+ },
108
+ "32010": {
109
+ "content": "<extra_id_89>",
110
+ "lstrip": false,
111
+ "normalized": false,
112
+ "rstrip": false,
113
+ "single_word": false,
114
+ "special": true
115
+ },
116
+ "32011": {
117
+ "content": "<extra_id_88>",
118
+ "lstrip": false,
119
+ "normalized": false,
120
+ "rstrip": false,
121
+ "single_word": false,
122
+ "special": true
123
+ },
124
+ "32012": {
125
+ "content": "<extra_id_87>",
126
+ "lstrip": false,
127
+ "normalized": false,
128
+ "rstrip": false,
129
+ "single_word": false,
130
+ "special": true
131
+ },
132
+ "32013": {
133
+ "content": "<extra_id_86>",
134
+ "lstrip": false,
135
+ "normalized": false,
136
+ "rstrip": false,
137
+ "single_word": false,
138
+ "special": true
139
+ },
140
+ "32014": {
141
+ "content": "<extra_id_85>",
142
+ "lstrip": false,
143
+ "normalized": false,
144
+ "rstrip": false,
145
+ "single_word": false,
146
+ "special": true
147
+ },
148
+ "32015": {
149
+ "content": "<extra_id_84>",
150
+ "lstrip": false,
151
+ "normalized": false,
152
+ "rstrip": false,
153
+ "single_word": false,
154
+ "special": true
155
+ },
156
+ "32016": {
157
+ "content": "<extra_id_83>",
158
+ "lstrip": false,
159
+ "normalized": false,
160
+ "rstrip": false,
161
+ "single_word": false,
162
+ "special": true
163
+ },
164
+ "32017": {
165
+ "content": "<extra_id_82>",
166
+ "lstrip": false,
167
+ "normalized": false,
168
+ "rstrip": false,
169
+ "single_word": false,
170
+ "special": true
171
+ },
172
+ "32018": {
173
+ "content": "<extra_id_81>",
174
+ "lstrip": false,
175
+ "normalized": false,
176
+ "rstrip": false,
177
+ "single_word": false,
178
+ "special": true
179
+ },
180
+ "32019": {
181
+ "content": "<extra_id_80>",
182
+ "lstrip": false,
183
+ "normalized": false,
184
+ "rstrip": false,
185
+ "single_word": false,
186
+ "special": true
187
+ },
188
+ "32020": {
189
+ "content": "<extra_id_79>",
190
+ "lstrip": false,
191
+ "normalized": false,
192
+ "rstrip": false,
193
+ "single_word": false,
194
+ "special": true
195
+ },
196
+ "32021": {
197
+ "content": "<extra_id_78>",
198
+ "lstrip": false,
199
+ "normalized": false,
200
+ "rstrip": false,
201
+ "single_word": false,
202
+ "special": true
203
+ },
204
+ "32022": {
205
+ "content": "<extra_id_77>",
206
+ "lstrip": false,
207
+ "normalized": false,
208
+ "rstrip": false,
209
+ "single_word": false,
210
+ "special": true
211
+ },
212
+ "32023": {
213
+ "content": "<extra_id_76>",
214
+ "lstrip": false,
215
+ "normalized": false,
216
+ "rstrip": false,
217
+ "single_word": false,
218
+ "special": true
219
+ },
220
+ "32024": {
221
+ "content": "<extra_id_75>",
222
+ "lstrip": false,
223
+ "normalized": false,
224
+ "rstrip": false,
225
+ "single_word": false,
226
+ "special": true
227
+ },
228
+ "32025": {
229
+ "content": "<extra_id_74>",
230
+ "lstrip": false,
231
+ "normalized": false,
232
+ "rstrip": false,
233
+ "single_word": false,
234
+ "special": true
235
+ },
236
+ "32026": {
237
+ "content": "<extra_id_73>",
238
+ "lstrip": false,
239
+ "normalized": false,
240
+ "rstrip": false,
241
+ "single_word": false,
242
+ "special": true
243
+ },
244
+ "32027": {
245
+ "content": "<extra_id_72>",
246
+ "lstrip": false,
247
+ "normalized": false,
248
+ "rstrip": false,
249
+ "single_word": false,
250
+ "special": true
251
+ },
252
+ "32028": {
253
+ "content": "<extra_id_71>",
254
+ "lstrip": false,
255
+ "normalized": false,
256
+ "rstrip": false,
257
+ "single_word": false,
258
+ "special": true
259
+ },
260
+ "32029": {
261
+ "content": "<extra_id_70>",
262
+ "lstrip": false,
263
+ "normalized": false,
264
+ "rstrip": false,
265
+ "single_word": false,
266
+ "special": true
267
+ },
268
+ "32030": {
269
+ "content": "<extra_id_69>",
270
+ "lstrip": false,
271
+ "normalized": false,
272
+ "rstrip": false,
273
+ "single_word": false,
274
+ "special": true
275
+ },
276
+ "32031": {
277
+ "content": "<extra_id_68>",
278
+ "lstrip": false,
279
+ "normalized": false,
280
+ "rstrip": false,
281
+ "single_word": false,
282
+ "special": true
283
+ },
284
+ "32032": {
285
+ "content": "<extra_id_67>",
286
+ "lstrip": false,
287
+ "normalized": false,
288
+ "rstrip": false,
289
+ "single_word": false,
290
+ "special": true
291
+ },
292
+ "32033": {
293
+ "content": "<extra_id_66>",
294
+ "lstrip": false,
295
+ "normalized": false,
296
+ "rstrip": false,
297
+ "single_word": false,
298
+ "special": true
299
+ },
300
+ "32034": {
301
+ "content": "<extra_id_65>",
302
+ "lstrip": false,
303
+ "normalized": false,
304
+ "rstrip": false,
305
+ "single_word": false,
306
+ "special": true
307
+ },
308
+ "32035": {
309
+ "content": "<extra_id_64>",
310
+ "lstrip": false,
311
+ "normalized": false,
312
+ "rstrip": false,
313
+ "single_word": false,
314
+ "special": true
315
+ },
316
+ "32036": {
317
+ "content": "<extra_id_63>",
318
+ "lstrip": false,
319
+ "normalized": false,
320
+ "rstrip": false,
321
+ "single_word": false,
322
+ "special": true
323
+ },
324
+ "32037": {
325
+ "content": "<extra_id_62>",
326
+ "lstrip": false,
327
+ "normalized": false,
328
+ "rstrip": false,
329
+ "single_word": false,
330
+ "special": true
331
+ },
332
+ "32038": {
333
+ "content": "<extra_id_61>",
334
+ "lstrip": false,
335
+ "normalized": false,
336
+ "rstrip": false,
337
+ "single_word": false,
338
+ "special": true
339
+ },
340
+ "32039": {
341
+ "content": "<extra_id_60>",
342
+ "lstrip": false,
343
+ "normalized": false,
344
+ "rstrip": false,
345
+ "single_word": false,
346
+ "special": true
347
+ },
348
+ "32040": {
349
+ "content": "<extra_id_59>",
350
+ "lstrip": false,
351
+ "normalized": false,
352
+ "rstrip": false,
353
+ "single_word": false,
354
+ "special": true
355
+ },
356
+ "32041": {
357
+ "content": "<extra_id_58>",
358
+ "lstrip": false,
359
+ "normalized": false,
360
+ "rstrip": false,
361
+ "single_word": false,
362
+ "special": true
363
+ },
364
+ "32042": {
365
+ "content": "<extra_id_57>",
366
+ "lstrip": false,
367
+ "normalized": false,
368
+ "rstrip": false,
369
+ "single_word": false,
370
+ "special": true
371
+ },
372
+ "32043": {
373
+ "content": "<extra_id_56>",
374
+ "lstrip": false,
375
+ "normalized": false,
376
+ "rstrip": false,
377
+ "single_word": false,
378
+ "special": true
379
+ },
380
+ "32044": {
381
+ "content": "<extra_id_55>",
382
+ "lstrip": false,
383
+ "normalized": false,
384
+ "rstrip": false,
385
+ "single_word": false,
386
+ "special": true
387
+ },
388
+ "32045": {
389
+ "content": "<extra_id_54>",
390
+ "lstrip": false,
391
+ "normalized": false,
392
+ "rstrip": false,
393
+ "single_word": false,
394
+ "special": true
395
+ },
396
+ "32046": {
397
+ "content": "<extra_id_53>",
398
+ "lstrip": false,
399
+ "normalized": false,
400
+ "rstrip": false,
401
+ "single_word": false,
402
+ "special": true
403
+ },
404
+ "32047": {
405
+ "content": "<extra_id_52>",
406
+ "lstrip": false,
407
+ "normalized": false,
408
+ "rstrip": false,
409
+ "single_word": false,
410
+ "special": true
411
+ },
412
+ "32048": {
413
+ "content": "<extra_id_51>",
414
+ "lstrip": false,
415
+ "normalized": false,
416
+ "rstrip": false,
417
+ "single_word": false,
418
+ "special": true
419
+ },
420
+ "32049": {
421
+ "content": "<extra_id_50>",
422
+ "lstrip": false,
423
+ "normalized": false,
424
+ "rstrip": false,
425
+ "single_word": false,
426
+ "special": true
427
+ },
428
+ "32050": {
429
+ "content": "<extra_id_49>",
430
+ "lstrip": false,
431
+ "normalized": false,
432
+ "rstrip": false,
433
+ "single_word": false,
434
+ "special": true
435
+ },
436
+ "32051": {
437
+ "content": "<extra_id_48>",
438
+ "lstrip": false,
439
+ "normalized": false,
440
+ "rstrip": false,
441
+ "single_word": false,
442
+ "special": true
443
+ },
444
+ "32052": {
445
+ "content": "<extra_id_47>",
446
+ "lstrip": false,
447
+ "normalized": false,
448
+ "rstrip": false,
449
+ "single_word": false,
450
+ "special": true
451
+ },
452
+ "32053": {
453
+ "content": "<extra_id_46>",
454
+ "lstrip": false,
455
+ "normalized": false,
456
+ "rstrip": false,
457
+ "single_word": false,
458
+ "special": true
459
+ },
460
+ "32054": {
461
+ "content": "<extra_id_45>",
462
+ "lstrip": false,
463
+ "normalized": false,
464
+ "rstrip": false,
465
+ "single_word": false,
466
+ "special": true
467
+ },
468
+ "32055": {
469
+ "content": "<extra_id_44>",
470
+ "lstrip": false,
471
+ "normalized": false,
472
+ "rstrip": false,
473
+ "single_word": false,
474
+ "special": true
475
+ },
476
+ "32056": {
477
+ "content": "<extra_id_43>",
478
+ "lstrip": false,
479
+ "normalized": false,
480
+ "rstrip": false,
481
+ "single_word": false,
482
+ "special": true
483
+ },
484
+ "32057": {
485
+ "content": "<extra_id_42>",
486
+ "lstrip": false,
487
+ "normalized": false,
488
+ "rstrip": false,
489
+ "single_word": false,
490
+ "special": true
491
+ },
492
+ "32058": {
493
+ "content": "<extra_id_41>",
494
+ "lstrip": false,
495
+ "normalized": false,
496
+ "rstrip": false,
497
+ "single_word": false,
498
+ "special": true
499
+ },
500
+ "32059": {
501
+ "content": "<extra_id_40>",
502
+ "lstrip": false,
503
+ "normalized": false,
504
+ "rstrip": false,
505
+ "single_word": false,
506
+ "special": true
507
+ },
508
+ "32060": {
509
+ "content": "<extra_id_39>",
510
+ "lstrip": false,
511
+ "normalized": false,
512
+ "rstrip": false,
513
+ "single_word": false,
514
+ "special": true
515
+ },
516
+ "32061": {
517
+ "content": "<extra_id_38>",
518
+ "lstrip": false,
519
+ "normalized": false,
520
+ "rstrip": false,
521
+ "single_word": false,
522
+ "special": true
523
+ },
524
+ "32062": {
525
+ "content": "<extra_id_37>",
526
+ "lstrip": false,
527
+ "normalized": false,
528
+ "rstrip": false,
529
+ "single_word": false,
530
+ "special": true
531
+ },
532
+ "32063": {
533
+ "content": "<extra_id_36>",
534
+ "lstrip": false,
535
+ "normalized": false,
536
+ "rstrip": false,
537
+ "single_word": false,
538
+ "special": true
539
+ },
540
+ "32064": {
541
+ "content": "<extra_id_35>",
542
+ "lstrip": false,
543
+ "normalized": false,
544
+ "rstrip": false,
545
+ "single_word": false,
546
+ "special": true
547
+ },
548
+ "32065": {
549
+ "content": "<extra_id_34>",
550
+ "lstrip": false,
551
+ "normalized": false,
552
+ "rstrip": false,
553
+ "single_word": false,
554
+ "special": true
555
+ },
556
+ "32066": {
557
+ "content": "<extra_id_33>",
558
+ "lstrip": false,
559
+ "normalized": false,
560
+ "rstrip": false,
561
+ "single_word": false,
562
+ "special": true
563
+ },
564
+ "32067": {
565
+ "content": "<extra_id_32>",
566
+ "lstrip": false,
567
+ "normalized": false,
568
+ "rstrip": false,
569
+ "single_word": false,
570
+ "special": true
571
+ },
572
+ "32068": {
573
+ "content": "<extra_id_31>",
574
+ "lstrip": false,
575
+ "normalized": false,
576
+ "rstrip": false,
577
+ "single_word": false,
578
+ "special": true
579
+ },
580
+ "32069": {
581
+ "content": "<extra_id_30>",
582
+ "lstrip": false,
583
+ "normalized": false,
584
+ "rstrip": false,
585
+ "single_word": false,
586
+ "special": true
587
+ },
588
+ "32070": {
589
+ "content": "<extra_id_29>",
590
+ "lstrip": false,
591
+ "normalized": false,
592
+ "rstrip": false,
593
+ "single_word": false,
594
+ "special": true
595
+ },
596
+ "32071": {
597
+ "content": "<extra_id_28>",
598
+ "lstrip": false,
599
+ "normalized": false,
600
+ "rstrip": false,
601
+ "single_word": false,
602
+ "special": true
603
+ },
604
+ "32072": {
605
+ "content": "<extra_id_27>",
606
+ "lstrip": false,
607
+ "normalized": false,
608
+ "rstrip": false,
609
+ "single_word": false,
610
+ "special": true
611
+ },
612
+ "32073": {
613
+ "content": "<extra_id_26>",
614
+ "lstrip": false,
615
+ "normalized": false,
616
+ "rstrip": false,
617
+ "single_word": false,
618
+ "special": true
619
+ },
620
+ "32074": {
621
+ "content": "<extra_id_25>",
622
+ "lstrip": false,
623
+ "normalized": false,
624
+ "rstrip": false,
625
+ "single_word": false,
626
+ "special": true
627
+ },
628
+ "32075": {
629
+ "content": "<extra_id_24>",
630
+ "lstrip": false,
631
+ "normalized": false,
632
+ "rstrip": false,
633
+ "single_word": false,
634
+ "special": true
635
+ },
636
+ "32076": {
637
+ "content": "<extra_id_23>",
638
+ "lstrip": false,
639
+ "normalized": false,
640
+ "rstrip": false,
641
+ "single_word": false,
642
+ "special": true
643
+ },
644
+ "32077": {
645
+ "content": "<extra_id_22>",
646
+ "lstrip": false,
647
+ "normalized": false,
648
+ "rstrip": false,
649
+ "single_word": false,
650
+ "special": true
651
+ },
652
+ "32078": {
653
+ "content": "<extra_id_21>",
654
+ "lstrip": false,
655
+ "normalized": false,
656
+ "rstrip": false,
657
+ "single_word": false,
658
+ "special": true
659
+ },
660
+ "32079": {
661
+ "content": "<extra_id_20>",
662
+ "lstrip": false,
663
+ "normalized": false,
664
+ "rstrip": false,
665
+ "single_word": false,
666
+ "special": true
667
+ },
668
+ "32080": {
669
+ "content": "<extra_id_19>",
670
+ "lstrip": false,
671
+ "normalized": false,
672
+ "rstrip": false,
673
+ "single_word": false,
674
+ "special": true
675
+ },
676
+ "32081": {
677
+ "content": "<extra_id_18>",
678
+ "lstrip": false,
679
+ "normalized": false,
680
+ "rstrip": false,
681
+ "single_word": false,
682
+ "special": true
683
+ },
684
+ "32082": {
685
+ "content": "<extra_id_17>",
686
+ "lstrip": false,
687
+ "normalized": false,
688
+ "rstrip": false,
689
+ "single_word": false,
690
+ "special": true
691
+ },
692
+ "32083": {
693
+ "content": "<extra_id_16>",
694
+ "lstrip": false,
695
+ "normalized": false,
696
+ "rstrip": false,
697
+ "single_word": false,
698
+ "special": true
699
+ },
700
+ "32084": {
701
+ "content": "<extra_id_15>",
702
+ "lstrip": false,
703
+ "normalized": false,
704
+ "rstrip": false,
705
+ "single_word": false,
706
+ "special": true
707
+ },
708
+ "32085": {
709
+ "content": "<extra_id_14>",
710
+ "lstrip": false,
711
+ "normalized": false,
712
+ "rstrip": false,
713
+ "single_word": false,
714
+ "special": true
715
+ },
716
+ "32086": {
717
+ "content": "<extra_id_13>",
718
+ "lstrip": false,
719
+ "normalized": false,
720
+ "rstrip": false,
721
+ "single_word": false,
722
+ "special": true
723
+ },
724
+ "32087": {
725
+ "content": "<extra_id_12>",
726
+ "lstrip": false,
727
+ "normalized": false,
728
+ "rstrip": false,
729
+ "single_word": false,
730
+ "special": true
731
+ },
732
+ "32088": {
733
+ "content": "<extra_id_11>",
734
+ "lstrip": false,
735
+ "normalized": false,
736
+ "rstrip": false,
737
+ "single_word": false,
738
+ "special": true
739
+ },
740
+ "32089": {
741
+ "content": "<extra_id_10>",
742
+ "lstrip": false,
743
+ "normalized": false,
744
+ "rstrip": false,
745
+ "single_word": false,
746
+ "special": true
747
+ },
748
+ "32090": {
749
+ "content": "<extra_id_9>",
750
+ "lstrip": false,
751
+ "normalized": false,
752
+ "rstrip": false,
753
+ "single_word": false,
754
+ "special": true
755
+ },
756
+ "32091": {
757
+ "content": "<extra_id_8>",
758
+ "lstrip": false,
759
+ "normalized": false,
760
+ "rstrip": false,
761
+ "single_word": false,
762
+ "special": true
763
+ },
764
+ "32092": {
765
+ "content": "<extra_id_7>",
766
+ "lstrip": false,
767
+ "normalized": false,
768
+ "rstrip": false,
769
+ "single_word": false,
770
+ "special": true
771
+ },
772
+ "32093": {
773
+ "content": "<extra_id_6>",
774
+ "lstrip": false,
775
+ "normalized": false,
776
+ "rstrip": false,
777
+ "single_word": false,
778
+ "special": true
779
+ },
780
+ "32094": {
781
+ "content": "<extra_id_5>",
782
+ "lstrip": false,
783
+ "normalized": false,
784
+ "rstrip": false,
785
+ "single_word": false,
786
+ "special": true
787
+ },
788
+ "32095": {
789
+ "content": "<extra_id_4>",
790
+ "lstrip": false,
791
+ "normalized": false,
792
+ "rstrip": false,
793
+ "single_word": false,
794
+ "special": true
795
+ },
796
+ "32096": {
797
+ "content": "<extra_id_3>",
798
+ "lstrip": false,
799
+ "normalized": false,
800
+ "rstrip": false,
801
+ "single_word": false,
802
+ "special": true
803
+ },
804
+ "32097": {
805
+ "content": "<extra_id_2>",
806
+ "lstrip": false,
807
+ "normalized": false,
808
+ "rstrip": false,
809
+ "single_word": false,
810
+ "special": true
811
+ },
812
+ "32098": {
813
+ "content": "<extra_id_1>",
814
+ "lstrip": false,
815
+ "normalized": false,
816
+ "rstrip": false,
817
+ "single_word": false,
818
+ "special": true
819
+ },
820
+ "32099": {
821
+ "content": "<extra_id_0>",
822
+ "lstrip": false,
823
+ "normalized": false,
824
+ "rstrip": false,
825
+ "single_word": false,
826
+ "special": true
827
+ }
828
+ },
829
+ "additional_special_tokens": [
830
+ "<extra_id_0>",
831
+ "<extra_id_1>",
832
+ "<extra_id_2>",
833
+ "<extra_id_3>",
834
+ "<extra_id_4>",
835
+ "<extra_id_5>",
836
+ "<extra_id_6>",
837
+ "<extra_id_7>",
838
+ "<extra_id_8>",
839
+ "<extra_id_9>",
840
+ "<extra_id_10>",
841
+ "<extra_id_11>",
842
+ "<extra_id_12>",
843
+ "<extra_id_13>",
844
+ "<extra_id_14>",
845
+ "<extra_id_15>",
846
+ "<extra_id_16>",
847
+ "<extra_id_17>",
848
+ "<extra_id_18>",
849
+ "<extra_id_19>",
850
+ "<extra_id_20>",
851
+ "<extra_id_21>",
852
+ "<extra_id_22>",
853
+ "<extra_id_23>",
854
+ "<extra_id_24>",
855
+ "<extra_id_25>",
856
+ "<extra_id_26>",
857
+ "<extra_id_27>",
858
+ "<extra_id_28>",
859
+ "<extra_id_29>",
860
+ "<extra_id_30>",
861
+ "<extra_id_31>",
862
+ "<extra_id_32>",
863
+ "<extra_id_33>",
864
+ "<extra_id_34>",
865
+ "<extra_id_35>",
866
+ "<extra_id_36>",
867
+ "<extra_id_37>",
868
+ "<extra_id_38>",
869
+ "<extra_id_39>",
870
+ "<extra_id_40>",
871
+ "<extra_id_41>",
872
+ "<extra_id_42>",
873
+ "<extra_id_43>",
874
+ "<extra_id_44>",
875
+ "<extra_id_45>",
876
+ "<extra_id_46>",
877
+ "<extra_id_47>",
878
+ "<extra_id_48>",
879
+ "<extra_id_49>",
880
+ "<extra_id_50>",
881
+ "<extra_id_51>",
882
+ "<extra_id_52>",
883
+ "<extra_id_53>",
884
+ "<extra_id_54>",
885
+ "<extra_id_55>",
886
+ "<extra_id_56>",
887
+ "<extra_id_57>",
888
+ "<extra_id_58>",
889
+ "<extra_id_59>",
890
+ "<extra_id_60>",
891
+ "<extra_id_61>",
892
+ "<extra_id_62>",
893
+ "<extra_id_63>",
894
+ "<extra_id_64>",
895
+ "<extra_id_65>",
896
+ "<extra_id_66>",
897
+ "<extra_id_67>",
898
+ "<extra_id_68>",
899
+ "<extra_id_69>",
900
+ "<extra_id_70>",
901
+ "<extra_id_71>",
902
+ "<extra_id_72>",
903
+ "<extra_id_73>",
904
+ "<extra_id_74>",
905
+ "<extra_id_75>",
906
+ "<extra_id_76>",
907
+ "<extra_id_77>",
908
+ "<extra_id_78>",
909
+ "<extra_id_79>",
910
+ "<extra_id_80>",
911
+ "<extra_id_81>",
912
+ "<extra_id_82>",
913
+ "<extra_id_83>",
914
+ "<extra_id_84>",
915
+ "<extra_id_85>",
916
+ "<extra_id_86>",
917
+ "<extra_id_87>",
918
+ "<extra_id_88>",
919
+ "<extra_id_89>",
920
+ "<extra_id_90>",
921
+ "<extra_id_91>",
922
+ "<extra_id_92>",
923
+ "<extra_id_93>",
924
+ "<extra_id_94>",
925
+ "<extra_id_95>",
926
+ "<extra_id_96>",
927
+ "<extra_id_97>",
928
+ "<extra_id_98>",
929
+ "<extra_id_99>"
930
+ ],
931
+ "clean_up_tokenization_spaces": false,
932
+ "eos_token": "</s>",
933
+ "extra_ids": 100,
934
+ "extra_special_tokens": {},
935
+ "model_max_length": 1000000000000000019884624838656,
936
+ "pad_token": "<pad>",
937
+ "tokenizer_class": "T5Tokenizer",
938
+ "unk_token": "<unk>"
939
+ }
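One field in `tokenizer_config.json` worth noting: `model_max_length` is `transformers`' `int(1e30)` sentinel value, meaning no maximum input length was recorded for this tokenizer. A practical cap can be supplied at load time; the path below is a hypothetical local copy of this folder.

```python
from transformers import AutoTokenizer

# model_max_length above is the "no limit recorded" sentinel; an explicit value
# passed here overrides it for truncation and padding.
tokenizer = AutoTokenizer.from_pretrained("./", model_max_length=512)  # hypothetical local path
print(tokenizer.model_max_length)  # 512
```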
trainer_state.json ADDED
@@ -0,0 +1,1688 @@
1
+ {
2
+ "best_global_step": null,
3
+ "best_metric": null,
4
+ "best_model_checkpoint": null,
5
+ "epoch": 10.0,
6
+ "eval_steps": 500,
7
+ "global_step": 2260,
8
+ "is_hyper_param_search": false,
9
+ "is_local_process_zero": true,
10
+ "is_world_process_zero": true,
11
+ "log_history": [
12
+ {
13
+ "epoch": 0.04424778761061947,
14
+ "grad_norm": NaN,
15
+ "learning_rate": 0.0009973451327433627,
16
+ "loss": 14.6523,
17
+ "step": 10
18
+ },
19
+ {
20
+ "epoch": 0.08849557522123894,
21
+ "grad_norm": 0.9945968985557556,
22
+ "learning_rate": 0.000992920353982301,
23
+ "loss": 4.8948,
24
+ "step": 20
25
+ },
26
+ {
27
+ "epoch": 0.13274336283185842,
28
+ "grad_norm": 0.3495340645313263,
29
+ "learning_rate": 0.000988495575221239,
30
+ "loss": 0.6469,
31
+ "step": 30
32
+ },
33
+ {
34
+ "epoch": 0.17699115044247787,
35
+ "grad_norm": 0.22641977667808533,
36
+ "learning_rate": 0.000984070796460177,
37
+ "loss": 0.5221,
38
+ "step": 40
39
+ },
40
+ {
41
+ "epoch": 0.22123893805309736,
42
+ "grad_norm": 0.25233855843544006,
43
+ "learning_rate": 0.000979646017699115,
44
+ "loss": 0.4094,
45
+ "step": 50
46
+ },
47
+ {
48
+ "epoch": 0.26548672566371684,
49
+ "grad_norm": 0.37399861216545105,
50
+ "learning_rate": 0.0009752212389380531,
51
+ "loss": 0.3958,
52
+ "step": 60
53
+ },
54
+ {
55
+ "epoch": 0.30973451327433627,
56
+ "grad_norm": 0.18545609712600708,
57
+ "learning_rate": 0.0009707964601769911,
58
+ "loss": 0.3405,
59
+ "step": 70
60
+ },
61
+ {
62
+ "epoch": 0.35398230088495575,
63
+ "grad_norm": 0.2712928354740143,
64
+ "learning_rate": 0.0009663716814159293,
65
+ "loss": 0.3242,
66
+ "step": 80
67
+ },
68
+ {
69
+ "epoch": 0.39823008849557523,
70
+ "grad_norm": 0.2340475469827652,
71
+ "learning_rate": 0.0009619469026548673,
72
+ "loss": 0.3007,
73
+ "step": 90
74
+ },
75
+ {
76
+ "epoch": 0.4424778761061947,
77
+ "grad_norm": 0.18099136650562286,
78
+ "learning_rate": 0.0009575221238938053,
79
+ "loss": 0.2567,
80
+ "step": 100
81
+ },
82
+ {
83
+ "epoch": 0.48672566371681414,
84
+ "grad_norm": 0.23833367228507996,
85
+ "learning_rate": 0.0009530973451327434,
86
+ "loss": 0.2734,
87
+ "step": 110
88
+ },
89
+ {
90
+ "epoch": 0.5309734513274337,
91
+ "grad_norm": 0.20163732767105103,
92
+ "learning_rate": 0.0009486725663716814,
93
+ "loss": 0.2326,
94
+ "step": 120
95
+ },
96
+ {
97
+ "epoch": 0.5752212389380531,
98
+ "grad_norm": 0.1758851557970047,
99
+ "learning_rate": 0.0009442477876106195,
100
+ "loss": 0.2914,
101
+ "step": 130
102
+ },
103
+ {
104
+ "epoch": 0.6194690265486725,
105
+ "grad_norm": 0.211241215467453,
106
+ "learning_rate": 0.0009398230088495575,
107
+ "loss": 0.2667,
108
+ "step": 140
109
+ },
110
+ {
111
+ "epoch": 0.6637168141592921,
112
+ "grad_norm": 0.22571340203285217,
113
+ "learning_rate": 0.0009353982300884956,
114
+ "loss": 0.2268,
115
+ "step": 150
116
+ },
117
+ {
118
+ "epoch": 0.7079646017699115,
119
+ "grad_norm": 0.20469224452972412,
120
+ "learning_rate": 0.0009309734513274336,
121
+ "loss": 0.2386,
122
+ "step": 160
123
+ },
124
+ {
125
+ "epoch": 0.7522123893805309,
126
+ "grad_norm": 0.21183688938617706,
127
+ "learning_rate": 0.0009265486725663716,
128
+ "loss": 0.282,
129
+ "step": 170
130
+ },
131
+ {
132
+ "epoch": 0.7964601769911505,
133
+ "grad_norm": 0.17585916817188263,
134
+ "learning_rate": 0.0009221238938053097,
135
+ "loss": 0.3046,
136
+ "step": 180
137
+ },
138
+ {
139
+ "epoch": 0.8407079646017699,
140
+ "grad_norm": 0.17937427759170532,
141
+ "learning_rate": 0.0009176991150442479,
142
+ "loss": 0.2693,
143
+ "step": 190
144
+ },
145
+ {
146
+ "epoch": 0.8849557522123894,
147
+ "grad_norm": 0.19432350993156433,
148
+ "learning_rate": 0.0009132743362831859,
149
+ "loss": 0.252,
150
+ "step": 200
151
+ },
152
+ {
153
+ "epoch": 0.9292035398230089,
154
+ "grad_norm": 0.18185169994831085,
155
+ "learning_rate": 0.0009088495575221239,
156
+ "loss": 0.2793,
157
+ "step": 210
158
+ },
159
+ {
160
+ "epoch": 0.9734513274336283,
161
+ "grad_norm": 0.18515343964099884,
162
+ "learning_rate": 0.000904424778761062,
163
+ "loss": 0.2644,
164
+ "step": 220
165
+ },
166
+ {
167
+ "epoch": 1.0,
168
+ "eval_loss": 0.22543948888778687,
169
+ "eval_runtime": 3.0243,
170
+ "eval_samples_per_second": 33.066,
171
+ "eval_steps_per_second": 8.266,
172
+ "step": 226
173
+ },
174
+ {
175
+ "epoch": 1.0176991150442478,
176
+ "grad_norm": 0.2031005322933197,
177
+ "learning_rate": 0.0009000000000000001,
178
+ "loss": 0.2704,
179
+ "step": 230
180
+ },
181
+ {
182
+ "epoch": 1.0619469026548674,
183
+ "grad_norm": 0.26087555289268494,
184
+ "learning_rate": 0.0008955752212389381,
185
+ "loss": 0.2526,
186
+ "step": 240
187
+ },
188
+ {
189
+ "epoch": 1.1061946902654867,
190
+ "grad_norm": 0.1796620637178421,
191
+ "learning_rate": 0.0008911504424778761,
192
+ "loss": 0.2605,
193
+ "step": 250
194
+ },
195
+ {
196
+ "epoch": 1.1504424778761062,
197
+ "grad_norm": 0.22667303681373596,
198
+ "learning_rate": 0.0008867256637168141,
199
+ "loss": 0.261,
200
+ "step": 260
201
+ },
202
+ {
203
+ "epoch": 1.1946902654867257,
204
+ "grad_norm": 0.22089733183383942,
205
+ "learning_rate": 0.0008823008849557523,
206
+ "loss": 0.2762,
207
+ "step": 270
208
+ },
209
+ {
210
+ "epoch": 1.238938053097345,
211
+ "grad_norm": 0.19162122905254364,
212
+ "learning_rate": 0.0008778761061946903,
213
+ "loss": 0.2325,
214
+ "step": 280
215
+ },
216
+ {
217
+ "epoch": 1.2831858407079646,
218
+ "grad_norm": 0.1732087880373001,
219
+ "learning_rate": 0.0008734513274336283,
220
+ "loss": 0.2455,
221
+ "step": 290
222
+ },
223
+ {
224
+ "epoch": 1.3274336283185841,
225
+ "grad_norm": 0.15953731536865234,
226
+ "learning_rate": 0.0008690265486725663,
227
+ "loss": 0.2155,
228
+ "step": 300
229
+ },
230
+ {
231
+ "epoch": 1.3716814159292037,
232
+ "grad_norm": 0.229411318898201,
233
+ "learning_rate": 0.0008646017699115044,
234
+ "loss": 0.2289,
235
+ "step": 310
236
+ },
237
+ {
238
+ "epoch": 1.415929203539823,
239
+ "grad_norm": 0.20390523970127106,
240
+ "learning_rate": 0.0008601769911504425,
241
+ "loss": 0.2429,
242
+ "step": 320
243
+ },
244
+ {
245
+ "epoch": 1.4601769911504425,
246
+ "grad_norm": 0.23142680525779724,
247
+ "learning_rate": 0.0008557522123893805,
248
+ "loss": 0.2291,
249
+ "step": 330
250
+ },
251
+ {
252
+ "epoch": 1.504424778761062,
253
+ "grad_norm": 0.22689059376716614,
254
+ "learning_rate": 0.0008513274336283185,
255
+ "loss": 0.2369,
256
+ "step": 340
257
+ },
258
+ {
259
+ "epoch": 1.5486725663716814,
260
+ "grad_norm": 0.18759772181510925,
261
+ "learning_rate": 0.0008469026548672567,
262
+ "loss": 0.1887,
263
+ "step": 350
264
+ },
265
+ {
266
+ "epoch": 1.592920353982301,
267
+ "grad_norm": 0.17289893329143524,
268
+ "learning_rate": 0.0008424778761061948,
269
+ "loss": 0.2547,
270
+ "step": 360
271
+ },
272
+ {
273
+ "epoch": 1.6371681415929205,
274
+ "grad_norm": 0.20804202556610107,
275
+ "learning_rate": 0.0008380530973451328,
276
+ "loss": 0.2446,
277
+ "step": 370
278
+ },
279
+ {
280
+ "epoch": 1.6814159292035398,
281
+ "grad_norm": 0.2161918580532074,
282
+ "learning_rate": 0.0008336283185840708,
283
+ "loss": 0.2262,
284
+ "step": 380
285
+ },
286
+ {
287
+ "epoch": 1.7256637168141593,
288
+ "grad_norm": 0.27487823367118835,
289
+ "learning_rate": 0.0008292035398230089,
290
+ "loss": 0.2673,
291
+ "step": 390
292
+ },
293
+ {
294
+ "epoch": 1.7699115044247788,
295
+ "grad_norm": 0.20181554555892944,
296
+ "learning_rate": 0.0008247787610619469,
297
+ "loss": 0.252,
298
+ "step": 400
299
+ },
300
+ {
301
+ "epoch": 1.8141592920353982,
302
+ "grad_norm": 0.21222522854804993,
303
+ "learning_rate": 0.000820353982300885,
304
+ "loss": 0.23,
305
+ "step": 410
306
+ },
307
+ {
308
+ "epoch": 1.8584070796460177,
309
+ "grad_norm": 0.21409285068511963,
310
+ "learning_rate": 0.000815929203539823,
311
+ "loss": 0.235,
312
+ "step": 420
313
+ },
314
+ {
315
+ "epoch": 1.9026548672566372,
316
+ "grad_norm": 0.2830056846141815,
317
+ "learning_rate": 0.0008115044247787611,
318
+ "loss": 0.2335,
319
+ "step": 430
320
+ },
321
+ {
322
+ "epoch": 1.9469026548672566,
323
+ "grad_norm": 0.22915257513523102,
324
+ "learning_rate": 0.0008070796460176991,
325
+ "loss": 0.2303,
326
+ "step": 440
327
+ },
328
+ {
329
+ "epoch": 1.991150442477876,
330
+ "grad_norm": 0.19883762300014496,
331
+ "learning_rate": 0.0008026548672566371,
332
+ "loss": 0.2222,
333
+ "step": 450
334
+ },
335
+ {
336
+ "epoch": 2.0,
337
+ "eval_loss": 0.21643634140491486,
338
+ "eval_runtime": 2.7454,
339
+ "eval_samples_per_second": 36.424,
340
+ "eval_steps_per_second": 9.106,
341
+ "step": 452
342
+ },
343
+ {
344
+ "epoch": 2.0353982300884956,
345
+ "grad_norm": 0.2121458351612091,
346
+ "learning_rate": 0.0007982300884955752,
347
+ "loss": 0.2403,
348
+ "step": 460
349
+ },
350
+ {
351
+ "epoch": 2.079646017699115,
352
+ "grad_norm": 0.17018261551856995,
353
+ "learning_rate": 0.0007938053097345133,
354
+ "loss": 0.213,
355
+ "step": 470
356
+ },
357
+ {
358
+ "epoch": 2.1238938053097347,
359
+ "grad_norm": 0.22500459849834442,
360
+ "learning_rate": 0.0007893805309734513,
361
+ "loss": 0.2239,
362
+ "step": 480
363
+ },
364
+ {
365
+ "epoch": 2.168141592920354,
366
+ "grad_norm": 0.19334179162979126,
367
+ "learning_rate": 0.0007849557522123893,
368
+ "loss": 0.2106,
369
+ "step": 490
370
+ },
371
+ {
372
+ "epoch": 2.2123893805309733,
373
+ "grad_norm": 0.1906515508890152,
374
+ "learning_rate": 0.0007805309734513274,
375
+ "loss": 0.2037,
376
+ "step": 500
377
+ },
378
+ {
379
+ "epoch": 2.256637168141593,
380
+ "grad_norm": 0.2478450983762741,
381
+ "learning_rate": 0.0007761061946902656,
382
+ "loss": 0.2164,
383
+ "step": 510
384
+ },
385
+ {
386
+ "epoch": 2.3008849557522124,
387
+ "grad_norm": 0.2270224243402481,
388
+ "learning_rate": 0.0007716814159292036,
389
+ "loss": 0.2253,
390
+ "step": 520
391
+ },
392
+ {
393
+ "epoch": 2.3451327433628317,
394
+ "grad_norm": 0.2539624273777008,
395
+ "learning_rate": 0.0007672566371681416,
396
+ "loss": 0.2016,
397
+ "step": 530
398
+ },
399
+ {
400
+ "epoch": 2.3893805309734515,
401
+ "grad_norm": 0.33118170499801636,
402
+ "learning_rate": 0.0007628318584070797,
403
+ "loss": 0.2239,
404
+ "step": 540
405
+ },
406
+ {
407
+ "epoch": 2.433628318584071,
408
+ "grad_norm": 0.24022382497787476,
409
+ "learning_rate": 0.0007584070796460178,
410
+ "loss": 0.2339,
411
+ "step": 550
412
+ },
413
+ {
414
+ "epoch": 2.47787610619469,
415
+ "grad_norm": 0.22129379212856293,
416
+ "learning_rate": 0.0007539823008849558,
417
+ "loss": 0.2079,
418
+ "step": 560
419
+ },
420
+ {
421
+ "epoch": 2.52212389380531,
422
+ "grad_norm": 0.20302246510982513,
423
+ "learning_rate": 0.0007495575221238938,
424
+ "loss": 0.2012,
425
+ "step": 570
426
+ },
427
+ {
428
+ "epoch": 2.566371681415929,
429
+ "grad_norm": 0.28677117824554443,
430
+ "learning_rate": 0.0007451327433628319,
431
+ "loss": 0.2281,
432
+ "step": 580
433
+ },
434
+ {
435
+ "epoch": 2.6106194690265485,
436
+ "grad_norm": 0.2567579746246338,
437
+ "learning_rate": 0.0007407079646017699,
438
+ "loss": 0.2374,
439
+ "step": 590
440
+ },
441
+ {
442
+ "epoch": 2.6548672566371683,
443
+ "grad_norm": 0.2306365817785263,
444
+ "learning_rate": 0.000736283185840708,
445
+ "loss": 0.2144,
446
+ "step": 600
447
+ },
448
+ {
449
+ "epoch": 2.6991150442477876,
450
+ "grad_norm": 0.23293821513652802,
451
+ "learning_rate": 0.000731858407079646,
452
+ "loss": 0.2381,
453
+ "step": 610
454
+ },
455
+ {
456
+ "epoch": 2.7433628318584073,
457
+ "grad_norm": 0.2173946499824524,
458
+ "learning_rate": 0.0007274336283185841,
459
+ "loss": 0.2155,
460
+ "step": 620
461
+ },
462
+ {
463
+ "epoch": 2.7876106194690267,
464
+ "grad_norm": 0.30976563692092896,
465
+ "learning_rate": 0.0007230088495575221,
466
+ "loss": 0.2262,
467
+ "step": 630
468
+ },
469
+ {
470
+ "epoch": 2.831858407079646,
471
+ "grad_norm": 0.19489358365535736,
472
+ "learning_rate": 0.0007185840707964601,
473
+ "loss": 0.2194,
474
+ "step": 640
475
+ },
476
+ {
477
+ "epoch": 2.8761061946902657,
478
+ "grad_norm": 0.21821223199367523,
479
+ "learning_rate": 0.0007141592920353982,
480
+ "loss": 0.1967,
481
+ "step": 650
482
+ },
483
+ {
484
+ "epoch": 2.920353982300885,
485
+ "grad_norm": 0.23535631597042084,
486
+ "learning_rate": 0.0007097345132743363,
487
+ "loss": 0.2353,
488
+ "step": 660
489
+ },
490
+ {
491
+ "epoch": 2.9646017699115044,
492
+ "grad_norm": 0.20547734200954437,
493
+ "learning_rate": 0.0007053097345132744,
494
+ "loss": 0.2119,
495
+ "step": 670
496
+ },
497
+ {
498
+ "epoch": 3.0,
499
+ "eval_loss": 0.21383462846279144,
500
+ "eval_runtime": 2.6363,
501
+ "eval_samples_per_second": 37.932,
502
+ "eval_steps_per_second": 9.483,
503
+ "step": 678
504
+ },
505
+ {
506
+ "epoch": 3.0088495575221237,
507
+ "grad_norm": 0.21669970452785492,
508
+ "learning_rate": 0.0007008849557522124,
509
+ "loss": 0.2198,
510
+ "step": 680
511
+ },
512
+ {
513
+ "epoch": 3.0530973451327434,
514
+ "grad_norm": 0.20589256286621094,
515
+ "learning_rate": 0.0006964601769911505,
516
+ "loss": 0.2002,
517
+ "step": 690
518
+ },
519
+ {
520
+ "epoch": 3.0973451327433628,
521
+ "grad_norm": 0.23902471363544464,
522
+ "learning_rate": 0.0006920353982300886,
523
+ "loss": 0.1804,
524
+ "step": 700
525
+ },
526
+ {
527
+ "epoch": 3.1415929203539825,
528
+ "grad_norm": 0.2881176173686981,
529
+ "learning_rate": 0.0006876106194690266,
530
+ "loss": 0.2162,
531
+ "step": 710
532
+ },
533
+ {
534
+ "epoch": 3.185840707964602,
535
+ "grad_norm": 0.22364391386508942,
536
+ "learning_rate": 0.0006831858407079646,
537
+ "loss": 0.2185,
538
+ "step": 720
539
+ },
540
+ {
541
+ "epoch": 3.230088495575221,
542
+ "grad_norm": 0.23607216775417328,
543
+ "learning_rate": 0.0006787610619469026,
544
+ "loss": 0.2124,
545
+ "step": 730
546
+ },
547
+ {
548
+ "epoch": 3.274336283185841,
549
+ "grad_norm": 0.18838390707969666,
550
+ "learning_rate": 0.0006743362831858408,
551
+ "loss": 0.179,
552
+ "step": 740
553
+ },
554
+ {
555
+ "epoch": 3.3185840707964602,
556
+ "grad_norm": 0.3451661765575409,
557
+ "learning_rate": 0.0006699115044247788,
558
+ "loss": 0.2135,
559
+ "step": 750
560
+ },
561
+ {
562
+ "epoch": 3.3628318584070795,
563
+ "grad_norm": 0.2281007319688797,
564
+ "learning_rate": 0.0006654867256637168,
565
+ "loss": 0.2071,
566
+ "step": 760
567
+ },
568
+ {
569
+ "epoch": 3.4070796460176993,
570
+ "grad_norm": 0.20740865170955658,
571
+ "learning_rate": 0.0006610619469026548,
572
+ "loss": 0.2081,
573
+ "step": 770
574
+ },
575
+ {
576
+ "epoch": 3.4513274336283186,
577
+ "grad_norm": 0.27458012104034424,
578
+ "learning_rate": 0.0006566371681415929,
579
+ "loss": 0.2026,
580
+ "step": 780
581
+ },
582
+ {
583
+ "epoch": 3.495575221238938,
584
+ "grad_norm": 0.19083356857299805,
585
+ "learning_rate": 0.000652212389380531,
586
+ "loss": 0.1946,
587
+ "step": 790
588
+ },
589
+ {
590
+ "epoch": 3.5398230088495577,
591
+ "grad_norm": 0.2667248845100403,
592
+ "learning_rate": 0.000647787610619469,
593
+ "loss": 0.2141,
594
+ "step": 800
595
+ },
596
+ {
597
+ "epoch": 3.584070796460177,
598
+ "grad_norm": 0.22773493826389313,
599
+ "learning_rate": 0.000643362831858407,
600
+ "loss": 0.2294,
601
+ "step": 810
602
+ },
603
+ {
604
+ "epoch": 3.6283185840707963,
605
+ "grad_norm": 0.24344410002231598,
606
+ "learning_rate": 0.0006389380530973451,
607
+ "loss": 0.1799,
608
+ "step": 820
609
+ },
610
+ {
611
+ "epoch": 3.672566371681416,
612
+ "grad_norm": 0.3232133984565735,
613
+ "learning_rate": 0.0006345132743362833,
614
+ "loss": 0.1807,
615
+ "step": 830
616
+ },
617
+ {
618
+ "epoch": 3.7168141592920354,
619
+ "grad_norm": 0.22465798258781433,
620
+ "learning_rate": 0.0006300884955752213,
621
+ "loss": 0.2005,
622
+ "step": 840
623
+ },
624
+ {
625
+ "epoch": 3.7610619469026547,
626
+ "grad_norm": 0.24152274429798126,
627
+ "learning_rate": 0.0006256637168141594,
628
+ "loss": 0.2001,
629
+ "step": 850
630
+ },
631
+ {
632
+ "epoch": 3.8053097345132745,
633
+ "grad_norm": 0.2764975130558014,
634
+ "learning_rate": 0.0006212389380530974,
635
+ "loss": 0.1691,
636
+ "step": 860
637
+ },
638
+ {
639
+ "epoch": 3.849557522123894,
640
+ "grad_norm": 0.23789626359939575,
641
+ "learning_rate": 0.0006168141592920354,
642
+ "loss": 0.2318,
643
+ "step": 870
644
+ },
645
+ {
646
+ "epoch": 3.893805309734513,
647
+ "grad_norm": 0.21235798299312592,
648
+ "learning_rate": 0.0006123893805309735,
649
+ "loss": 0.1867,
650
+ "step": 880
651
+ },
652
+ {
653
+ "epoch": 3.938053097345133,
654
+ "grad_norm": 0.23083995282649994,
655
+ "learning_rate": 0.0006079646017699116,
656
+ "loss": 0.2135,
657
+ "step": 890
658
+ },
659
+ {
660
+ "epoch": 3.982300884955752,
661
+ "grad_norm": 0.22863389551639557,
662
+ "learning_rate": 0.0006035398230088496,
663
+ "loss": 0.2188,
664
+ "step": 900
665
+ },
666
+ {
667
+ "epoch": 4.0,
668
+ "eval_loss": 0.20991046726703644,
669
+ "eval_runtime": 2.9553,
670
+ "eval_samples_per_second": 33.837,
671
+ "eval_steps_per_second": 8.459,
672
+ "step": 904
673
+ },
674
+ {
675
+ "epoch": 4.0265486725663715,
676
+ "grad_norm": 0.22170217335224152,
677
+ "learning_rate": 0.0005991150442477876,
678
+ "loss": 0.2186,
679
+ "step": 910
680
+ },
681
+ {
682
+ "epoch": 4.070796460176991,
683
+ "grad_norm": 0.2190970778465271,
684
+ "learning_rate": 0.0005946902654867256,
685
+ "loss": 0.1978,
686
+ "step": 920
687
+ },
688
+ {
689
+ "epoch": 4.115044247787611,
690
+ "grad_norm": 0.1924510896205902,
691
+ "learning_rate": 0.0005902654867256638,
692
+ "loss": 0.1787,
693
+ "step": 930
694
+ },
695
+ {
696
+ "epoch": 4.15929203539823,
697
+ "grad_norm": 0.2868868112564087,
698
+ "learning_rate": 0.0005858407079646018,
699
+ "loss": 0.172,
700
+ "step": 940
701
+ },
702
+ {
703
+ "epoch": 4.20353982300885,
704
+ "grad_norm": 0.18888860940933228,
705
+ "learning_rate": 0.0005814159292035398,
706
+ "loss": 0.1761,
707
+ "step": 950
708
+ },
709
+ {
710
+ "epoch": 4.247787610619469,
711
+ "grad_norm": 0.21858586370944977,
712
+ "learning_rate": 0.0005769911504424778,
713
+ "loss": 0.1871,
714
+ "step": 960
715
+ },
716
+ {
717
+ "epoch": 4.292035398230088,
718
+ "grad_norm": 0.305698961019516,
719
+ "learning_rate": 0.0005725663716814159,
720
+ "loss": 0.1886,
721
+ "step": 970
722
+ },
723
+ {
724
+ "epoch": 4.336283185840708,
725
+ "grad_norm": 0.23597249388694763,
726
+ "learning_rate": 0.000568141592920354,
727
+ "loss": 0.1865,
728
+ "step": 980
729
+ },
730
+ {
731
+ "epoch": 4.380530973451328,
732
+ "grad_norm": 0.271823912858963,
733
+ "learning_rate": 0.0005637168141592921,
734
+ "loss": 0.1709,
735
+ "step": 990
736
+ },
737
+ {
738
+ "epoch": 4.424778761061947,
739
+ "grad_norm": 0.19630669057369232,
740
+ "learning_rate": 0.0005592920353982301,
741
+ "loss": 0.2429,
742
+ "step": 1000
743
+ },
744
+ {
745
+ "epoch": 4.469026548672566,
746
+ "grad_norm": 0.29825878143310547,
747
+ "learning_rate": 0.0005548672566371682,
748
+ "loss": 0.1879,
749
+ "step": 1010
750
+ },
751
+ {
752
+ "epoch": 4.513274336283186,
753
+ "grad_norm": 0.21552462875843048,
754
+ "learning_rate": 0.0005504424778761063,
755
+ "loss": 0.1905,
756
+ "step": 1020
757
+ },
758
+ {
759
+ "epoch": 4.557522123893805,
760
+ "grad_norm": 0.28668805956840515,
761
+ "learning_rate": 0.0005460176991150443,
762
+ "loss": 0.1951,
763
+ "step": 1030
764
+ },
765
+ {
766
+ "epoch": 4.601769911504425,
767
+ "grad_norm": 0.27180853486061096,
768
+ "learning_rate": 0.0005415929203539823,
769
+ "loss": 0.1758,
770
+ "step": 1040
771
+ },
772
+ {
773
+ "epoch": 4.646017699115045,
774
+ "grad_norm": 0.3072490394115448,
775
+ "learning_rate": 0.0005371681415929204,
776
+ "loss": 0.1852,
777
+ "step": 1050
778
+ },
779
+ {
780
+ "epoch": 4.6902654867256635,
781
+ "grad_norm": 0.2913398742675781,
782
+ "learning_rate": 0.0005327433628318584,
783
+ "loss": 0.201,
784
+ "step": 1060
785
+ },
786
+ {
787
+ "epoch": 4.734513274336283,
788
+ "grad_norm": 0.29055866599082947,
789
+ "learning_rate": 0.0005283185840707965,
790
+ "loss": 0.1932,
791
+ "step": 1070
792
+ },
793
+ {
794
+ "epoch": 4.778761061946903,
795
+ "grad_norm": 0.2742849290370941,
796
+ "learning_rate": 0.0005238938053097345,
797
+ "loss": 0.183,
798
+ "step": 1080
799
+ },
800
+ {
801
+ "epoch": 4.823008849557522,
802
+ "grad_norm": 0.2370535433292389,
803
+ "learning_rate": 0.0005194690265486726,
804
+ "loss": 0.1849,
805
+ "step": 1090
806
+ },
807
+ {
808
+ "epoch": 4.867256637168142,
809
+ "grad_norm": 0.31343671679496765,
810
+ "learning_rate": 0.0005150442477876106,
811
+ "loss": 0.2195,
812
+ "step": 1100
813
+ },
814
+ {
815
+ "epoch": 4.911504424778761,
816
+ "grad_norm": 0.3136596381664276,
817
+ "learning_rate": 0.0005106194690265486,
818
+ "loss": 0.1907,
819
+ "step": 1110
820
+ },
821
+ {
822
+ "epoch": 4.95575221238938,
823
+ "grad_norm": 0.2071835845708847,
824
+ "learning_rate": 0.0005061946902654867,
825
+ "loss": 0.1969,
826
+ "step": 1120
827
+ },
828
+ {
829
+ "epoch": 5.0,
830
+ "grad_norm": 0.25057336688041687,
831
+ "learning_rate": 0.0005017699115044248,
832
+ "loss": 0.1916,
833
+ "step": 1130
834
+ },
835
+ {
836
+ "epoch": 5.0,
837
+ "eval_loss": 0.21029528975486755,
838
+ "eval_runtime": 2.628,
839
+ "eval_samples_per_second": 38.052,
840
+ "eval_steps_per_second": 9.513,
841
+ "step": 1130
842
+ },
843
+ {
844
+ "epoch": 5.04424778761062,
845
+ "grad_norm": 0.21927224099636078,
846
+ "learning_rate": 0.0004973451327433628,
847
+ "loss": 0.155,
848
+ "step": 1140
849
+ },
850
+ {
851
+ "epoch": 5.088495575221239,
852
+ "grad_norm": 0.3175056576728821,
853
+ "learning_rate": 0.0004929203539823009,
854
+ "loss": 0.189,
855
+ "step": 1150
856
+ },
857
+ {
858
+ "epoch": 5.132743362831858,
859
+ "grad_norm": 0.2786344587802887,
860
+ "learning_rate": 0.0004884955752212389,
861
+ "loss": 0.1679,
862
+ "step": 1160
863
+ },
864
+ {
865
+ "epoch": 5.176991150442478,
866
+ "grad_norm": 0.2475520521402359,
867
+ "learning_rate": 0.00048407079646017696,
868
+ "loss": 0.1855,
869
+ "step": 1170
870
+ },
871
+ {
872
+ "epoch": 5.221238938053097,
873
+ "grad_norm": 0.24603202939033508,
874
+ "learning_rate": 0.00047964601769911504,
875
+ "loss": 0.1755,
876
+ "step": 1180
877
+ },
878
+ {
879
+ "epoch": 5.265486725663717,
880
+ "grad_norm": 0.26339662075042725,
881
+ "learning_rate": 0.00047522123893805305,
882
+ "loss": 0.1644,
883
+ "step": 1190
884
+ },
885
+ {
886
+ "epoch": 5.3097345132743365,
887
+ "grad_norm": 0.20065292716026306,
888
+ "learning_rate": 0.0004707964601769912,
889
+ "loss": 0.1555,
890
+ "step": 1200
891
+ },
892
+ {
893
+ "epoch": 5.353982300884955,
894
+ "grad_norm": 0.34847521781921387,
895
+ "learning_rate": 0.00046637168141592925,
896
+ "loss": 0.1644,
897
+ "step": 1210
898
+ },
899
+ {
900
+ "epoch": 5.398230088495575,
901
+ "grad_norm": 0.41893231868743896,
902
+ "learning_rate": 0.00046194690265486727,
903
+ "loss": 0.1661,
904
+ "step": 1220
905
+ },
906
+ {
907
+ "epoch": 5.442477876106195,
908
+ "grad_norm": 0.2889445424079895,
909
+ "learning_rate": 0.00045752212389380535,
910
+ "loss": 0.1924,
911
+ "step": 1230
912
+ },
913
+ {
914
+ "epoch": 5.486725663716814,
915
+ "grad_norm": 0.24809350073337555,
916
+ "learning_rate": 0.00045309734513274336,
917
+ "loss": 0.1941,
918
+ "step": 1240
919
+ },
920
+ {
921
+ "epoch": 5.530973451327434,
922
+ "grad_norm": 0.27125945687294006,
923
+ "learning_rate": 0.00044867256637168144,
924
+ "loss": 0.1731,
925
+ "step": 1250
926
+ },
927
+ {
928
+ "epoch": 5.575221238938053,
929
+ "grad_norm": 0.3384355902671814,
930
+ "learning_rate": 0.00044424778761061946,
931
+ "loss": 0.164,
932
+ "step": 1260
933
+ },
934
+ {
935
+ "epoch": 5.619469026548672,
936
+ "grad_norm": 0.3089454174041748,
937
+ "learning_rate": 0.00043982300884955753,
938
+ "loss": 0.1823,
939
+ "step": 1270
940
+ },
941
+ {
942
+ "epoch": 5.663716814159292,
943
+ "grad_norm": 0.26540765166282654,
944
+ "learning_rate": 0.0004353982300884956,
945
+ "loss": 0.1762,
946
+ "step": 1280
947
+ },
948
+ {
949
+ "epoch": 5.707964601769912,
950
+ "grad_norm": 0.22383682429790497,
951
+ "learning_rate": 0.0004309734513274337,
952
+ "loss": 0.2063,
953
+ "step": 1290
954
+ },
955
+ {
956
+ "epoch": 5.752212389380531,
957
+ "grad_norm": 0.24541282653808594,
958
+ "learning_rate": 0.0004265486725663717,
959
+ "loss": 0.1799,
960
+ "step": 1300
961
+ },
962
+ {
963
+ "epoch": 5.79646017699115,
964
+ "grad_norm": 0.33302921056747437,
965
+ "learning_rate": 0.00042212389380530976,
966
+ "loss": 0.1749,
967
+ "step": 1310
968
+ },
969
+ {
970
+ "epoch": 5.84070796460177,
971
+ "grad_norm": 0.274087131023407,
972
+ "learning_rate": 0.0004176991150442478,
973
+ "loss": 0.1982,
974
+ "step": 1320
975
+ },
976
+ {
977
+ "epoch": 5.88495575221239,
978
+ "grad_norm": 0.3344975411891937,
979
+ "learning_rate": 0.00041327433628318586,
980
+ "loss": 0.1962,
981
+ "step": 1330
982
+ },
983
+ {
984
+ "epoch": 5.929203539823009,
985
+ "grad_norm": 0.28589603304862976,
986
+ "learning_rate": 0.0004088495575221239,
987
+ "loss": 0.2078,
988
+ "step": 1340
989
+ },
990
+ {
991
+ "epoch": 5.9734513274336285,
992
+ "grad_norm": 0.18417391180992126,
993
+ "learning_rate": 0.00040442477876106195,
994
+ "loss": 0.1806,
995
+ "step": 1350
996
+ },
997
+ {
998
+ "epoch": 6.0,
999
+ "eval_loss": 0.20804466307163239,
1000
+ "eval_runtime": 2.6659,
1001
+ "eval_samples_per_second": 37.511,
1002
+ "eval_steps_per_second": 9.378,
1003
+ "step": 1356
1004
+ },
1005
+ {
1006
+ "epoch": 6.017699115044247,
1007
+ "grad_norm": 0.24382148683071136,
1008
+ "learning_rate": 0.0004,
1009
+ "loss": 0.1675,
1010
+ "step": 1360
1011
+ },
1012
+ {
1013
+ "epoch": 6.061946902654867,
1014
+ "grad_norm": 0.2718934714794159,
1015
+ "learning_rate": 0.0003955752212389381,
1016
+ "loss": 0.1546,
1017
+ "step": 1370
1018
+ },
1019
+ {
1020
+ "epoch": 6.106194690265487,
1021
+ "grad_norm": 0.321180135011673,
1022
+ "learning_rate": 0.0003911504424778761,
1023
+ "loss": 0.1828,
1024
+ "step": 1380
1025
+ },
1026
+ {
1027
+ "epoch": 6.150442477876107,
1028
+ "grad_norm": 0.31438615918159485,
1029
+ "learning_rate": 0.0003867256637168142,
1030
+ "loss": 0.1793,
1031
+ "step": 1390
1032
+ },
1033
+ {
1034
+ "epoch": 6.1946902654867255,
1035
+ "grad_norm": 0.24199295043945312,
1036
+ "learning_rate": 0.0003823008849557522,
1037
+ "loss": 0.1627,
1038
+ "step": 1400
1039
+ },
1040
+ {
1041
+ "epoch": 6.238938053097345,
1042
+ "grad_norm": 0.3219399154186249,
1043
+ "learning_rate": 0.0003778761061946903,
1044
+ "loss": 0.1557,
1045
+ "step": 1410
1046
+ },
1047
+ {
1048
+ "epoch": 6.283185840707965,
1049
+ "grad_norm": 0.20730754733085632,
1050
+ "learning_rate": 0.0003734513274336283,
1051
+ "loss": 0.1728,
1052
+ "step": 1420
1053
+ },
1054
+ {
1055
+ "epoch": 6.327433628318584,
1056
+ "grad_norm": 0.30667644739151,
1057
+ "learning_rate": 0.00036902654867256637,
1058
+ "loss": 0.1601,
1059
+ "step": 1430
1060
+ },
1061
+ {
1062
+ "epoch": 6.371681415929204,
1063
+ "grad_norm": 0.364202082157135,
1064
+ "learning_rate": 0.00036460176991150444,
1065
+ "loss": 0.166,
1066
+ "step": 1440
1067
+ },
1068
+ {
1069
+ "epoch": 6.415929203539823,
1070
+ "grad_norm": 0.2910124659538269,
1071
+ "learning_rate": 0.0003601769911504425,
1072
+ "loss": 0.18,
1073
+ "step": 1450
1074
+ },
1075
+ {
1076
+ "epoch": 6.460176991150442,
1077
+ "grad_norm": 0.3251543939113617,
1078
+ "learning_rate": 0.00035575221238938053,
1079
+ "loss": 0.1666,
1080
+ "step": 1460
1081
+ },
1082
+ {
1083
+ "epoch": 6.504424778761062,
1084
+ "grad_norm": 0.31853803992271423,
1085
+ "learning_rate": 0.0003513274336283186,
1086
+ "loss": 0.1683,
1087
+ "step": 1470
1088
+ },
1089
+ {
1090
+ "epoch": 6.548672566371682,
1091
+ "grad_norm": 0.3730286657810211,
1092
+ "learning_rate": 0.0003469026548672566,
1093
+ "loss": 0.163,
1094
+ "step": 1480
1095
+ },
1096
+ {
1097
+ "epoch": 6.592920353982301,
1098
+ "grad_norm": 0.3070693910121918,
1099
+ "learning_rate": 0.0003424778761061947,
1100
+ "loss": 0.1492,
1101
+ "step": 1490
1102
+ },
1103
+ {
1104
+ "epoch": 6.6371681415929205,
1105
+ "grad_norm": 0.25525256991386414,
1106
+ "learning_rate": 0.0003380530973451327,
1107
+ "loss": 0.1587,
1108
+ "step": 1500
1109
+ },
1110
+ {
1111
+ "epoch": 6.68141592920354,
1112
+ "grad_norm": 0.34361934661865234,
1113
+ "learning_rate": 0.0003336283185840708,
1114
+ "loss": 0.161,
1115
+ "step": 1510
1116
+ },
1117
+ {
1118
+ "epoch": 6.725663716814159,
1119
+ "grad_norm": 0.2400776594877243,
1120
+ "learning_rate": 0.00032920353982300886,
1121
+ "loss": 0.1534,
1122
+ "step": 1520
1123
+ },
1124
+ {
1125
+ "epoch": 6.769911504424779,
1126
+ "grad_norm": 0.3599693477153778,
1127
+ "learning_rate": 0.00032477876106194693,
1128
+ "loss": 0.1699,
1129
+ "step": 1530
1130
+ },
1131
+ {
1132
+ "epoch": 6.814159292035399,
1133
+ "grad_norm": 0.26774442195892334,
1134
+ "learning_rate": 0.00032035398230088495,
1135
+ "loss": 0.1567,
1136
+ "step": 1540
1137
+ },
1138
+ {
1139
+ "epoch": 6.8584070796460175,
1140
+ "grad_norm": 0.32396429777145386,
1141
+ "learning_rate": 0.000315929203539823,
1142
+ "loss": 0.1929,
1143
+ "step": 1550
1144
+ },
1145
+ {
1146
+ "epoch": 6.902654867256637,
1147
+ "grad_norm": 0.3491114377975464,
1148
+ "learning_rate": 0.00031150442477876104,
1149
+ "loss": 0.1784,
1150
+ "step": 1560
1151
+ },
1152
+ {
1153
+ "epoch": 6.946902654867257,
1154
+ "grad_norm": 0.372086763381958,
1155
+ "learning_rate": 0.0003070796460176991,
1156
+ "loss": 0.193,
1157
+ "step": 1570
1158
+ },
1159
+ {
1160
+ "epoch": 6.991150442477876,
1161
+ "grad_norm": 0.2936050593852997,
1162
+ "learning_rate": 0.00030265486725663713,
1163
+ "loss": 0.1899,
1164
+ "step": 1580
1165
+ },
1166
+ {
1167
+ "epoch": 7.0,
1168
+ "eval_loss": 0.20992980897426605,
1169
+ "eval_runtime": 3.174,
1170
+ "eval_samples_per_second": 31.506,
1171
+ "eval_steps_per_second": 7.877,
1172
+ "step": 1582
1173
+ },
1174
+ {
1175
+ "epoch": 7.035398230088496,
1176
+ "grad_norm": 0.3688855767250061,
1177
+ "learning_rate": 0.0002982300884955752,
1178
+ "loss": 0.1813,
1179
+ "step": 1590
1180
+ },
1181
+ {
1182
+ "epoch": 7.079646017699115,
1183
+ "grad_norm": 0.32831940054893494,
1184
+ "learning_rate": 0.00029380530973451333,
1185
+ "loss": 0.1472,
1186
+ "step": 1600
1187
+ },
1188
+ {
1189
+ "epoch": 7.123893805309734,
1190
+ "grad_norm": 0.32714003324508667,
1191
+ "learning_rate": 0.00028938053097345135,
1192
+ "loss": 0.1704,
1193
+ "step": 1610
1194
+ },
1195
+ {
1196
+ "epoch": 7.168141592920354,
1197
+ "grad_norm": 0.49076274037361145,
1198
+ "learning_rate": 0.0002849557522123894,
1199
+ "loss": 0.1559,
1200
+ "step": 1620
1201
+ },
1202
+ {
1203
+ "epoch": 7.212389380530974,
1204
+ "grad_norm": 0.2076297253370285,
1205
+ "learning_rate": 0.00028053097345132744,
1206
+ "loss": 0.1571,
1207
+ "step": 1630
1208
+ },
1209
+ {
1210
+ "epoch": 7.256637168141593,
1211
+ "grad_norm": 0.30924052000045776,
1212
+ "learning_rate": 0.0002761061946902655,
1213
+ "loss": 0.1497,
1214
+ "step": 1640
1215
+ },
1216
+ {
1217
+ "epoch": 7.300884955752212,
1218
+ "grad_norm": 0.29587677121162415,
1219
+ "learning_rate": 0.00027168141592920353,
1220
+ "loss": 0.1506,
1221
+ "step": 1650
1222
+ },
1223
+ {
1224
+ "epoch": 7.345132743362832,
1225
+ "grad_norm": 0.339077889919281,
1226
+ "learning_rate": 0.0002672566371681416,
1227
+ "loss": 0.152,
1228
+ "step": 1660
1229
+ },
1230
+ {
1231
+ "epoch": 7.389380530973451,
1232
+ "grad_norm": 0.2390238344669342,
1233
+ "learning_rate": 0.0002628318584070796,
1234
+ "loss": 0.1634,
1235
+ "step": 1670
1236
+ },
1237
+ {
1238
+ "epoch": 7.433628318584071,
1239
+ "grad_norm": 0.3401966392993927,
1240
+ "learning_rate": 0.00025840707964601775,
1241
+ "loss": 0.1437,
1242
+ "step": 1680
1243
+ },
1244
+ {
1245
+ "epoch": 7.477876106194691,
1246
+ "grad_norm": 0.3273468017578125,
1247
+ "learning_rate": 0.00025398230088495577,
1248
+ "loss": 0.1421,
1249
+ "step": 1690
1250
+ },
1251
+ {
1252
+ "epoch": 7.522123893805309,
1253
+ "grad_norm": 0.2576355040073395,
1254
+ "learning_rate": 0.00024955752212389384,
1255
+ "loss": 0.1606,
1256
+ "step": 1700
1257
+ },
1258
+ {
1259
+ "epoch": 7.566371681415929,
1260
+ "grad_norm": 0.3079942464828491,
1261
+ "learning_rate": 0.00024513274336283186,
1262
+ "loss": 0.1662,
1263
+ "step": 1710
1264
+ },
1265
+ {
1266
+ "epoch": 7.610619469026549,
1267
+ "grad_norm": 0.35095077753067017,
1268
+ "learning_rate": 0.0002407079646017699,
1269
+ "loss": 0.1449,
1270
+ "step": 1720
1271
+ },
1272
+ {
1273
+ "epoch": 7.654867256637168,
1274
+ "grad_norm": 0.2713673412799835,
1275
+ "learning_rate": 0.00023628318584070795,
1276
+ "loss": 0.1666,
1277
+ "step": 1730
1278
+ },
1279
+ {
1280
+ "epoch": 7.699115044247788,
1281
+ "grad_norm": 0.3343076705932617,
1282
+ "learning_rate": 0.00023185840707964602,
1283
+ "loss": 0.1657,
1284
+ "step": 1740
1285
+ },
1286
+ {
1287
+ "epoch": 7.743362831858407,
1288
+ "grad_norm": 0.27280741930007935,
1289
+ "learning_rate": 0.00022743362831858407,
1290
+ "loss": 0.1584,
1291
+ "step": 1750
1292
+ },
1293
+ {
1294
+ "epoch": 7.787610619469026,
1295
+ "grad_norm": 0.3658842146396637,
1296
+ "learning_rate": 0.0002230088495575221,
1297
+ "loss": 0.178,
1298
+ "step": 1760
1299
+ },
1300
+ {
1301
+ "epoch": 7.831858407079646,
1302
+ "grad_norm": 0.2327466607093811,
1303
+ "learning_rate": 0.00021858407079646016,
1304
+ "loss": 0.1394,
1305
+ "step": 1770
1306
+ },
1307
+ {
1308
+ "epoch": 7.876106194690266,
1309
+ "grad_norm": 0.2981870174407959,
1310
+ "learning_rate": 0.00021415929203539826,
1311
+ "loss": 0.1555,
1312
+ "step": 1780
1313
+ },
1314
+ {
1315
+ "epoch": 7.920353982300885,
1316
+ "grad_norm": 0.32251453399658203,
1317
+ "learning_rate": 0.0002097345132743363,
1318
+ "loss": 0.1817,
1319
+ "step": 1790
1320
+ },
1321
+ {
1322
+ "epoch": 7.964601769911504,
1323
+ "grad_norm": 0.34020307660102844,
1324
+ "learning_rate": 0.00020530973451327435,
1325
+ "loss": 0.1667,
1326
+ "step": 1800
1327
+ },
1328
+ {
1329
+ "epoch": 8.0,
1330
+ "eval_loss": 0.2127797156572342,
1331
+ "eval_runtime": 2.6346,
1332
+ "eval_samples_per_second": 37.957,
1333
+ "eval_steps_per_second": 9.489,
1334
+ "step": 1808
1335
+ },
1336
+ {
1337
+ "epoch": 8.008849557522124,
1338
+ "grad_norm": 0.2688687741756439,
1339
+ "learning_rate": 0.0002008849557522124,
1340
+ "loss": 0.1726,
1341
+ "step": 1810
1342
+ },
1343
+ {
1344
+ "epoch": 8.053097345132743,
1345
+ "grad_norm": 0.26508933305740356,
1346
+ "learning_rate": 0.00019646017699115047,
1347
+ "loss": 0.1573,
1348
+ "step": 1820
1349
+ },
1350
+ {
1351
+ "epoch": 8.097345132743364,
1352
+ "grad_norm": 0.38828426599502563,
1353
+ "learning_rate": 0.0001920353982300885,
1354
+ "loss": 0.1593,
1355
+ "step": 1830
1356
+ },
1357
+ {
1358
+ "epoch": 8.141592920353983,
1359
+ "grad_norm": 0.28579315543174744,
1360
+ "learning_rate": 0.00018761061946902656,
1361
+ "loss": 0.139,
1362
+ "step": 1840
1363
+ },
1364
+ {
1365
+ "epoch": 8.185840707964601,
1366
+ "grad_norm": 0.29282671213150024,
1367
+ "learning_rate": 0.0001831858407079646,
1368
+ "loss": 0.1576,
1369
+ "step": 1850
1370
+ },
1371
+ {
1372
+ "epoch": 8.230088495575222,
1373
+ "grad_norm": 0.39632460474967957,
1374
+ "learning_rate": 0.00017876106194690268,
1375
+ "loss": 0.1599,
1376
+ "step": 1860
1377
+ },
1378
+ {
1379
+ "epoch": 8.274336283185841,
1380
+ "grad_norm": 0.8853453993797302,
1381
+ "learning_rate": 0.00017433628318584072,
1382
+ "loss": 0.1415,
1383
+ "step": 1870
1384
+ },
1385
+ {
1386
+ "epoch": 8.31858407079646,
1387
+ "grad_norm": 0.28350165486335754,
1388
+ "learning_rate": 0.00016991150442477877,
1389
+ "loss": 0.1601,
1390
+ "step": 1880
1391
+ },
1392
+ {
1393
+ "epoch": 8.36283185840708,
1394
+ "grad_norm": 0.32908403873443604,
1395
+ "learning_rate": 0.00016548672566371681,
1396
+ "loss": 0.1502,
1397
+ "step": 1890
1398
+ },
1399
+ {
1400
+ "epoch": 8.4070796460177,
1401
+ "grad_norm": 0.26707422733306885,
1402
+ "learning_rate": 0.0001610619469026549,
1403
+ "loss": 0.144,
1404
+ "step": 1900
1405
+ },
1406
+ {
1407
+ "epoch": 8.451327433628318,
1408
+ "grad_norm": 0.2607186436653137,
1409
+ "learning_rate": 0.00015663716814159293,
1410
+ "loss": 0.1497,
1411
+ "step": 1910
1412
+ },
1413
+ {
1414
+ "epoch": 8.495575221238939,
1415
+ "grad_norm": 0.3008362650871277,
1416
+ "learning_rate": 0.00015221238938053098,
1417
+ "loss": 0.1519,
1418
+ "step": 1920
1419
+ },
1420
+ {
1421
+ "epoch": 8.539823008849558,
1422
+ "grad_norm": 0.3770766854286194,
1423
+ "learning_rate": 0.00014778761061946902,
1424
+ "loss": 0.1486,
1425
+ "step": 1930
1426
+ },
1427
+ {
1428
+ "epoch": 8.584070796460177,
1429
+ "grad_norm": 0.24154478311538696,
1430
+ "learning_rate": 0.0001433628318584071,
1431
+ "loss": 0.1504,
1432
+ "step": 1940
1433
+ },
1434
+ {
1435
+ "epoch": 8.628318584070797,
1436
+ "grad_norm": 0.28921449184417725,
1437
+ "learning_rate": 0.00013893805309734514,
1438
+ "loss": 0.1636,
1439
+ "step": 1950
1440
+ },
1441
+ {
1442
+ "epoch": 8.672566371681416,
1443
+ "grad_norm": 0.32194775342941284,
1444
+ "learning_rate": 0.0001345132743362832,
1445
+ "loss": 0.1746,
1446
+ "step": 1960
1447
+ },
1448
+ {
1449
+ "epoch": 8.716814159292035,
1450
+ "grad_norm": 0.2882642149925232,
1451
+ "learning_rate": 0.00013008849557522123,
1452
+ "loss": 0.1305,
1453
+ "step": 1970
1454
+ },
1455
+ {
1456
+ "epoch": 8.761061946902656,
1457
+ "grad_norm": 0.30995509028434753,
1458
+ "learning_rate": 0.0001256637168141593,
1459
+ "loss": 0.1484,
1460
+ "step": 1980
1461
+ },
1462
+ {
1463
+ "epoch": 8.805309734513274,
1464
+ "grad_norm": 0.32381975650787354,
1465
+ "learning_rate": 0.00012123893805309735,
1466
+ "loss": 0.1657,
1467
+ "step": 1990
1468
+ },
1469
+ {
1470
+ "epoch": 8.849557522123893,
1471
+ "grad_norm": 0.22391530871391296,
1472
+ "learning_rate": 0.0001168141592920354,
1473
+ "loss": 0.1247,
1474
+ "step": 2000
1475
+ },
1476
+ {
1477
+ "epoch": 8.893805309734514,
1478
+ "grad_norm": 0.23185725510120392,
1479
+ "learning_rate": 0.00011238938053097346,
1480
+ "loss": 0.153,
1481
+ "step": 2010
1482
+ },
1483
+ {
1484
+ "epoch": 8.938053097345133,
1485
+ "grad_norm": 0.27952226996421814,
1486
+ "learning_rate": 0.0001079646017699115,
1487
+ "loss": 0.1621,
1488
+ "step": 2020
1489
+ },
1490
+ {
1491
+ "epoch": 8.982300884955752,
1492
+ "grad_norm": 0.2538679540157318,
1493
+ "learning_rate": 0.00010353982300884956,
1494
+ "loss": 0.1392,
1495
+ "step": 2030
1496
+ },
1497
+ {
1498
+ "epoch": 9.0,
1499
+ "eval_loss": 0.21310940384864807,
1500
+ "eval_runtime": 2.6327,
1501
+ "eval_samples_per_second": 37.984,
1502
+ "eval_steps_per_second": 9.496,
1503
+ "step": 2034
1504
+ },
1505
+ {
1506
+ "epoch": 9.026548672566372,
1507
+ "grad_norm": 0.2921323776245117,
1508
+ "learning_rate": 9.91150442477876e-05,
1509
+ "loss": 0.1549,
1510
+ "step": 2040
1511
+ },
1512
+ {
1513
+ "epoch": 9.070796460176991,
1514
+ "grad_norm": 0.2572889029979706,
1515
+ "learning_rate": 9.469026548672566e-05,
1516
+ "loss": 0.1734,
1517
+ "step": 2050
1518
+ },
1519
+ {
1520
+ "epoch": 9.11504424778761,
1521
+ "grad_norm": 0.2991015613079071,
1522
+ "learning_rate": 9.026548672566372e-05,
1523
+ "loss": 0.1582,
1524
+ "step": 2060
1525
+ },
1526
+ {
1527
+ "epoch": 9.15929203539823,
1528
+ "grad_norm": 0.33754679560661316,
1529
+ "learning_rate": 8.584070796460178e-05,
1530
+ "loss": 0.1343,
1531
+ "step": 2070
1532
+ },
1533
+ {
1534
+ "epoch": 9.20353982300885,
1535
+ "grad_norm": 0.2426099181175232,
1536
+ "learning_rate": 8.141592920353983e-05,
1537
+ "loss": 0.1462,
1538
+ "step": 2080
1539
+ },
1540
+ {
1541
+ "epoch": 9.247787610619469,
1542
+ "grad_norm": 0.3596532344818115,
1543
+ "learning_rate": 7.699115044247789e-05,
1544
+ "loss": 0.1522,
1545
+ "step": 2090
1546
+ },
1547
+ {
1548
+ "epoch": 9.29203539823009,
1549
+ "grad_norm": 0.22559010982513428,
1550
+ "learning_rate": 7.256637168141593e-05,
1551
+ "loss": 0.1292,
1552
+ "step": 2100
1553
+ },
1554
+ {
1555
+ "epoch": 9.336283185840708,
1556
+ "grad_norm": 0.3877250850200653,
1557
+ "learning_rate": 6.814159292035399e-05,
1558
+ "loss": 0.1257,
1559
+ "step": 2110
1560
+ },
1561
+ {
1562
+ "epoch": 9.380530973451327,
1563
+ "grad_norm": 0.3135465383529663,
1564
+ "learning_rate": 6.371681415929204e-05,
1565
+ "loss": 0.1508,
1566
+ "step": 2120
1567
+ },
1568
+ {
1569
+ "epoch": 9.424778761061948,
1570
+ "grad_norm": 0.3448950946331024,
1571
+ "learning_rate": 5.929203539823009e-05,
1572
+ "loss": 0.1386,
1573
+ "step": 2130
1574
+ },
1575
+ {
1576
+ "epoch": 9.469026548672566,
1577
+ "grad_norm": 0.2957702577114105,
1578
+ "learning_rate": 5.486725663716814e-05,
1579
+ "loss": 0.1456,
1580
+ "step": 2140
1581
+ },
1582
+ {
1583
+ "epoch": 9.513274336283185,
1584
+ "grad_norm": 0.2347142994403839,
1585
+ "learning_rate": 5.0442477876106195e-05,
1586
+ "loss": 0.1476,
1587
+ "step": 2150
1588
+ },
1589
+ {
1590
+ "epoch": 9.557522123893806,
1591
+ "grad_norm": 0.3887890577316284,
1592
+ "learning_rate": 4.601769911504425e-05,
1593
+ "loss": 0.158,
1594
+ "step": 2160
1595
+ },
1596
+ {
1597
+ "epoch": 9.601769911504425,
1598
+ "grad_norm": 0.2899017632007599,
1599
+ "learning_rate": 4.15929203539823e-05,
1600
+ "loss": 0.1323,
1601
+ "step": 2170
1602
+ },
1603
+ {
1604
+ "epoch": 9.646017699115044,
1605
+ "grad_norm": 0.37858498096466064,
1606
+ "learning_rate": 3.716814159292035e-05,
1607
+ "loss": 0.1488,
1608
+ "step": 2180
1609
+ },
1610
+ {
1611
+ "epoch": 9.690265486725664,
1612
+ "grad_norm": 0.30040085315704346,
1613
+ "learning_rate": 3.2743362831858405e-05,
1614
+ "loss": 0.1453,
1615
+ "step": 2190
1616
+ },
1617
+ {
1618
+ "epoch": 9.734513274336283,
1619
+ "grad_norm": 0.34911859035491943,
1620
+ "learning_rate": 2.831858407079646e-05,
1621
+ "loss": 0.1578,
1622
+ "step": 2200
1623
+ },
1624
+ {
1625
+ "epoch": 9.778761061946902,
1626
+ "grad_norm": 0.3793705999851227,
1627
+ "learning_rate": 2.3893805309734513e-05,
1628
+ "loss": 0.1551,
1629
+ "step": 2210
1630
+ },
1631
+ {
1632
+ "epoch": 9.823008849557523,
1633
+ "grad_norm": 0.3259049654006958,
1634
+ "learning_rate": 1.9469026548672565e-05,
1635
+ "loss": 0.1782,
1636
+ "step": 2220
1637
+ },
1638
+ {
1639
+ "epoch": 9.867256637168142,
1640
+ "grad_norm": 0.2592504620552063,
1641
+ "learning_rate": 1.5044247787610619e-05,
1642
+ "loss": 0.1488,
1643
+ "step": 2230
1644
+ },
1645
+ {
1646
+ "epoch": 9.91150442477876,
1647
+ "grad_norm": 0.26316604018211365,
1648
+ "learning_rate": 1.0619469026548673e-05,
1649
+ "loss": 0.1328,
1650
+ "step": 2240
1651
+ },
1652
+ {
1653
+ "epoch": 9.955752212389381,
1654
+ "grad_norm": 0.34197258949279785,
1655
+ "learning_rate": 6.194690265486725e-06,
1656
+ "loss": 0.1658,
1657
+ "step": 2250
1658
+ },
1659
+ {
1660
+ "epoch": 10.0,
1661
+ "grad_norm": 0.281561017036438,
1662
+ "learning_rate": 1.7699115044247788e-06,
1663
+ "loss": 0.1256,
1664
+ "step": 2260
1665
+ }
1666
+ ],
1667
+ "logging_steps": 10,
1668
+ "max_steps": 2260,
1669
+ "num_input_tokens_seen": 0,
1670
+ "num_train_epochs": 10,
1671
+ "save_steps": 500,
1672
+ "stateful_callbacks": {
1673
+ "TrainerControl": {
1674
+ "args": {
1675
+ "should_epoch_stop": false,
1676
+ "should_evaluate": false,
1677
+ "should_log": false,
1678
+ "should_save": true,
1679
+ "should_training_stop": true
1680
+ },
1681
+ "attributes": {}
1682
+ }
1683
+ },
1684
+ "total_flos": 5529549227950080.0,
1685
+ "train_batch_size": 4,
1686
+ "trial_name": null,
1687
+ "trial_params": null
1688
+ }
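Note: the entries above are the `log_history` portion of the standard `trainer_state.json` that the Hugging Face `Trainer` writes; per-step entries carry `epoch`, `step`, `loss`, `grad_norm`, and `learning_rate`, while end-of-epoch entries carry `eval_loss` and runtime statistics. A minimal sketch for pulling the train/eval loss curves out of a local copy of this file; the file path is an assumption, not part of the commit:

```python
import json

# Load a local copy of the trainer state written by transformers.Trainer.
# The path is an assumption; point it at the uploaded trainer_state.json.
with open("trainer_state.json") as f:
    state = json.load(f)

# Per-step training entries log "loss"; end-of-epoch entries log "eval_loss".
train_points = [(e["step"], e["loss"]) for e in state["log_history"] if "loss" in e]
eval_points = [(e["step"], e["eval_loss"]) for e in state["log_history"] if "eval_loss" in e]

print(f"{len(train_points)} training points, {len(eval_points)} eval points")
if eval_points:
    print("final eval step/loss:", eval_points[-1])
```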
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ee112143f1e982d59ff6dd5474169dfedc5a6fb323922bdd61940842518df8f6
+ size 5496
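`training_args.bin` is stored here as a Git LFS pointer; the underlying file is typically the pickled `TrainingArguments` object that `Trainer` saves alongside its checkpoints. A hedged sketch for inspecting it, assuming the placeholder repo id below is replaced with this repository's actual id:

```python
import torch
from huggingface_hub import hf_hub_download

# Placeholder repo id -- substitute the actual model repository id.
path = hf_hub_download(repo_id="<user>/<model-repo>", filename="training_args.bin")

# The file is a full pickle (not a plain tensor checkpoint), so recent PyTorch
# versions need weights_only=False to unpickle it.
args = torch.load(path, weights_only=False)
print(args.num_train_epochs, args.per_device_train_batch_size, args.learning_rate)
```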