---
library_name: transformers
base_model: allenai/scibert_scivocab_uncased
tags:
- generated_from_trainer
model-index:
- name: allenai-scibert_scivocab_uncased_20241230-091934
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# allenai-scibert_scivocab_uncased_20241230-091934

This model is a fine-tuned version of [allenai/scibert_scivocab_uncased](https://huggingface.co/allenai/scibert_scivocab_uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1032
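
The checkpoint can be loaded with the standard `transformers` auto classes. A minimal sketch follows; two details are assumptions that this card does not confirm: the hub id below is inferred from the repository name and may be wrong, and the head is taken to be a sequence-classification head (the card does not state the task).

```python
# Usage sketch. Assumptions (not stated in this card): the checkpoint is
# published under the hub id below, and it carries a sequence-classification head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

REPO_ID = "Kyle1668/allenai-scibert_scivocab_uncased_20241230-091934"  # hypothetical hub id

def predict_probs(texts, repo_id=REPO_ID):
    """Return per-class probabilities for a batch of texts."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForSequenceClassification.from_pretrained(repo_id)
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**batch).logits
    return torch.softmax(logits, dim=-1)

# Example (requires network access to the Hugging Face Hub):
# probs = predict_probs(["Scientific abstract text goes here."])
```

The tokenizer must come from the same repository: SciBERT uses its own `scivocab` vocabulary, so pairing the model with a generic BERT tokenizer would silently degrade results.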

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
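
Together with the step counts in the results table (2436 optimizer steps per epoch), these settings imply a learning rate that decays linearly from the base rate to zero over training. A minimal sketch, assuming no warmup (the card does not say either way):

```python
# Linear learning-rate schedule implied by the settings above: decay from
# learning_rate to 0 over all training steps. Assumptions (not stated in the
# card): no warmup, and 2436 optimizer steps per epoch as in the results table.
BASE_LR = 1e-4
STEPS_PER_EPOCH = 2436
NUM_EPOCHS = 3
TOTAL_STEPS = STEPS_PER_EPOCH * NUM_EPOCHS  # 7308

def linear_lr(step: int) -> float:
    """Learning rate after `step` optimizer steps under linear decay to zero."""
    return BASE_LR * max(0.0, (TOTAL_STEPS - step) / TOTAL_STEPS)

print(linear_lr(0))     # 0.0001 (the base rate)
print(linear_lr(3654))  # 5e-05 (half the base rate at the halfway point)
print(linear_lr(7308))  # 0.0 (fully decayed at the end of epoch 3)
```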

### Training results

443
+ | Training Loss | Epoch | Step | Validation Loss | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email 
protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email 
protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email 
protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] | [email protected] |
444
+ |:-------------:|:-----:|:----:|:---------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:-------------:|:----------:|:------:|:------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:-------------:|:----------:|:------:|:------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:-------------:|:----------:|:------:|:------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:----
-------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:-------------:|:----------:|:------:|:------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:-------------:|:----------:|:------:|:------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:-------------:|:----------:|:------:|:------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:-------------:|:----------:|:------:|:
------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:-------------:|:----------:|:------:|:------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:-------------:|:----------:|:------:|:------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|:--------------:|:-----------:|:-------:|:-------------:|
445
+ | 0.6601 | 1.0 | 2436 | 0.6623 | 0.4313 | 1.0 | 0.6027 | 0.4313 | 0.4313 | 1.0 | 0.6027 | 0.4313 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.4498 | 0.9989 | 0.6203 | 0.4725 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 
0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 |
+ | 0.2161 | 2.0 | 4872 | 0.3014 | 0.4313 | 1.0 | 0.6027 | 0.4313 | 0.4313 | 1.0 | 0.6027 | 0.4313 | 0.4313 | 1.0 | 0.6027 | 0.4313 | 0.8328 | 0.9731 | 0.8975 | 0.9042 | 0.8331 | 0.9722 | 0.8973 | 0.9040 | 0.8332 | 0.9722 | 0.8974 | 0.9041 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8332 | 0.9721 | 0.8973 | 0.9040 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 
0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9721 | 0.8973 | 0.9041 | 0.8333 | 0.9720 | 0.8973 | 0.9040 | 0.8333 | 0.9720 | 0.8973 | 0.9040 | 0.8333 | 0.9720 | 0.8973 | 0.9040 | 0.8333 | 0.9720 | 0.8973 | 0.9040 | 0.8333 | 0.9720 | 0.8973 | 0.9040 | 0.8333 | 0.9720 | 0.8973 | 0.9040 | 0.8333 | 0.9720 | 0.8973 | 0.9040 | 0.8333 | 0.9720 | 0.8973 | 0.9040 | 0.8333 | 0.9720 | 0.8973 | 0.9040 | 0.8333 | 0.9720 | 0.8973 | 0.9040 | 0.8333 | 0.9720 | 0.8973 | 0.9040 | 0.8332 | 0.9719 | 0.8972 | 0.9040 | 0.8332 | 0.9719 | 0.8972 | 0.9040 | 0.8332 | 0.9719 | 0.8972 | 0.9040 | 0.8332 | 0.9718 | 0.8972 | 0.9039 | 0.8333 | 0.9716 | 0.8971 | 0.9039 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 | 0.0 | 0.0 | 0.0 | 0.5687 |
+ | 0.0569 | 3.0 | 7308 | 0.1032 | 0.8166 | 0.9988 | 0.8985 | 0.9027 | 0.8430 | 0.9980 | 0.9140 | 0.9190 | 0.8560 | 0.9973 | 0.9213 | 0.9265 | 0.8666 | 0.9966 | 0.9271 | 0.9324 | 0.8740 | 0.9960 | 0.9310 | 0.9364 | 0.8810 | 0.9959 | 0.9350 | 0.9402 | 0.8870 | 0.9950 | 0.9379 | 0.9431 | 0.8914 | 0.9950 | 0.9404 | 0.9456 | 0.8959 | 0.9947 | 0.9427 | 0.9479 | 0.9000 | 0.9942 | 0.9448 | 0.9499 | 0.9031 | 0.9938 | 0.9463 | 0.9513 | 0.9066 | 0.9931 | 0.9479 | 0.9529 | 0.9089 | 0.9929 | 0.9491 | 0.9540 | 0.9108 | 0.9923 | 0.9498 | 0.9548 | 0.9138 | 0.9918 | 0.9512 | 0.9561 | 0.9163 | 0.9913 | 0.9524 | 0.9572 | 0.9184 | 0.9912 | 0.9534 | 0.9582 | 0.9206 | 0.9908 | 0.9544 | 0.9592 | 0.9218 | 0.9902 | 0.9548 | 0.9596 | 0.9231 | 0.9899 | 0.9553 | 0.9601 | 0.9244 | 0.9896 | 0.9559 | 0.9606 | 0.9257 | 0.9893 | 0.9564 | 0.9611 | 0.9268 | 0.9890 | 0.9569 | 0.9615 | 0.9274 | 0.9889 | 0.9571 | 0.9618 | 0.9284 | 0.9884 | 0.9575 | 0.9621 | 0.9293 | 0.9882 | 0.9579 | 0.9625 | 0.9302 | 0.9881 | 0.9583 | 0.9629 | 0.9310 | 0.9881 | 0.9587 | 0.9633 | 0.9320 | 0.9879 | 0.9591 | 0.9637 | 0.9331 | 0.9876 | 0.9596 | 0.9641 | 0.9338 | 0.9875 | 0.9599 | 0.9644 | 0.9346 | 0.9874 | 0.9603 | 0.9648 | 0.9353 | 0.9872 | 0.9605 | 0.9650 | 0.9357 | 0.9872 | 0.9608 | 0.9652 | 0.9364 | 0.9871 | 0.9611 | 0.9655 | 0.9367 | 0.9869 | 0.9612 | 0.9656 | 0.9373 | 0.9867 | 0.9613 | 0.9658 | 0.9380 | 0.9863 | 0.9616 | 0.9660 | 0.9388 | 0.9860 | 0.9618 | 0.9662 | 0.9397 | 0.9852 | 0.9619 | 0.9663 | 0.9408 | 0.9850 | 0.9624 | 0.9668 | 0.9414 | 0.9849 | 0.9627 | 0.9671 | 0.9414 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 0.9623 | 0.9667 | 0.9415 | 0.9840 | 
0.9623 | 0.9667 | 0.9415 | 0.9839 | 0.9622 | 0.9667 | 0.9416 | 0.9839 | 0.9623 | 0.9667 | 0.9416 | 0.9839 | 0.9623 | 0.9667 | 0.9417 | 0.9839 | 0.9623 | 0.9668 | 0.9417 | 0.9838 | 0.9623 | 0.9667 | 0.9417 | 0.9838 | 0.9623 | 0.9667 | 0.9417 | 0.9837 | 0.9622 | 0.9667 | 0.9418 | 0.9837 | 0.9623 | 0.9667 | 0.9418 | 0.9835 | 0.9622 | 0.9667 | 0.9421 | 0.9834 | 0.9623 | 0.9668 | 0.9442 | 0.9827 | 0.9631 | 0.9675 | 0.9462 | 0.9812 | 0.9634 | 0.9678 | 0.9469 | 0.9806 | 0.9634 | 0.9679 | 0.9477 | 0.9801 | 0.9636 | 0.9681 | 0.9491 | 0.9794 | 0.9640 | 0.9685 | 0.9504 | 0.9792 | 0.9646 | 0.9690 | 0.9509 | 0.9787 | 0.9646 | 0.9690 | 0.9523 | 0.9782 | 0.9651 | 0.9695 | 0.9532 | 0.9775 | 0.9652 | 0.9696 | 0.9538 | 0.9771 | 0.9653 | 0.9697 | 0.9547 | 0.9767 | 0.9656 | 0.9700 | 0.9552 | 0.9757 | 0.9653 | 0.9698 | 0.9558 | 0.9752 | 0.9654 | 0.9698 | 0.9564 | 0.9741 | 0.9652 | 0.9697 | 0.9569 | 0.9737 | 0.9652 | 0.9697 | 0.9574 | 0.9729 | 0.9651 | 0.9697 | 0.9584 | 0.9726 | 0.9654 | 0.9700 | 0.9591 | 0.9719 | 0.9655 | 0.9700 | 0.9599 | 0.9714 | 0.9656 | 0.9701 | 0.9605 | 0.9705 | 0.9655 | 0.9701 | 0.9618 | 0.9693 | 0.9656 | 0.9702 | 0.9625 | 0.9681 | 0.9653 | 0.9700 | 0.9630 | 0.9670 | 0.9650 | 0.9697 | 0.9644 | 0.9662 | 0.9653 | 0.9700 | 0.9661 | 0.9649 | 0.9655 | 0.9703 | 0.9666 | 0.9636 | 0.9651 | 0.9699 | 0.9679 | 0.9612 | 0.9645 | 0.9695 | 0.9691 | 0.9585 | 0.9638 | 0.9689 | 0.9712 | 0.9550 | 0.9630 | 0.9684 | 0.9736 | 0.9493 | 0.9613 | 0.9670 | 0.9769 | 0.9414 | 0.9588 | 0.9651 | 0.9805 | 0.9291 | 0.9541 | 0.9615 | 0.9845 | 0.9055 | 0.9434 | 0.9531 | 0.9891 | 0.8632 | 0.9219 | 0.9369 |
+
+
+ ### Framework versions
+
+ - Transformers 4.48.0.dev0
+ - Pytorch 2.5.1+cu124
+ - Datasets 3.2.0
+ - Tokenizers 0.21.0
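The precision/recall/F1 columns in the table above should satisfy the harmonic-mean identity F1 = 2PR/(P+R). As a quick sanity check on the reported final-epoch figures at threshold 0.5 (a small sketch, not part of the training code; the card's values are themselves rounded to four decimals):

```python
# Sanity-check the reported epoch-3 metrics at threshold 0.5:
# F1 is the harmonic mean of precision and recall.
precision = 0.8166  # [email protected]
recall = 0.9988     # [email protected]

f1 = 2 * precision * recall / (precision + recall)

# Agrees with the reported [email protected] of 0.8985 up to rounding of the inputs.
assert abs(f1 - 0.8985) < 1e-3
```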
best_model/config.json ADDED
@@ -0,0 +1,26 @@
+ {
+   "_name_or_path": "allenai/scibert_scivocab_uncased",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "single_label_classification",
+   "torch_dtype": "float32",
+   "transformers_version": "4.48.0.dev0",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 31090
+ }
best_model/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1cd9988f32cc076034fa088b7fc1b482bccc60f5e645d54661c3034a090bc4d0
+ size 439703544
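The `model.safetensors` entry above is a Git LFS pointer file, not the weights themselves: three `key value` lines giving the spec version, the content hash, and the object size in bytes. A minimal parser for this format, applied to the pointer shown above (illustrative; real tooling should use `git lfs` itself):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into a {key: value} dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:1cd9988f32cc076034fa088b7fc1b482bccc60f5e645d54661c3034a090bc4d0
size 439703544
"""

info = parse_lfs_pointer(pointer)
algo, _, digest = info["oid"].partition(":")
# ~440 MB, consistent with a BERT-base checkpoint stored in float32.
print(algo, int(info["size"]))
```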
best_model/special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "cls_token": "[CLS]",
+   "mask_token": "[MASK]",
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "unk_token": "[UNK]"
+ }
best_model/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
best_model/tokenizer_config.json ADDED
@@ -0,0 +1,58 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "[PAD]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "101": {
+       "content": "[UNK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "102": {
+       "content": "[CLS]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "103": {
+       "content": "[SEP]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "104": {
+       "content": "[MASK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "[CLS]",
+   "do_basic_tokenize": true,
+   "do_lower_case": true,
+   "extra_special_tokens": {},
+   "mask_token": "[MASK]",
+   "model_max_length": 1000000000000000019884624838656,
+   "never_split": null,
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "BertTokenizer",
+   "unk_token": "[UNK]"
+ }
best_model/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4b394b2580e4601c43e6b87b14cc1338633a74c5656c0417be2ec4e302c067c0
+ size 5496
best_model/vocab.txt ADDED
The diff for this file is too large to render. See raw diff
 
config.json ADDED
@@ -0,0 +1,26 @@
+ {
+   "_name_or_path": "allenai/scibert_scivocab_uncased",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "single_label_classification",
+   "torch_dtype": "float32",
+   "transformers_version": "4.48.0.dev0",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 31090
+ }
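The configuration above describes a standard BERT-base encoder (12 layers, 12 heads, hidden size 768) with a sequence-classification head. A quick standard-library sanity check of the internal consistency of these fields (a sketch over the values shown, not a substitute for `transformers`' own config validation):

```python
import json

# Subset of the fields from config.json above.
config = json.loads("""{
  "hidden_size": 768,
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "intermediate_size": 3072,
  "max_position_embeddings": 512,
  "vocab_size": 31090
}""")

# Each attention head must get an equal slice of the hidden dimension.
assert config["hidden_size"] % config["num_attention_heads"] == 0
head_dim = config["hidden_size"] // config["num_attention_heads"]

# BERT-style models use a 4x feed-forward expansion.
assert config["intermediate_size"] == 4 * config["hidden_size"]

print(head_dim)  # 64
```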
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1cd9988f32cc076034fa088b7fc1b482bccc60f5e645d54661c3034a090bc4d0
+ size 439703544
special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "cls_token": "[CLS]",
+   "mask_token": "[MASK]",
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "unk_token": "[UNK]"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,58 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "[PAD]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "101": {
+       "content": "[UNK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "102": {
+       "content": "[CLS]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "103": {
+       "content": "[SEP]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "104": {
+       "content": "[MASK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "[CLS]",
+   "do_basic_tokenize": true,
+   "do_lower_case": true,
+   "extra_special_tokens": {},
+   "mask_token": "[MASK]",
+   "model_max_length": 1000000000000000019884624838656,
+   "never_split": null,
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "BertTokenizer",
+   "unk_token": "[UNK]"
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4b394b2580e4601c43e6b87b14cc1338633a74c5656c0417be2ec4e302c067c0
+ size 5496
vocab.txt ADDED
The diff for this file is too large to render. See raw diff