2023-10-18 22:35:32,803 ----------------------------------------------------------------------------------------------------
2023-10-18 22:35:32,803 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 128)
        (position_embeddings): Embedding(512, 128)
        (token_type_embeddings): Embedding(2, 128)
        (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-1): 2 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=128, out_features=128, bias=True)
                (key): Linear(in_features=128, out_features=128, bias=True)
                (value): Linear(in_features=128, out_features=128, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=128, out_features=128, bias=True)
                (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=128, out_features=512, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=512, out_features=128, bias=True)
              (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=128, out_features=128, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=128, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-18 22:35:32,803 ----------------------------------------------------------------------------------------------------
2023-10-18 22:35:32,803 MultiCorpus: 5777 train + 722 dev + 723 test sentences
 - NER_ICDAR_EUROPEANA Corpus: 5777 train + 722 dev + 723 test sentences - /root/.flair/datasets/ner_icdar_europeana/nl
2023-10-18 22:35:32,803 ----------------------------------------------------------------------------------------------------
2023-10-18 22:35:32,803 Train: 5777 sentences
2023-10-18 22:35:32,803 (train_with_dev=False, train_with_test=False)
2023-10-18 22:35:32,803 ----------------------------------------------------------------------------------------------------
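The setup printed above (the Dutch split of the ICDAR-Europeana NER corpus and a 2-layer, 128-dim bert-tiny encoder with locked dropout and a linear 13-tag head, no CRF) corresponds to a fairly standard Flair configuration. A minimal sketch of how it could be rebuilt, assuming the public Flair API (NER_ICDAR_EUROPEANA loader, TransformerWordEmbeddings, SequenceTagger); the pooling, layer, and CRF settings are read off the base path used for this run, not taken from the original script:

    # Sketch (assumed Flair API, not the original training script).
    from flair.datasets import NER_ICDAR_EUROPEANA
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger

    # Dutch split of the ICDAR-Europeana NER corpus (5777 train / 722 dev / 723 test sentences).
    corpus = NER_ICDAR_EUROPEANA(language="nl")

    # 13-entry label dictionary (O plus BIOES-style LOC/PER/ORG tags, as listed at the end of the log).
    label_dict = corpus.make_label_dictionary(label_type="ner")

    # Historic multilingual bert-tiny encoder (2 layers, hidden size 128), fine-tuned end to end;
    # "layers -1" and "pooling first" are inferred from the base path.
    embeddings = TransformerWordEmbeddings(
        "dbmdz/bert-tiny-historic-multilingual-cased",
        layers="-1",
        subtoken_pooling="first",
        fine_tune=True,
    )

    # Plain linear head on top of the embeddings: no CRF, no RNN, no reprojection,
    # matching the printed architecture (LockedDropout + Linear(128 -> 13)).
    tagger = SequenceTagger(
        embeddings=embeddings,
        tag_dictionary=label_dict,
        tag_type="ner",
        use_crf=False,
        use_rnn=False,
        reproject_embeddings=False,
    )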
2023-10-18 22:35:32,803 Training Params:
2023-10-18 22:35:32,803 - learning_rate: "5e-05"
2023-10-18 22:35:32,803 - mini_batch_size: "4"
2023-10-18 22:35:32,804 - max_epochs: "10"
2023-10-18 22:35:32,804 - shuffle: "True"
2023-10-18 22:35:32,804 ----------------------------------------------------------------------------------------------------
2023-10-18 22:35:32,804 Plugins:
2023-10-18 22:35:32,804 - TensorboardLogger
2023-10-18 22:35:32,804 - LinearScheduler | warmup_fraction: '0.1'
2023-10-18 22:35:32,804 ----------------------------------------------------------------------------------------------------
2023-10-18 22:35:32,804 Final evaluation on model from best epoch (best-model.pt)
2023-10-18 22:35:32,804 - metric: "('micro avg', 'f1-score')"
2023-10-18 22:35:32,804 ----------------------------------------------------------------------------------------------------
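These parameters and plugins map, approximately, onto Flair's ModelTrainer.fine_tune entry point, which by default applies a linear learning-rate schedule with warmup (the "LinearScheduler | warmup_fraction: '0.1'" line above). A hedged sketch continuing the block before; the keyword arguments are assumptions based on the public API rather than a copy of the original run:

    # Sketch, continuing from the previous block: fine-tune with the logged hyperparameters.
    from flair.trainers import ModelTrainer

    trainer = ModelTrainer(tagger, corpus)

    trainer.fine_tune(
        "hmbench-icdar/nl-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3",
        learning_rate=5e-05,
        mini_batch_size=4,
        max_epochs=10,
    )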
2023-10-18 22:35:32,804 Computation:
2023-10-18 22:35:32,804 - compute on device: cuda:0
2023-10-18 22:35:32,804 - embedding storage: none
2023-10-18 22:35:32,804 ----------------------------------------------------------------------------------------------------
2023-10-18 22:35:32,804 Model training base path: "hmbench-icdar/nl-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3"
2023-10-18 22:35:32,804 ----------------------------------------------------------------------------------------------------
2023-10-18 22:35:32,804 ----------------------------------------------------------------------------------------------------
2023-10-18 22:35:32,804 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-18 22:35:35,293 epoch 1 - iter 144/1445 - loss 2.86625285 - time (sec): 2.49 - samples/sec: 7167.51 - lr: 0.000005 - momentum: 0.000000
2023-10-18 22:35:37,764 epoch 1 - iter 288/1445 - loss 2.39182936 - time (sec): 4.96 - samples/sec: 7294.65 - lr: 0.000010 - momentum: 0.000000
2023-10-18 22:35:40,191 epoch 1 - iter 432/1445 - loss 1.86271468 - time (sec): 7.39 - samples/sec: 7225.61 - lr: 0.000015 - momentum: 0.000000
2023-10-18 22:35:42,586 epoch 1 - iter 576/1445 - loss 1.48735779 - time (sec): 9.78 - samples/sec: 7273.86 - lr: 0.000020 - momentum: 0.000000
2023-10-18 22:35:44,941 epoch 1 - iter 720/1445 - loss 1.24724928 - time (sec): 12.14 - samples/sec: 7369.69 - lr: 0.000025 - momentum: 0.000000
2023-10-18 22:35:47,348 epoch 1 - iter 864/1445 - loss 1.09616008 - time (sec): 14.54 - samples/sec: 7359.26 - lr: 0.000030 - momentum: 0.000000
2023-10-18 22:35:49,769 epoch 1 - iter 1008/1445 - loss 0.97810336 - time (sec): 16.96 - samples/sec: 7381.19 - lr: 0.000035 - momentum: 0.000000
2023-10-18 22:35:52,205 epoch 1 - iter 1152/1445 - loss 0.89270314 - time (sec): 19.40 - samples/sec: 7340.76 - lr: 0.000040 - momentum: 0.000000
2023-10-18 22:35:54,547 epoch 1 - iter 1296/1445 - loss 0.82673392 - time (sec): 21.74 - samples/sec: 7317.91 - lr: 0.000045 - momentum: 0.000000
2023-10-18 22:35:56,894 epoch 1 - iter 1440/1445 - loss 0.77159928 - time (sec): 24.09 - samples/sec: 7297.89 - lr: 0.000050 - momentum: 0.000000
2023-10-18 22:35:56,976 ----------------------------------------------------------------------------------------------------
2023-10-18 22:35:56,976 EPOCH 1 done: loss 0.7706 - lr: 0.000050
2023-10-18 22:35:58,210 DEV : loss 0.2561441659927368 - f1-score (micro avg) 0.1895
2023-10-18 22:35:58,224 saving best model
2023-10-18 22:35:58,255 ----------------------------------------------------------------------------------------------------
2023-10-18 22:36:00,645 epoch 2 - iter 144/1445 - loss 0.20788149 - time (sec): 2.39 - samples/sec: 7083.52 - lr: 0.000049 - momentum: 0.000000
2023-10-18 22:36:03,009 epoch 2 - iter 288/1445 - loss 0.21650715 - time (sec): 4.75 - samples/sec: 7262.68 - lr: 0.000049 - momentum: 0.000000
2023-10-18 22:36:05,512 epoch 2 - iter 432/1445 - loss 0.20552295 - time (sec): 7.26 - samples/sec: 7238.70 - lr: 0.000048 - momentum: 0.000000
2023-10-18 22:36:07,838 epoch 2 - iter 576/1445 - loss 0.20926347 - time (sec): 9.58 - samples/sec: 7233.85 - lr: 0.000048 - momentum: 0.000000
2023-10-18 22:36:10,169 epoch 2 - iter 720/1445 - loss 0.20584467 - time (sec): 11.91 - samples/sec: 7258.79 - lr: 0.000047 - momentum: 0.000000
2023-10-18 22:36:12,625 epoch 2 - iter 864/1445 - loss 0.20048058 - time (sec): 14.37 - samples/sec: 7265.04 - lr: 0.000047 - momentum: 0.000000
2023-10-18 22:36:15,104 epoch 2 - iter 1008/1445 - loss 0.20118581 - time (sec): 16.85 - samples/sec: 7250.86 - lr: 0.000046 - momentum: 0.000000
2023-10-18 22:36:17,549 epoch 2 - iter 1152/1445 - loss 0.19964640 - time (sec): 19.29 - samples/sec: 7315.99 - lr: 0.000046 - momentum: 0.000000
2023-10-18 22:36:19,981 epoch 2 - iter 1296/1445 - loss 0.19527663 - time (sec): 21.73 - samples/sec: 7309.63 - lr: 0.000045 - momentum: 0.000000
2023-10-18 22:36:22,379 epoch 2 - iter 1440/1445 - loss 0.19935605 - time (sec): 24.12 - samples/sec: 7281.82 - lr: 0.000044 - momentum: 0.000000
2023-10-18 22:36:22,457 ----------------------------------------------------------------------------------------------------
2023-10-18 22:36:22,458 EPOCH 2 done: loss 0.1995 - lr: 0.000044
2023-10-18 22:36:24,553 DEV : loss 0.2152344286441803 - f1-score (micro avg) 0.3519
2023-10-18 22:36:24,567 saving best model
2023-10-18 22:36:24,602 ----------------------------------------------------------------------------------------------------
2023-10-18 22:36:26,941 epoch 3 - iter 144/1445 - loss 0.18735705 - time (sec): 2.34 - samples/sec: 7338.15 - lr: 0.000044 - momentum: 0.000000
2023-10-18 22:36:29,358 epoch 3 - iter 288/1445 - loss 0.16728570 - time (sec): 4.76 - samples/sec: 7410.31 - lr: 0.000043 - momentum: 0.000000
2023-10-18 22:36:31,777 epoch 3 - iter 432/1445 - loss 0.16394077 - time (sec): 7.17 - samples/sec: 7362.02 - lr: 0.000043 - momentum: 0.000000
2023-10-18 22:36:34,039 epoch 3 - iter 576/1445 - loss 0.16749371 - time (sec): 9.44 - samples/sec: 7493.93 - lr: 0.000042 - momentum: 0.000000
2023-10-18 22:36:36,322 epoch 3 - iter 720/1445 - loss 0.16886016 - time (sec): 11.72 - samples/sec: 7376.05 - lr: 0.000042 - momentum: 0.000000
2023-10-18 22:36:38,727 epoch 3 - iter 864/1445 - loss 0.16886698 - time (sec): 14.12 - samples/sec: 7388.63 - lr: 0.000041 - momentum: 0.000000
2023-10-18 22:36:41,158 epoch 3 - iter 1008/1445 - loss 0.16881585 - time (sec): 16.55 - samples/sec: 7410.17 - lr: 0.000041 - momentum: 0.000000
2023-10-18 22:36:43,478 epoch 3 - iter 1152/1445 - loss 0.16991782 - time (sec): 18.87 - samples/sec: 7372.83 - lr: 0.000040 - momentum: 0.000000
2023-10-18 22:36:45,934 epoch 3 - iter 1296/1445 - loss 0.17156863 - time (sec): 21.33 - samples/sec: 7402.98 - lr: 0.000039 - momentum: 0.000000
2023-10-18 22:36:48,481 epoch 3 - iter 1440/1445 - loss 0.16631620 - time (sec): 23.88 - samples/sec: 7358.61 - lr: 0.000039 - momentum: 0.000000
2023-10-18 22:36:48,556 ----------------------------------------------------------------------------------------------------
2023-10-18 22:36:48,557 EPOCH 3 done: loss 0.1663 - lr: 0.000039
2023-10-18 22:36:50,325 DEV : loss 0.2009856253862381 - f1-score (micro avg) 0.4532
2023-10-18 22:36:50,339 saving best model
2023-10-18 22:36:50,375 ----------------------------------------------------------------------------------------------------
2023-10-18 22:36:52,749 epoch 4 - iter 144/1445 - loss 0.15362789 - time (sec): 2.37 - samples/sec: 7481.75 - lr: 0.000038 - momentum: 0.000000
2023-10-18 22:36:55,135 epoch 4 - iter 288/1445 - loss 0.15463328 - time (sec): 4.76 - samples/sec: 7160.04 - lr: 0.000038 - momentum: 0.000000
2023-10-18 22:36:57,639 epoch 4 - iter 432/1445 - loss 0.15709099 - time (sec): 7.26 - samples/sec: 7172.16 - lr: 0.000037 - momentum: 0.000000
2023-10-18 22:37:00,051 epoch 4 - iter 576/1445 - loss 0.14884886 - time (sec): 9.68 - samples/sec: 7236.95 - lr: 0.000037 - momentum: 0.000000
2023-10-18 22:37:02,563 epoch 4 - iter 720/1445 - loss 0.14765128 - time (sec): 12.19 - samples/sec: 7264.09 - lr: 0.000036 - momentum: 0.000000
2023-10-18 22:37:04,985 epoch 4 - iter 864/1445 - loss 0.14686407 - time (sec): 14.61 - samples/sec: 7275.14 - lr: 0.000036 - momentum: 0.000000
2023-10-18 22:37:07,366 epoch 4 - iter 1008/1445 - loss 0.14711603 - time (sec): 16.99 - samples/sec: 7239.14 - lr: 0.000035 - momentum: 0.000000
2023-10-18 22:37:09,825 epoch 4 - iter 1152/1445 - loss 0.14929145 - time (sec): 19.45 - samples/sec: 7229.84 - lr: 0.000034 - momentum: 0.000000
2023-10-18 22:37:12,202 epoch 4 - iter 1296/1445 - loss 0.14891095 - time (sec): 21.83 - samples/sec: 7256.22 - lr: 0.000034 - momentum: 0.000000
2023-10-18 22:37:14,697 epoch 4 - iter 1440/1445 - loss 0.14884730 - time (sec): 24.32 - samples/sec: 7222.75 - lr: 0.000033 - momentum: 0.000000
2023-10-18 22:37:14,778 ----------------------------------------------------------------------------------------------------
2023-10-18 22:37:14,779 EPOCH 4 done: loss 0.1487 - lr: 0.000033
2023-10-18 22:37:16,559 DEV : loss 0.18706543743610382 - f1-score (micro avg) 0.5193
2023-10-18 22:37:16,573 saving best model
2023-10-18 22:37:16,608 ----------------------------------------------------------------------------------------------------
2023-10-18 22:37:18,997 epoch 5 - iter 144/1445 - loss 0.13686569 - time (sec): 2.39 - samples/sec: 7121.40 - lr: 0.000033 - momentum: 0.000000
2023-10-18 22:37:21,388 epoch 5 - iter 288/1445 - loss 0.12889149 - time (sec): 4.78 - samples/sec: 7193.65 - lr: 0.000032 - momentum: 0.000000
2023-10-18 22:37:23,809 epoch 5 - iter 432/1445 - loss 0.13048179 - time (sec): 7.20 - samples/sec: 7178.96 - lr: 0.000032 - momentum: 0.000000
2023-10-18 22:37:26,189 epoch 5 - iter 576/1445 - loss 0.13492520 - time (sec): 9.58 - samples/sec: 7161.88 - lr: 0.000031 - momentum: 0.000000
2023-10-18 22:37:28,631 epoch 5 - iter 720/1445 - loss 0.13512870 - time (sec): 12.02 - samples/sec: 7268.82 - lr: 0.000031 - momentum: 0.000000
2023-10-18 22:37:31,049 epoch 5 - iter 864/1445 - loss 0.13317304 - time (sec): 14.44 - samples/sec: 7327.92 - lr: 0.000030 - momentum: 0.000000
2023-10-18 22:37:33,468 epoch 5 - iter 1008/1445 - loss 0.13354051 - time (sec): 16.86 - samples/sec: 7313.61 - lr: 0.000029 - momentum: 0.000000
2023-10-18 22:37:35,963 epoch 5 - iter 1152/1445 - loss 0.13644795 - time (sec): 19.35 - samples/sec: 7314.08 - lr: 0.000029 - momentum: 0.000000
2023-10-18 22:37:38,334 epoch 5 - iter 1296/1445 - loss 0.13740177 - time (sec): 21.72 - samples/sec: 7269.49 - lr: 0.000028 - momentum: 0.000000
2023-10-18 22:37:40,700 epoch 5 - iter 1440/1445 - loss 0.13509823 - time (sec): 24.09 - samples/sec: 7284.09 - lr: 0.000028 - momentum: 0.000000
2023-10-18 22:37:40,787 ----------------------------------------------------------------------------------------------------
2023-10-18 22:37:40,788 EPOCH 5 done: loss 0.1353 - lr: 0.000028
2023-10-18 22:37:42,905 DEV : loss 0.19456517696380615 - f1-score (micro avg) 0.5458
2023-10-18 22:37:42,919 saving best model
2023-10-18 22:37:42,954 ----------------------------------------------------------------------------------------------------
2023-10-18 22:37:45,320 epoch 6 - iter 144/1445 - loss 0.11571536 - time (sec): 2.37 - samples/sec: 7102.98 - lr: 0.000027 - momentum: 0.000000
2023-10-18 22:37:47,734 epoch 6 - iter 288/1445 - loss 0.12384211 - time (sec): 4.78 - samples/sec: 7258.64 - lr: 0.000027 - momentum: 0.000000
2023-10-18 22:37:50,160 epoch 6 - iter 432/1445 - loss 0.13225920 - time (sec): 7.21 - samples/sec: 7253.70 - lr: 0.000026 - momentum: 0.000000
2023-10-18 22:37:52,606 epoch 6 - iter 576/1445 - loss 0.12932638 - time (sec): 9.65 - samples/sec: 7336.81 - lr: 0.000026 - momentum: 0.000000
2023-10-18 22:37:55,094 epoch 6 - iter 720/1445 - loss 0.13232474 - time (sec): 12.14 - samples/sec: 7421.31 - lr: 0.000025 - momentum: 0.000000
2023-10-18 22:37:57,438 epoch 6 - iter 864/1445 - loss 0.13260978 - time (sec): 14.48 - samples/sec: 7330.43 - lr: 0.000024 - momentum: 0.000000
2023-10-18 22:37:59,561 epoch 6 - iter 1008/1445 - loss 0.13037248 - time (sec): 16.61 - samples/sec: 7438.74 - lr: 0.000024 - momentum: 0.000000
2023-10-18 22:38:01,644 epoch 6 - iter 1152/1445 - loss 0.12744445 - time (sec): 18.69 - samples/sec: 7497.79 - lr: 0.000023 - momentum: 0.000000
2023-10-18 22:38:03,752 epoch 6 - iter 1296/1445 - loss 0.12733070 - time (sec): 20.80 - samples/sec: 7576.59 - lr: 0.000023 - momentum: 0.000000
2023-10-18 22:38:05,837 epoch 6 - iter 1440/1445 - loss 0.12752773 - time (sec): 22.88 - samples/sec: 7676.69 - lr: 0.000022 - momentum: 0.000000
2023-10-18 22:38:05,905 ----------------------------------------------------------------------------------------------------
2023-10-18 22:38:05,905 EPOCH 6 done: loss 0.1276 - lr: 0.000022
2023-10-18 22:38:07,688 DEV : loss 0.17982900142669678 - f1-score (micro avg) 0.5598
2023-10-18 22:38:07,703 saving best model
2023-10-18 22:38:07,740 ----------------------------------------------------------------------------------------------------
2023-10-18 22:38:09,970 epoch 7 - iter 144/1445 - loss 0.12277903 - time (sec): 2.23 - samples/sec: 8485.81 - lr: 0.000022 - momentum: 0.000000
2023-10-18 22:38:12,362 epoch 7 - iter 288/1445 - loss 0.12334114 - time (sec): 4.62 - samples/sec: 8187.85 - lr: 0.000021 - momentum: 0.000000
2023-10-18 22:38:14,733 epoch 7 - iter 432/1445 - loss 0.12246111 - time (sec): 6.99 - samples/sec: 7768.69 - lr: 0.000021 - momentum: 0.000000
2023-10-18 22:38:17,119 epoch 7 - iter 576/1445 - loss 0.11964029 - time (sec): 9.38 - samples/sec: 7702.34 - lr: 0.000020 - momentum: 0.000000
2023-10-18 22:38:19,570 epoch 7 - iter 720/1445 - loss 0.12254055 - time (sec): 11.83 - samples/sec: 7662.96 - lr: 0.000019 - momentum: 0.000000
2023-10-18 22:38:21,895 epoch 7 - iter 864/1445 - loss 0.12133982 - time (sec): 14.15 - samples/sec: 7530.25 - lr: 0.000019 - momentum: 0.000000
2023-10-18 22:38:24,278 epoch 7 - iter 1008/1445 - loss 0.12022597 - time (sec): 16.54 - samples/sec: 7530.13 - lr: 0.000018 - momentum: 0.000000
2023-10-18 22:38:26,625 epoch 7 - iter 1152/1445 - loss 0.11991213 - time (sec): 18.88 - samples/sec: 7461.88 - lr: 0.000018 - momentum: 0.000000
2023-10-18 22:38:29,015 epoch 7 - iter 1296/1445 - loss 0.12152671 - time (sec): 21.27 - samples/sec: 7436.68 - lr: 0.000017 - momentum: 0.000000
2023-10-18 22:38:31,393 epoch 7 - iter 1440/1445 - loss 0.11965156 - time (sec): 23.65 - samples/sec: 7414.95 - lr: 0.000017 - momentum: 0.000000
2023-10-18 22:38:31,479 ----------------------------------------------------------------------------------------------------
2023-10-18 22:38:31,479 EPOCH 7 done: loss 0.1195 - lr: 0.000017
2023-10-18 22:38:33,243 DEV : loss 0.17696666717529297 - f1-score (micro avg) 0.5872
2023-10-18 22:38:33,257 saving best model
2023-10-18 22:38:33,291 ----------------------------------------------------------------------------------------------------
2023-10-18 22:38:35,714 epoch 8 - iter 144/1445 - loss 0.11113135 - time (sec): 2.42 - samples/sec: 7759.48 - lr: 0.000016 - momentum: 0.000000
2023-10-18 22:38:38,069 epoch 8 - iter 288/1445 - loss 0.11518653 - time (sec): 4.78 - samples/sec: 7472.68 - lr: 0.000016 - momentum: 0.000000
2023-10-18 22:38:40,463 epoch 8 - iter 432/1445 - loss 0.11080588 - time (sec): 7.17 - samples/sec: 7551.22 - lr: 0.000015 - momentum: 0.000000
2023-10-18 22:38:42,881 epoch 8 - iter 576/1445 - loss 0.11458603 - time (sec): 9.59 - samples/sec: 7566.20 - lr: 0.000014 - momentum: 0.000000
2023-10-18 22:38:45,211 epoch 8 - iter 720/1445 - loss 0.11325887 - time (sec): 11.92 - samples/sec: 7521.84 - lr: 0.000014 - momentum: 0.000000
2023-10-18 22:38:47,586 epoch 8 - iter 864/1445 - loss 0.11535133 - time (sec): 14.29 - samples/sec: 7499.72 - lr: 0.000013 - momentum: 0.000000
2023-10-18 22:38:49,927 epoch 8 - iter 1008/1445 - loss 0.11466974 - time (sec): 16.64 - samples/sec: 7399.78 - lr: 0.000013 - momentum: 0.000000
2023-10-18 22:38:52,351 epoch 8 - iter 1152/1445 - loss 0.11643710 - time (sec): 19.06 - samples/sec: 7432.46 - lr: 0.000012 - momentum: 0.000000
2023-10-18 22:38:54,772 epoch 8 - iter 1296/1445 - loss 0.11547388 - time (sec): 21.48 - samples/sec: 7376.25 - lr: 0.000012 - momentum: 0.000000
2023-10-18 22:38:57,156 epoch 8 - iter 1440/1445 - loss 0.11315324 - time (sec): 23.86 - samples/sec: 7365.81 - lr: 0.000011 - momentum: 0.000000
2023-10-18 22:38:57,228 ----------------------------------------------------------------------------------------------------
2023-10-18 22:38:57,228 EPOCH 8 done: loss 0.1134 - lr: 0.000011
2023-10-18 22:38:59,325 DEV : loss 0.18520388007164001 - f1-score (micro avg) 0.5793
2023-10-18 22:38:59,341 ----------------------------------------------------------------------------------------------------
2023-10-18 22:39:01,877 epoch 9 - iter 144/1445 - loss 0.10887282 - time (sec): 2.54 - samples/sec: 7397.71 - lr: 0.000011 - momentum: 0.000000
2023-10-18 22:39:04,328 epoch 9 - iter 288/1445 - loss 0.11482268 - time (sec): 4.99 - samples/sec: 7375.84 - lr: 0.000010 - momentum: 0.000000
2023-10-18 22:39:06,789 epoch 9 - iter 432/1445 - loss 0.11164932 - time (sec): 7.45 - samples/sec: 7362.59 - lr: 0.000009 - momentum: 0.000000
2023-10-18 22:39:09,250 epoch 9 - iter 576/1445 - loss 0.11268148 - time (sec): 9.91 - samples/sec: 7377.27 - lr: 0.000009 - momentum: 0.000000
2023-10-18 22:39:11,620 epoch 9 - iter 720/1445 - loss 0.11314413 - time (sec): 12.28 - samples/sec: 7299.42 - lr: 0.000008 - momentum: 0.000000
2023-10-18 22:39:14,036 epoch 9 - iter 864/1445 - loss 0.11255961 - time (sec): 14.69 - samples/sec: 7329.47 - lr: 0.000008 - momentum: 0.000000
2023-10-18 22:39:16,437 epoch 9 - iter 1008/1445 - loss 0.11294305 - time (sec): 17.10 - samples/sec: 7310.68 - lr: 0.000007 - momentum: 0.000000
2023-10-18 22:39:18,815 epoch 9 - iter 1152/1445 - loss 0.11188029 - time (sec): 19.47 - samples/sec: 7268.82 - lr: 0.000007 - momentum: 0.000000
2023-10-18 22:39:21,293 epoch 9 - iter 1296/1445 - loss 0.10885370 - time (sec): 21.95 - samples/sec: 7270.80 - lr: 0.000006 - momentum: 0.000000
2023-10-18 22:39:23,515 epoch 9 - iter 1440/1445 - loss 0.10971729 - time (sec): 24.17 - samples/sec: 7267.20 - lr: 0.000006 - momentum: 0.000000
2023-10-18 22:39:23,585 ----------------------------------------------------------------------------------------------------
2023-10-18 22:39:23,586 EPOCH 9 done: loss 0.1097 - lr: 0.000006
2023-10-18 22:39:25,383 DEV : loss 0.1853523552417755 - f1-score (micro avg) 0.5984
2023-10-18 22:39:25,398 saving best model
2023-10-18 22:39:25,433 ----------------------------------------------------------------------------------------------------
2023-10-18 22:39:27,799 epoch 10 - iter 144/1445 - loss 0.09962782 - time (sec): 2.37 - samples/sec: 7376.05 - lr: 0.000005 - momentum: 0.000000
2023-10-18 22:39:30,273 epoch 10 - iter 288/1445 - loss 0.08889531 - time (sec): 4.84 - samples/sec: 7309.52 - lr: 0.000004 - momentum: 0.000000
2023-10-18 22:39:32,650 epoch 10 - iter 432/1445 - loss 0.09811058 - time (sec): 7.22 - samples/sec: 7371.91 - lr: 0.000004 - momentum: 0.000000
2023-10-18 22:39:35,027 epoch 10 - iter 576/1445 - loss 0.10124252 - time (sec): 9.59 - samples/sec: 7326.03 - lr: 0.000003 - momentum: 0.000000
2023-10-18 22:39:37,386 epoch 10 - iter 720/1445 - loss 0.10043443 - time (sec): 11.95 - samples/sec: 7433.53 - lr: 0.000003 - momentum: 0.000000
2023-10-18 22:39:39,866 epoch 10 - iter 864/1445 - loss 0.10093884 - time (sec): 14.43 - samples/sec: 7359.99 - lr: 0.000002 - momentum: 0.000000
2023-10-18 22:39:42,282 epoch 10 - iter 1008/1445 - loss 0.10082663 - time (sec): 16.85 - samples/sec: 7272.67 - lr: 0.000002 - momentum: 0.000000
2023-10-18 22:39:44,701 epoch 10 - iter 1152/1445 - loss 0.10177923 - time (sec): 19.27 - samples/sec: 7310.18 - lr: 0.000001 - momentum: 0.000000
2023-10-18 22:39:47,102 epoch 10 - iter 1296/1445 - loss 0.10379308 - time (sec): 21.67 - samples/sec: 7271.08 - lr: 0.000001 - momentum: 0.000000
2023-10-18 22:39:49,469 epoch 10 - iter 1440/1445 - loss 0.10584891 - time (sec): 24.04 - samples/sec: 7302.09 - lr: 0.000000 - momentum: 0.000000
2023-10-18 22:39:49,547 ----------------------------------------------------------------------------------------------------
2023-10-18 22:39:49,547 EPOCH 10 done: loss 0.1055 - lr: 0.000000
2023-10-18 22:39:51,338 DEV : loss 0.18715591728687286 - f1-score (micro avg) 0.5955
2023-10-18 22:39:51,383 ----------------------------------------------------------------------------------------------------
2023-10-18 22:39:51,383 Loading model from best epoch ...
2023-10-18 22:39:51,466 SequenceTagger predicts: Dictionary with 13 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG
2023-10-18 22:39:52,786
Results:
- F-score (micro) 0.6129
- F-score (macro) 0.4332
- Accuracy 0.4549

By class:
              precision    recall  f1-score   support

         LOC     0.6827    0.7140    0.6980       458
         PER     0.6284    0.5332    0.5769       482
         ORG     0.0833    0.0145    0.0247        69

   micro avg     0.6500    0.5798    0.6129      1009
   macro avg     0.4648    0.4206    0.4332      1009
weighted avg     0.6157    0.5798    0.5941      1009
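As a quick sanity check on the table, the reported micro-average F-score is simply the harmonic mean of the micro-average precision and recall; this is plain arithmetic on the numbers logged above, not an additional evaluation:

    # Harmonic mean of micro-average precision and recall from the table above.
    precision, recall = 0.6500, 0.5798
    f1 = 2 * precision * recall / (precision + recall)
    print(f"{f1:.4f}")  # -> 0.6129, matching "F-score (micro)"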
2023-10-18 22:39:52,786 ----------------------------------------------------------------------------------------------------
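The checkpoint written at each "saving best model" step can be loaded back for inference with Flair's standard API. A minimal sketch; the path is the training base path logged above with best-model.pt appended, and the example sentence is made up for illustration:

    from flair.data import Sentence
    from flair.models import SequenceTagger

    # Load the best checkpoint selected on the dev set during training.
    tagger = SequenceTagger.load(
        "hmbench-icdar/nl-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3/best-model.pt"
    )

    # Hypothetical Dutch sentence, purely for illustration.
    sentence = Sentence("Vincent van Gogh werd geboren in Zundert.")
    tagger.predict(sentence)

    # Print the predicted LOC/PER/ORG spans with their confidence scores.
    for span in sentence.get_spans("ner"):
        print(span.text, span.get_label("ner").value, span.get_label("ner").score)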