dile3 committed
Commit 805802e · verified · 1 Parent(s): f3fdc5c

dile3/biobert-ner-ncbi

Files changed (3):
  1. README.md +15 -14
  2. model.safetensors +1 -1
  3. training_args.bin +1 -1
README.md CHANGED
@@ -15,15 +15,15 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [dmis-lab/biobert-base-cased-v1.1](https://huggingface.co/dmis-lab/biobert-base-cased-v1.1) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.0313
+ - Loss: 0.0386
  - Compositemention: {'precision': 0.8, 'recall': 0.9142857142857143, 'f1': 0.8533333333333333, 'number': 35}
- - Diseaseclass: {'precision': 0.6180555555555556, 'recall': 0.7063492063492064, 'f1': 0.6592592592592594, 'number': 126}
- - Modifier: {'precision': 0.7419354838709677, 'recall': 0.8598130841121495, 'f1': 0.7965367965367967, 'number': 214}
- - Specificdisease: {'precision': 0.8409638554216867, 'recall': 0.8470873786407767, 'f1': 0.8440145102781136, 'number': 412}
- - Overall Precision: 0.7721
- - Overall Recall: 0.8310
- - Overall F1: 0.8005
- - Overall Accuracy: 0.9945
+ - Diseaseclass: {'precision': 0.5341614906832298, 'recall': 0.6825396825396826, 'f1': 0.5993031358885017, 'number': 126}
+ - Modifier: {'precision': 0.7014925373134329, 'recall': 0.8785046728971962, 'f1': 0.7800829875518672, 'number': 214}
+ - Specificdisease: {'precision': 0.8254716981132075, 'recall': 0.8495145631067961, 'f1': 0.8373205741626795, 'number': 412}
+ - Overall Precision: 0.7346
+ - Overall Recall: 0.8335
+ - Overall F1: 0.7810
+ - Overall Accuracy: 0.9934
 
  ## Model description
 
@@ -48,17 +48,18 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
  - lr_scheduler_type: linear
- - num_epochs: 5
+ - num_epochs: 20
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss | Compositemention | Diseaseclass | Modifier | Specificdisease | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
  |:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 0.009 | 1.0 | 359 | 0.0256 | {'precision': 0.7142857142857143, 'recall': 0.8571428571428571, 'f1': 0.7792207792207793, 'number': 35} | {'precision': 0.5939849624060151, 'recall': 0.626984126984127, 'f1': 0.61003861003861, 'number': 126} | {'precision': 0.7364016736401674, 'recall': 0.822429906542056, 'f1': 0.7770419426048566, 'number': 214} | {'precision': 0.7966507177033493, 'recall': 0.808252427184466, 'f1': 0.8024096385542169, 'number': 412} | 0.7428 | 0.7853 | 0.7634 | 0.9939 |
- | 0.0058 | 2.0 | 718 | 0.0264 | {'precision': 0.717948717948718, 'recall': 0.8, 'f1': 0.7567567567567569, 'number': 35} | {'precision': 0.5460122699386503, 'recall': 0.7063492063492064, 'f1': 0.6159169550173009, 'number': 126} | {'precision': 0.7627118644067796, 'recall': 0.8411214953271028, 'f1': 0.7999999999999999, 'number': 214} | {'precision': 0.8114558472553699, 'recall': 0.8252427184466019, 'f1': 0.8182912154031287, 'number': 412} | 0.7433 | 0.8094 | 0.7749 | 0.9939 |
- | 0.0033 | 3.0 | 1077 | 0.0291 | {'precision': 0.7560975609756098, 'recall': 0.8857142857142857, 'f1': 0.8157894736842105, 'number': 35} | {'precision': 0.5894039735099338, 'recall': 0.7063492063492064, 'f1': 0.6425992779783394, 'number': 126} | {'precision': 0.7835497835497836, 'recall': 0.8457943925233645, 'f1': 0.8134831460674157, 'number': 214} | {'precision': 0.8385542168674699, 'recall': 0.8446601941747572, 'f1': 0.8415961305925029, 'number': 412} | 0.7745 | 0.8247 | 0.7988 | 0.9943 |
- | 0.0018 | 4.0 | 1436 | 0.0315 | {'precision': 0.7804878048780488, 'recall': 0.9142857142857143, 'f1': 0.8421052631578947, 'number': 35} | {'precision': 0.647887323943662, 'recall': 0.7301587301587301, 'f1': 0.6865671641791045, 'number': 126} | {'precision': 0.7695473251028807, 'recall': 0.8738317757009346, 'f1': 0.8183807439824945, 'number': 214} | {'precision': 0.8459657701711492, 'recall': 0.8398058252427184, 'f1': 0.8428745432399514, 'number': 412} | 0.7868 | 0.8348 | 0.8101 | 0.9946 |
- | 0.0012 | 5.0 | 1795 | 0.0313 | {'precision': 0.8, 'recall': 0.9142857142857143, 'f1': 0.8533333333333333, 'number': 35} | {'precision': 0.6180555555555556, 'recall': 0.7063492063492064, 'f1': 0.6592592592592594, 'number': 126} | {'precision': 0.7419354838709677, 'recall': 0.8598130841121495, 'f1': 0.7965367965367967, 'number': 214} | {'precision': 0.8409638554216867, 'recall': 0.8470873786407767, 'f1': 0.8440145102781136, 'number': 412} | 0.7721 | 0.8310 | 0.8005 | 0.9945 |
+ | 0.005 | 1.0 | 359 | 0.0271 | {'precision': 0.725, 'recall': 0.8285714285714286, 'f1': 0.7733333333333333, 'number': 35} | {'precision': 0.5263157894736842, 'recall': 0.7142857142857143, 'f1': 0.6060606060606061, 'number': 126} | {'precision': 0.7198443579766537, 'recall': 0.8644859813084113, 'f1': 0.7855626326963906, 'number': 214} | {'precision': 0.8103448275862069, 'recall': 0.7985436893203883, 'f1': 0.80440097799511, 'number': 412} | 0.7243 | 0.8043 | 0.7622 | 0.9939 |
+ | 0.0049 | 2.0 | 718 | 0.0277 | {'precision': 0.775, 'recall': 0.8857142857142857, 'f1': 0.8266666666666667, 'number': 35} | {'precision': 0.5891472868217055, 'recall': 0.6031746031746031, 'f1': 0.596078431372549, 'number': 126} | {'precision': 0.7195121951219512, 'recall': 0.8271028037383178, 'f1': 0.7695652173913042, 'number': 214} | {'precision': 0.7707423580786026, 'recall': 0.8567961165048543, 'f1': 0.8114942528735631, 'number': 412} | 0.7297 | 0.8094 | 0.7675 | 0.9934 |
+ | 0.0031 | 3.0 | 1077 | 0.0330 | {'precision': 0.7142857142857143, 'recall': 0.8571428571428571, 'f1': 0.7792207792207793, 'number': 35} | {'precision': 0.4939759036144578, 'recall': 0.6507936507936508, 'f1': 0.5616438356164383, 'number': 126} | {'precision': 0.7368421052631579, 'recall': 0.8504672897196262, 'f1': 0.789587852494577, 'number': 214} | {'precision': 0.8076923076923077, 'recall': 0.866504854368932, 'f1': 0.8360655737704917, 'number': 412} | 0.7258 | 0.8272 | 0.7732 | 0.9935 |
+ | 0.0008 | 4.0 | 1436 | 0.0324 | {'precision': 0.7567567567567568, 'recall': 0.8, 'f1': 0.7777777777777778, 'number': 35} | {'precision': 0.6014492753623188, 'recall': 0.6587301587301587, 'f1': 0.6287878787878789, 'number': 126} | {'precision': 0.746938775510204, 'recall': 0.8551401869158879, 'f1': 0.7973856209150327, 'number': 214} | {'precision': 0.8389423076923077, 'recall': 0.8470873786407767, 'f1': 0.8429951690821257, 'number': 412} | 0.7691 | 0.8170 | 0.7924 | 0.9940 |
+ | 0.0019 | 5.0 | 1795 | 0.0314 | {'precision': 0.7804878048780488, 'recall': 0.9142857142857143, 'f1': 0.8421052631578947, 'number': 35} | {'precision': 0.6356589147286822, 'recall': 0.6507936507936508, 'f1': 0.6431372549019608, 'number': 126} | {'precision': 0.7615062761506276, 'recall': 0.8504672897196262, 'f1': 0.8035320088300221, 'number': 214} | {'precision': 0.8148148148148148, 'recall': 0.8543689320388349, 'f1': 0.8341232227488151, 'number': 412} | 0.7705 | 0.8234 | 0.7961 | 0.9939 |
+ | 0.0017 | 6.0 | 2154 | 0.0386 | {'precision': 0.8, 'recall': 0.9142857142857143, 'f1': 0.8533333333333333, 'number': 35} | {'precision': 0.5341614906832298, 'recall': 0.6825396825396826, 'f1': 0.5993031358885017, 'number': 126} | {'precision': 0.7014925373134329, 'recall': 0.8785046728971962, 'f1': 0.7800829875518672, 'number': 214} | {'precision': 0.8254716981132075, 'recall': 0.8495145631067961, 'f1': 0.8373205741626795, 'number': 412} | 0.7346 | 0.8335 | 0.7810 | 0.9934 |
 
 
  ### Framework versions
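
The per-entity dicts and the "Overall *" columns in the table above have the output shape of the seqeval metric as exposed through the `evaluate` library. A minimal sketch of how numbers in that shape are produced; the IOB2 label strings (e.g. `B-Specificdisease`) are assumptions for illustration, not taken from the repo:

```python
# Minimal sketch, not the author's evaluation code: reproduce the metric shape
# ({'precision', 'recall', 'f1', 'number'} per entity type plus overall_* keys)
# reported in the model card. Label strings below are assumed.
import evaluate

seqeval = evaluate.load("seqeval")  # requires the `seqeval` package

# Toy IOB2-tagged sequences; the card's numbers come from the real evaluation split.
references  = [["O", "B-Specificdisease", "I-Specificdisease", "O", "B-Modifier"]]
predictions = [["O", "B-Specificdisease", "O",                 "O", "B-Modifier"]]

results = seqeval.compute(predictions=predictions, references=references)
# `results` holds one dict per entity type plus overall_precision, overall_recall,
# overall_f1 and overall_accuracy, matching the columns in the training-results table.
print(results)
```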
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:4196bc272f5ed7f137471564300fb9ce881c3eba5f290ceb764e900b381b1889
+ oid sha256:82d9da89bff069b6a0418e5e0df8233fe3833053f7b7308b37cd31d8a4c3aa91
  size 430929740
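
The `model.safetensors` entry is a Git LFS pointer: the weights are addressed by the SHA-256 digest in `oid` and the byte `size`. A minimal sketch of checking a downloaded file against the new pointer (the local path is an assumption):

```python
# Minimal sketch: verify that a downloaded model.safetensors matches the LFS pointer's oid.
import hashlib

def file_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large checkpoints need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

expected = "82d9da89bff069b6a0418e5e0df8233fe3833053f7b7308b37cd31d8a4c3aa91"  # oid from the pointer above
print(file_sha256("model.safetensors") == expected)
```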
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:f800bae0b44cafbf052577115bca94e380b4389343014ef3e981ce40451d4c27
+ oid sha256:8606adc4ad54e20931bdbf03a89c6dd909f34ef6d6447a790a87c6387ec58ba3
  size 5240
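
For completeness, a minimal inference sketch for the updated checkpoint using the standard `transformers` token-classification pipeline. Pinning `revision="805802e"` should fetch the weights introduced by this commit; the pin and the example sentence are illustrative assumptions, not part of the model card:

```python
# Minimal usage sketch (not from the model card): run the fine-tuned NER checkpoint.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "dile3/biobert-ner-ncbi"
tokenizer = AutoTokenizer.from_pretrained(model_id, revision="805802e")
model = AutoModelForTokenClassification.from_pretrained(model_id, revision="805802e")

# aggregation_strategy="simple" merges word pieces into whole entity spans.
ner = pipeline("token-classification", model=model, tokenizer=tokenizer,
               aggregation_strategy="simple")
print(ner("Mutations in the BRCA1 gene are associated with hereditary breast cancer."))
```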