---
license: apache-2.0
tags:
- generated_from_trainer
base_model: google-bert/bert-base-uncased
metrics:
- accuracy
- precision
- recall
model-index:
- name: case-analysis-bert-base-uncased
  results: []
---

# case-analysis-bert-base-uncased

This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8540
- Accuracy: 0.8129
- Precision: 0.7975
- Recall: 0.8129
- Precision Macro: 0.6427
- Recall Macro: 0.6184
- Macro FPR: 0.0946
- Weighted FPR: 0.0712
- Weighted Specificity: 0.8449
- Macro Specificity: 0.9145
- Weighted Sensitivity: 0.8129
- Macro Sensitivity: 0.6184
- F1 Micro: 0.8129
- F1 Macro: 0.6284
- F1 Weighted: 0.8035
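
The dataset and label set are undocumented (see below), so the sketch that follows is illustrative only. It shows how a checkpoint like this one would typically be loaded for inference with the `transformers` pipeline API; the repository id is a placeholder, and the emitted labels depend on the fine-tuning data.

```python
# Minimal inference sketch, assuming the checkpoint is hosted on the Hub.
# The repo id below is a placeholder - substitute the real one.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="<namespace>/case-analysis-bert-base-uncased",  # hypothetical repo id
)

# Toy input; the actual domain and label names depend on the training data.
result = classifier("The appellant contends that the trial court erred in admitting the evidence.")
print(result)  # e.g. [{'label': 'LABEL_0', 'score': 0.97}]
```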

## Model description

This checkpoint is [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) fine-tuned for what appears to be a multi-class case-analysis text-classification task; the dataset and label set were not recorded by the Trainer. The gap between weighted and macro scores on the evaluation set (F1 weighted 0.8035 vs. F1 macro 0.6284) suggests an imbalanced label distribution, with weaker performance on minority classes.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
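
For reproducibility, the hyperparameters above map onto `TrainingArguments` roughly as in the sketch below. This is a reconstruction, not the author's actual script: data loading, tokenization, and the `Trainer` setup are omitted, and the per-epoch evaluation strategy is inferred from the results table.

```python
# Hypothetical reconstruction of the training configuration above
# (Transformers 4.40.x). Only the hyperparameters listed in this card
# are grounded; everything else is an assumption.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="case-analysis-bert-base-uncased",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,                    # "Native AMP" mixed-precision training
    adam_beta1=0.9,               # Adam betas/epsilon as listed (library defaults)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed: the table reports one eval per epoch
)
```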

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro FPR | Weighted FPR | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| No log        | 1.0   | 224  | 0.7283          | 0.7862   | 0.7487    | 0.7862 | 0.5848          | 0.5572       | 0.1142    | 0.0831       | 0.8036               | 0.8974            | 0.7862               | 0.5572            | 0.7862   | 0.5606   | 0.7597      |
| No log        | 2.0   | 448  | 0.8160          | 0.7996   | 0.7603    | 0.7996 | 0.5770          | 0.6065       | 0.0997    | 0.0771       | 0.8417               | 0.9103            | 0.7996               | 0.6065            | 0.7996   | 0.5914   | 0.7794      |
| 0.6512        | 3.0   | 672  | 0.8588          | 0.7906   | 0.7598    | 0.7906 | 0.5770          | 0.5989       | 0.1005    | 0.0811       | 0.8512               | 0.9105            | 0.7906               | 0.5989            | 0.7906   | 0.5840   | 0.7720      |
| 0.6512        | 4.0   | 896  | 1.0821          | 0.7817   | 0.7819    | 0.7817 | 0.6214          | 0.6429       | 0.0996    | 0.0851       | 0.8679               | 0.9124            | 0.7817               | 0.6429            | 0.7817   | 0.6299   | 0.7805      |
| 0.3466        | 5.0   | 1120 | 1.0612          | 0.8085   | 0.7999    | 0.8085 | 0.7129          | 0.6263       | 0.0948    | 0.0732       | 0.8470               | 0.9139            | 0.8085               | 0.6263            | 0.8085   | 0.6195   | 0.7928      |
| 0.3466        | 6.0   | 1344 | 1.2559          | 0.7929   | 0.7877    | 0.7929 | 0.6206          | 0.6362       | 0.0951    | 0.0801       | 0.8717               | 0.9161            | 0.7929               | 0.6362            | 0.7929   | 0.6273   | 0.7897      |
| 0.1715        | 7.0   | 1568 | 1.3701          | 0.7929   | 0.7889    | 0.7929 | 0.6345          | 0.6179       | 0.0991    | 0.0801       | 0.8558               | 0.9122            | 0.7929               | 0.6179            | 0.7929   | 0.6237   | 0.7893      |
| 0.1715        | 8.0   | 1792 | 1.4005          | 0.8107   | 0.8035    | 0.8107 | 0.6578          | 0.6370       | 0.0922    | 0.0722       | 0.8607               | 0.9179            | 0.8107               | 0.6370            | 0.8107   | 0.6464   | 0.8064      |
| 0.0636        | 9.0   | 2016 | 1.4737          | 0.8018   | 0.7881    | 0.8018 | 0.6583          | 0.6149       | 0.1026    | 0.0761       | 0.8271               | 0.9072            | 0.8018               | 0.6149            | 0.8018   | 0.6263   | 0.7896      |
| 0.0636        | 10.0  | 2240 | 1.7569          | 0.7884   | 0.7962    | 0.7884 | 0.6275          | 0.6428       | 0.0960    | 0.0821       | 0.8750               | 0.9158            | 0.7884               | 0.6428            | 0.7884   | 0.6332   | 0.7909      |
| 0.0636        | 11.0  | 2464 | 1.7141          | 0.7906   | 0.7824    | 0.7906 | 0.6166          | 0.6083       | 0.1035    | 0.0811       | 0.8424               | 0.9083            | 0.7906               | 0.6083            | 0.7906   | 0.6101   | 0.7845      |
| 0.0159        | 12.0  | 2688 | 1.7144          | 0.7951   | 0.7914    | 0.7951 | 0.6393          | 0.6413       | 0.0969    | 0.0791       | 0.8610               | 0.9140            | 0.7951               | 0.6413            | 0.7951   | 0.6373   | 0.7917      |
| 0.0159        | 13.0  | 2912 | 1.7243          | 0.7996   | 0.7969    | 0.7996 | 0.6535          | 0.6526       | 0.0942    | 0.0771       | 0.8638               | 0.9158            | 0.7996               | 0.6526            | 0.7996   | 0.6529   | 0.7982      |
| 0.0043        | 14.0  | 3136 | 1.8551          | 0.7973   | 0.7948    | 0.7973 | 0.6576          | 0.6189       | 0.1041    | 0.0781       | 0.8314               | 0.9072            | 0.7973               | 0.6189            | 0.7973   | 0.6335   | 0.7912      |
| 0.0043        | 15.0  | 3360 | 1.8841          | 0.7929   | 0.7869    | 0.7929 | 0.6154          | 0.6162       | 0.1008    | 0.0801       | 0.8511               | 0.9110            | 0.7929               | 0.6162            | 0.7929   | 0.6104   | 0.7861      |
| 0.0029        | 16.0  | 3584 | 2.0853          | 0.7550   | 0.7837    | 0.7550 | 0.6010          | 0.6119       | 0.1100    | 0.0976       | 0.8698               | 0.9062            | 0.7550               | 0.6119            | 0.7550   | 0.6015   | 0.7661      |
| 0.0029        | 17.0  | 3808 | 1.9722          | 0.7840   | 0.7783    | 0.7840 | 0.6018          | 0.5839       | 0.1076    | 0.0841       | 0.8394               | 0.9059            | 0.7840               | 0.5839            | 0.7840   | 0.5917   | 0.7797      |
| 0.0071        | 18.0  | 4032 | 1.8735          | 0.7996   | 0.7783    | 0.7996 | 0.6086          | 0.5917       | 0.1053    | 0.0771       | 0.8193               | 0.9047            | 0.7996               | 0.5917            | 0.7996   | 0.5960   | 0.7840      |
| 0.0071        | 19.0  | 4256 | 1.8294          | 0.8018   | 0.7840    | 0.8018 | 0.6114          | 0.5943       | 0.1025    | 0.0761       | 0.8308               | 0.9082            | 0.8018               | 0.5943            | 0.8018   | 0.6001   | 0.7895      |
| 0.0071        | 20.0  | 4480 | 1.8578          | 0.7973   | 0.7939    | 0.7973 | 0.6367          | 0.6232       | 0.0990    | 0.0781       | 0.8497               | 0.9118            | 0.7973               | 0.6232            | 0.7973   | 0.6285   | 0.7942      |
| 0.0049        | 21.0  | 4704 | 1.8770          | 0.7973   | 0.7939    | 0.7973 | 0.6367          | 0.6232       | 0.0990    | 0.0781       | 0.8497               | 0.9118            | 0.7973               | 0.6232            | 0.7973   | 0.6285   | 0.7942      |
| 0.0049        | 22.0  | 4928 | 1.8932          | 0.7951   | 0.7876    | 0.7951 | 0.6219          | 0.6119       | 0.1007    | 0.0791       | 0.8461               | 0.9103            | 0.7951               | 0.6119            | 0.7951   | 0.6155   | 0.7900      |
| 0.0015        | 23.0  | 5152 | 1.9834          | 0.7996   | 0.7965    | 0.7996 | 0.6441          | 0.6389       | 0.0960    | 0.0771       | 0.8599               | 0.9149            | 0.7996               | 0.6389            | 0.7996   | 0.6403   | 0.7971      |
| 0.0015        | 24.0  | 5376 | 1.9926          | 0.8018   | 0.7984    | 0.8018 | 0.6468          | 0.6399       | 0.0952    | 0.0761       | 0.8603               | 0.9155            | 0.8018               | 0.6399            | 0.8018   | 0.6422   | 0.7991      |
| 0.0001        | 25.0  | 5600 | 1.9771          | 0.7973   | 0.7790    | 0.7973 | 0.6025          | 0.6024       | 0.1011    | 0.0781       | 0.8420               | 0.9098            | 0.7973               | 0.6024            | 0.7973   | 0.6017   | 0.7871      |
| 0.0001        | 26.0  | 5824 | 1.9871          | 0.7951   | 0.7770    | 0.7951 | 0.5996          | 0.6015       | 0.1020    | 0.0791       | 0.8416               | 0.9092            | 0.7951               | 0.6015            | 0.7951   | 0.5997   | 0.7850      |
| 0.0           | 27.0  | 6048 | 1.8756          | 0.8129   | 0.7961    | 0.8129 | 0.6440          | 0.6200       | 0.0939    | 0.0712       | 0.8462               | 0.9148            | 0.8129               | 0.6200            | 0.8129   | 0.6293   | 0.8029      |
| 0.0           | 28.0  | 6272 | 1.8473          | 0.8151   | 0.7998    | 0.8151 | 0.6463          | 0.6194       | 0.0937    | 0.0703       | 0.8453               | 0.9151            | 0.8151               | 0.6194            | 0.8151   | 0.6305   | 0.8056      |
| 0.0           | 29.0  | 6496 | 1.8525          | 0.8129   | 0.7975    | 0.8129 | 0.6427          | 0.6184       | 0.0946    | 0.0712       | 0.8449               | 0.9145            | 0.8129               | 0.6184            | 0.8129   | 0.6284   | 0.8035      |
| 0.0001        | 30.0  | 6720 | 1.8540          | 0.8129   | 0.7975    | 0.8129 | 0.6427          | 0.6184       | 0.0946    | 0.0712       | 0.8449               | 0.9145            | 0.8129               | 0.6184            | 0.8129   | 0.6284   | 0.8035      |
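
The headline metrics at the top of this card correspond to the final epoch-30 checkpoint; the epoch-28 checkpoint scored marginally higher accuracy (0.8151 vs. 0.8129). The table also reports the less common macro FPR and macro specificity aggregates. The snippet below illustrates one standard way to derive them from a confusion matrix; it is an assumption about how these metrics are defined here, not the card's actual evaluation code.

```python
# Illustrative macro FPR / macro specificity from a multi-class
# confusion matrix (toy data; not this card's evaluation script).
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = [0, 1, 2, 2, 1, 0]
y_pred = [0, 2, 2, 2, 1, 0]

cm = confusion_matrix(y_true, y_pred)
tp = np.diag(cm)
fp = cm.sum(axis=0) - tp          # predicted as class c but wrong
fn = cm.sum(axis=1) - tp          # instances of class c that were missed
tn = cm.sum() - (tp + fp + fn)

fpr = fp / (fp + tn)              # per-class false positive rate
specificity = tn / (tn + fp)      # per-class specificity (1 - FPR)

print("macro FPR:        ", fpr.mean())
print("macro specificity:", specificity.mean())
```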


### Framework versions

- Transformers 4.40.1
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1