---
license: mit
tags:
- generated_from_trainer
base_model: law-ai/InLegalBERT
metrics:
- accuracy
- precision
- recall
model-index:
- name: case-analysis-InLegalBERT
  results: []
---
## Metrics

Evaluation-set metrics reported for this model (these values match the epoch-6 row of the training results table below):

- loss: 1.0434
- accuracy: 0.8218
- precision: 0.8145
- recall: 0.8218
- precision_macro: 0.6907
- recall_macro: 0.6533
- macro_fpr: 0.0897
- weighted_fpr: 0.0674
- weighted_specificity: 0.8528
- macro_specificity: 0.9187
- weighted_sensitivity: 0.8218
- macro_sensitivity: 0.6533
- f1_micro: 0.8218
- f1_macro: 0.6690
- f1_weighted: 0.8159
- runtime: 198.6459 s
- samples_per_second: 2.26
- steps_per_second: 0.287
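
Several of these (macro/weighted FPR, specificity, sensitivity) are not standard Trainer outputs; they are typically derived from the multi-class confusion matrix. A minimal sketch of how such values can be computed, assuming scikit-learn and NumPy (illustrative only, not the authors' actual evaluation code):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def fpr_and_specificity(y_true, y_pred):
    """Per-class FPR and specificity from a multi-class confusion matrix."""
    cm = confusion_matrix(y_true, y_pred)  # rows: true labels, cols: predictions
    tp = np.diag(cm)
    fp = cm.sum(axis=0) - tp               # predicted as class c, true label differs
    fn = cm.sum(axis=1) - tp               # true class c, predicted as something else
    tn = cm.sum() - (tp + fp + fn)
    fpr = fp / (fp + tn)
    specificity = tn / (tn + fp)           # equals 1 - fpr
    return fpr, specificity

# "macro" averages classes equally; "weighted" weights by class support:
# fpr, spec = fpr_and_specificity(y_true, y_pred)
# macro_fpr = fpr.mean()
# weighted_specificity = np.average(spec, weights=np.bincount(y_true))
```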



# case-analysis-InLegalBERT

This model is a fine-tuned version of [law-ai/InLegalBERT](https://huggingface.co/law-ai/InLegalBERT) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0434
- Accuracy: 0.8218
- Precision: 0.8145
- Recall: 0.8218
- Precision Macro: 0.6439
- Recall Macro: 0.6295
- Macro FPR: 0.0890
- Weighted FPR: 0.0674
- Weighted Specificity: 0.8544
- Macro Specificity: 0.9191
- Weighted Sensitivity: 0.8218
- Macro Sensitivity: 0.6295
- F1 Micro: 0.8218
- F1 Macro: 0.6335
- F1 Weighted: 0.8106
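
The checkpoint can be loaded like any 🤗 Transformers sequence-classification model. A minimal inference sketch, assuming the model is hosted under the repo id below (adjust to the actual location; the label names depend on the undocumented training label set):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical repo id; point this at wherever the checkpoint is actually hosted.
model_id = "case-analysis-InLegalBERT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "The appellant contends that the High Court erred in ..."
inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
# id2label comes from the model config; labels are whatever the training set used.
print(model.config.id2label.get(pred, pred))
```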

## Model description

This model is [law-ai/InLegalBERT](https://huggingface.co/law-ai/InLegalBERT) fine-tuned with a sequence-classification head for a legal case-analysis classification task. The underlying dataset, label set, and exact task definition have not been documented.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
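
Expressed as code, these settings correspond roughly to the `TrainingArguments` below (a sketch under the listed framework versions; `output_dir`, the evaluation strategy, and the omitted dataset/`compute_metrics` wiring are assumptions, not the authors' actual script):

```python
from transformers import Trainer, TrainingArguments

# Mirrors the hyperparameters listed above; output_dir and eval cadence are assumptions.
args = TrainingArguments(
    output_dir="case-analysis-InLegalBERT",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",    # default AdamW uses betas=(0.9, 0.999), eps=1e-8
    num_train_epochs=30,
    fp16=True,                     # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",   # evaluated each epoch, as in the results table
    logging_strategy="epoch",
)

# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...,
#                   compute_metrics=compute_metrics)
# trainer.train()
```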

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro FPR | Weighted FPR | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| No log        | 1.0   | 224  | 0.6546          | 0.8018   | 0.7632    | 0.8018 | 0.5777          | 0.6106       | 0.0978    | 0.0761       | 0.8432               | 0.9112            | 0.8018               | 0.6106            | 0.8018   | 0.5936   | 0.7820      |
| No log        | 2.0   | 448  | 0.6831          | 0.8129   | 0.7732    | 0.8129 | 0.5845          | 0.6154       | 0.0923    | 0.0712       | 0.8554               | 0.9171            | 0.8129               | 0.6154            | 0.8129   | 0.5996   | 0.7926      |
| 0.607         | 3.0   | 672  | 0.7626          | 0.8263   | 0.8060    | 0.8263 | 0.6773          | 0.6341       | 0.0885    | 0.0655       | 0.8464               | 0.9182            | 0.8263               | 0.6341            | 0.8263   | 0.6362   | 0.8105      |
| 0.607         | 4.0   | 896  | 0.7839          | 0.8085   | 0.7991    | 0.8085 | 0.6391          | 0.6306       | 0.0896    | 0.0732       | 0.8754               | 0.9210            | 0.8085               | 0.6306            | 0.8085   | 0.6314   | 0.8017      |
| 0.316         | 5.0   | 1120 | 0.9381          | 0.8263   | 0.8127    | 0.8263 | 0.6688          | 0.6573       | 0.0822    | 0.0655       | 0.8780               | 0.9261            | 0.8263               | 0.6573            | 0.8263   | 0.6514   | 0.8161      |
| 0.316         | 6.0   | 1344 | 1.0434          | 0.8218   | 0.8145    | 0.8218 | 0.6907          | 0.6533       | 0.0897    | 0.0674       | 0.8528               | 0.9187            | 0.8218               | 0.6533            | 0.8218   | 0.6690   | 0.8159      |
| 0.1513        | 7.0   | 1568 | 1.2182          | 0.8018   | 0.8066    | 0.8018 | 0.6382          | 0.6399       | 0.0916    | 0.0761       | 0.8802               | 0.9205            | 0.8018               | 0.6399            | 0.8018   | 0.6375   | 0.8030      |
| 0.1513        | 8.0   | 1792 | 1.3193          | 0.8285   | 0.8070    | 0.8285 | 0.6566          | 0.6280       | 0.0882    | 0.0645       | 0.8521               | 0.9202            | 0.8285               | 0.6280            | 0.8285   | 0.6376   | 0.8152      |
| 0.0491        | 9.0   | 2016 | 1.3169          | 0.8330   | 0.8180    | 0.8330 | 0.6950          | 0.6555       | 0.0828    | 0.0627       | 0.8653               | 0.9246            | 0.8330               | 0.6555            | 0.8330   | 0.6687   | 0.8235      |
| 0.0491        | 10.0  | 2240 | 1.4460          | 0.8307   | 0.8109    | 0.8307 | 0.6584          | 0.6291       | 0.0868    | 0.0636       | 0.8533               | 0.9210            | 0.8307               | 0.6291            | 0.8307   | 0.6398   | 0.8184      |
| 0.0491        | 11.0  | 2464 | 1.4100          | 0.8419   | 0.8166    | 0.8419 | 0.6718          | 0.6399       | 0.0806    | 0.0589       | 0.8642               | 0.9265            | 0.8419               | 0.6399            | 0.8419   | 0.6464   | 0.8263      |
| 0.0148        | 12.0  | 2688 | 1.5364          | 0.8218   | 0.8105    | 0.8218 | 0.6661          | 0.6340       | 0.0903    | 0.0674       | 0.8505               | 0.9181            | 0.8218               | 0.6340            | 0.8218   | 0.6469   | 0.8137      |
| 0.0148        | 13.0  | 2912 | 1.5380          | 0.8307   | 0.8118    | 0.8307 | 0.6596          | 0.6304       | 0.0870    | 0.0636       | 0.8512               | 0.9205            | 0.8307               | 0.6304            | 0.8307   | 0.6409   | 0.8185      |
| 0.0031        | 14.0  | 3136 | 1.6139          | 0.8218   | 0.8108    | 0.8218 | 0.6451          | 0.6353       | 0.0860    | 0.0674       | 0.8685               | 0.9226            | 0.8218               | 0.6353            | 0.8218   | 0.6396   | 0.8159      |
| 0.0031        | 15.0  | 3360 | 1.6356          | 0.8263   | 0.8117    | 0.8263 | 0.6626          | 0.6477       | 0.0842    | 0.0655       | 0.8700               | 0.9241            | 0.8263               | 0.6477            | 0.8263   | 0.6529   | 0.8183      |
| 0.0043        | 16.0  | 3584 | 1.6745          | 0.8241   | 0.7994    | 0.8241 | 0.6244          | 0.6229       | 0.0884    | 0.0664       | 0.8543               | 0.9196            | 0.8241               | 0.6229            | 0.8241   | 0.6231   | 0.8108      |
| 0.0043        | 17.0  | 3808 | 1.7867          | 0.8085   | 0.7946    | 0.8085 | 0.6221          | 0.6336       | 0.0906    | 0.0732       | 0.8678               | 0.9191            | 0.8085               | 0.6336            | 0.8085   | 0.6229   | 0.7996      |
| 0.0008        | 18.0  | 4032 | 1.7511          | 0.8151   | 0.7971    | 0.8151 | 0.6110          | 0.6216       | 0.0901    | 0.0703       | 0.8644               | 0.9199            | 0.8151               | 0.6216            | 0.8151   | 0.6145   | 0.8046      |
| 0.0008        | 19.0  | 4256 | 1.5909          | 0.8441   | 0.8079    | 0.8441 | 0.6260          | 0.6374       | 0.0792    | 0.0580       | 0.8670               | 0.9278            | 0.8441               | 0.6374            | 0.8441   | 0.6311   | 0.8249      |
| 0.0008        | 20.0  | 4480 | 1.5721          | 0.8463   | 0.8212    | 0.8463 | 0.6727          | 0.6546       | 0.0761    | 0.0571       | 0.8753               | 0.9304            | 0.8463               | 0.6546            | 0.8463   | 0.6547   | 0.8316      |
| 0.0039        | 21.0  | 4704 | 1.5819          | 0.8396   | 0.8054    | 0.8396 | 0.6337          | 0.6200       | 0.0843    | 0.0599       | 0.8527               | 0.9231            | 0.8396               | 0.6200            | 0.8396   | 0.6245   | 0.8199      |
| 0.0039        | 22.0  | 4928 | 1.5906          | 0.8486   | 0.8236    | 0.8486 | 0.6814          | 0.6512       | 0.0770    | 0.0562       | 0.8680               | 0.9291            | 0.8486               | 0.6512            | 0.8486   | 0.6570   | 0.8333      |
| 0.0005        | 23.0  | 5152 | 1.7133          | 0.8263   | 0.8047    | 0.8263 | 0.6403          | 0.6431       | 0.0831    | 0.0655       | 0.8745               | 0.9252            | 0.8263               | 0.6431            | 0.8263   | 0.6367   | 0.8143      |
| 0.0005        | 24.0  | 5376 | 1.7813          | 0.8241   | 0.8022    | 0.8241 | 0.6515          | 0.6290       | 0.0894    | 0.0664       | 0.8490               | 0.9183            | 0.8241               | 0.6290            | 0.8241   | 0.6348   | 0.8108      |
| 0.0033        | 25.0  | 5600 | 1.7983          | 0.8218   | 0.8001    | 0.8218 | 0.6485          | 0.6281       | 0.0902    | 0.0674       | 0.8486               | 0.9176            | 0.8218               | 0.6281            | 0.8218   | 0.6328   | 0.8088      |
| 0.0033        | 26.0  | 5824 | 1.8070          | 0.8218   | 0.8001    | 0.8218 | 0.6485          | 0.6281       | 0.0902    | 0.0674       | 0.8486               | 0.9176            | 0.8218               | 0.6281            | 0.8218   | 0.6328   | 0.8088      |
| 0.0           | 27.0  | 6048 | 1.8198          | 0.8218   | 0.8024    | 0.8218 | 0.6439          | 0.6295       | 0.0890    | 0.0674       | 0.8544               | 0.9191            | 0.8218               | 0.6295            | 0.8218   | 0.6335   | 0.8106      |
| 0.0           | 28.0  | 6272 | 1.8243          | 0.8218   | 0.8024    | 0.8218 | 0.6439          | 0.6295       | 0.0890    | 0.0674       | 0.8544               | 0.9191            | 0.8218               | 0.6295            | 0.8218   | 0.6335   | 0.8106      |
| 0.0           | 29.0  | 6496 | 1.8277          | 0.8218   | 0.8024    | 0.8218 | 0.6439          | 0.6295       | 0.0890    | 0.0674       | 0.8544               | 0.9191            | 0.8218               | 0.6295            | 0.8218   | 0.6335   | 0.8106      |
| 0.0003        | 30.0  | 6720 | 1.8292          | 0.8218   | 0.8024    | 0.8218 | 0.6439          | 0.6295       | 0.0890    | 0.0674       | 0.8544               | 0.9191            | 0.8218               | 0.6295            | 0.8218   | 0.6335   | 0.8106      |


### Framework versions

- Transformers 4.39.3
- PyTorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.15.2