parlange committed
Commit 71258c3 · verified · 1 Parent(s): 5ab21e5

Upload CvT model from experiment a2

This view is limited to 50 files because it contains too many changes.
Files changed (50)
  1. .gitattributes +2 -0
  2. README.md +161 -0
  3. config.json +76 -0
  4. confusion_matrices/CvT_Confusion_Matrix_a.png +0 -0
  5. confusion_matrices/CvT_Confusion_Matrix_b.png +0 -0
  6. confusion_matrices/CvT_Confusion_Matrix_c.png +0 -0
  7. confusion_matrices/CvT_Confusion_Matrix_d.png +0 -0
  8. confusion_matrices/CvT_Confusion_Matrix_e.png +0 -0
  9. confusion_matrices/CvT_Confusion_Matrix_f.png +0 -0
  10. confusion_matrices/CvT_Confusion_Matrix_g.png +0 -0
  11. confusion_matrices/CvT_Confusion_Matrix_h.png +0 -0
  12. confusion_matrices/CvT_Confusion_Matrix_i.png +0 -0
  13. confusion_matrices/CvT_Confusion_Matrix_j.png +0 -0
  14. confusion_matrices/CvT_Confusion_Matrix_k.png +0 -0
  15. confusion_matrices/CvT_Confusion_Matrix_l.png +0 -0
  16. cvt-gravit-a2.pth +3 -0
  17. evaluation_results.csv +133 -0
  18. model.safetensors +3 -0
  19. pytorch_model.bin +3 -0
  20. roc_confusion_matrix/CvT_roc_confusion_matrix_a.png +0 -0
  21. roc_confusion_matrix/CvT_roc_confusion_matrix_b.png +0 -0
  22. roc_confusion_matrix/CvT_roc_confusion_matrix_c.png +0 -0
  23. roc_confusion_matrix/CvT_roc_confusion_matrix_d.png +0 -0
  24. roc_confusion_matrix/CvT_roc_confusion_matrix_e.png +0 -0
  25. roc_confusion_matrix/CvT_roc_confusion_matrix_f.png +0 -0
  26. roc_confusion_matrix/CvT_roc_confusion_matrix_g.png +0 -0
  27. roc_confusion_matrix/CvT_roc_confusion_matrix_h.png +0 -0
  28. roc_confusion_matrix/CvT_roc_confusion_matrix_i.png +0 -0
  29. roc_confusion_matrix/CvT_roc_confusion_matrix_j.png +0 -0
  30. roc_confusion_matrix/CvT_roc_confusion_matrix_k.png +0 -0
  31. roc_confusion_matrix/CvT_roc_confusion_matrix_l.png +0 -0
  32. roc_curves/CvT_ROC_a.png +0 -0
  33. roc_curves/CvT_ROC_b.png +0 -0
  34. roc_curves/CvT_ROC_c.png +0 -0
  35. roc_curves/CvT_ROC_d.png +0 -0
  36. roc_curves/CvT_ROC_e.png +0 -0
  37. roc_curves/CvT_ROC_f.png +0 -0
  38. roc_curves/CvT_ROC_g.png +0 -0
  39. roc_curves/CvT_ROC_h.png +0 -0
  40. roc_curves/CvT_ROC_i.png +0 -0
  41. roc_curves/CvT_ROC_j.png +0 -0
  42. roc_curves/CvT_ROC_k.png +0 -0
  43. roc_curves/CvT_ROC_l.png +0 -0
  44. training_curves/CvT_accuracy.png +0 -0
  45. training_curves/CvT_auc.png +0 -0
  46. training_curves/CvT_combined_metrics.png +3 -0
  47. training_curves/CvT_f1.png +0 -0
  48. training_curves/CvT_loss.png +0 -0
  49. training_curves/CvT_metrics.csv +85 -0
  50. training_metrics.csv +85 -0
.gitattributes CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ training_curves/CvT_combined_metrics.png filter=lfs diff=lfs merge=lfs -text
+ training_notebook_a2.ipynb filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,161 @@
+ ---
+ license: apache-2.0
+ tags:
+ - vision-transformer
+ - image-classification
+ - pytorch
+ - timm
+ - cvt
+ - gravitational-lensing
+ - strong-lensing
+ - astronomy
+ - astrophysics
+ datasets:
+ - C21
+ metrics:
+ - accuracy
+ - auc
+ - f1
+ model-index:
+ - name: CvT-a2
+   results:
+   - task:
+       type: image-classification
+       name: Strong Gravitational Lens Discovery
+     dataset:
+       type: common-test-sample
+       name: Common Test Sample (More et al. 2024)
+     metrics:
+     - type: accuracy
+       value: 0.6977
+       name: Average Accuracy
+     - type: auc
+       value: 0.7640
+       name: Average AUC-ROC
+     - type: f1
+       value: 0.4306
+       name: Average F1-Score
+ ---
+
+ # 🌌 cvt-gravit-a2
+
+ 🔭 This model is part of **GraViT**: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery
+
+ 🔗 **GitHub Repository**: [https://github.com/parlange/gravit](https://github.com/parlange/gravit)
+
+ ## 🛰️ Model Details
+
+ - **🤖 Model Type**: CvT
+ - **🧪 Experiment**: A2 - C21-half
+ - **🌌 Dataset**: C21
+ - **🪐 Fine-tuning Strategy**: half
+
+ ## 💻 Quick Start
+
+ ```python
+ import torch
+ import timm
+
+ # Load the model directly from the Hub
+ model = timm.create_model(
+     'hf-hub:parlange/cvt-gravit-a2',
+     pretrained=True
+ )
+ model.eval()
+
+ # Example inference
+ dummy_input = torch.randn(1, 3, 224, 224)
+ with torch.no_grad():
+     output = model(dummy_input)
+     predictions = torch.softmax(output, dim=1)
+ print(f"Lens probability: {predictions[0][1]:.4f}")
+ ```
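+
+ The dummy tensor above only checks the forward pass. For real survey cutouts, preprocessing should match the model's pretrained configuration (224x224, bicubic resize, ImageNet mean/std, as listed in `config.json`). A minimal sketch using timm's transform helpers, reusing `model` from the snippet above; the image path is a placeholder:
+
+ ```python
+ from PIL import Image
+ import timm
+ import torch
+
+ # Build the eval-time preprocessing pipeline from the model's pretrained config.
+ data_config = timm.data.resolve_model_data_config(model)
+ transform = timm.data.create_transform(**data_config, is_training=False)
+
+ # Placeholder path for a survey image cutout.
+ image = Image.open("cutout.png").convert("RGB")
+ batch = transform(image).unsqueeze(0)  # (1, 3, 224, 224)
+
+ with torch.no_grad():
+     probs = torch.softmax(model(batch), dim=1)
+ print(f"Lens probability: {probs[0][1]:.4f}")
+ ```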
+
+ ## ⚡️ Training Configuration
+
+ **Training Dataset:** C21 (Cañameras et al. 2021)
+ **Fine-tuning Strategy:** half
+
+ | 🔧 Parameter | 📝 Value |
+ |--------------|----------|
+ | Batch Size | 192 |
+ | Learning Rate | AdamW with ReduceLROnPlateau |
+ | Epochs | 100 |
+ | Patience | 10 |
+ | Optimizer | AdamW |
+ | Scheduler | ReduceLROnPlateau |
+ | Image Size | 224x224 |
+ | Fine Tune Mode | half |
+ | Stochastic Depth Probability | 0.1 |
+
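+ In PyTorch terms, the optimizer and scheduler rows above correspond to a setup along these lines. The base learning rate is not recorded in this card, so the value below is a placeholder, and the patience of 10 may refer to early stopping rather than (or in addition to) the scheduler:
+
+ ```python
+ import timm
+ import torch
+
+ model = timm.create_model('hf-hub:parlange/cvt-gravit-a2', pretrained=True)
+
+ # Placeholder learning rate; only the optimizer/scheduler types and patience come from the table.
+ optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
+ scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode='min', patience=10)
+
+ # After each epoch's validation pass:
+ #     scheduler.step(val_loss)
+ ```
+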
+ ## 📈 Training Curves
+
+ ![Combined Training Metrics](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/training_curves/CvT_combined_metrics.png)
+
+ ## 🏁 Final Epoch Training Metrics
+
+ | Metric | Training | Validation |
+ |:---------:|:-----------:|:-------------:|
+ | 📉 Loss | 0.3027 | 0.2143 |
+ | 🎯 Accuracy | 0.8342 | 0.9130 |
+ | 📊 AUC-ROC | 0.9366 | 0.9737 |
+ | ⚖️ F1 Score | 0.8355 | 0.9150 |
+
+ ## ☑️ Evaluation Results
+
+ ### ROC Curves and Confusion Matrices
+
+ Performance across all test datasets (a through l) in the Common Test Sample (More et al. 2024):
+
+ ![ROC + Confusion Matrix - Dataset A](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_a.png)
+ ![ROC + Confusion Matrix - Dataset B](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_b.png)
+ ![ROC + Confusion Matrix - Dataset C](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_c.png)
+ ![ROC + Confusion Matrix - Dataset D](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_d.png)
+ ![ROC + Confusion Matrix - Dataset E](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_e.png)
+ ![ROC + Confusion Matrix - Dataset F](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_f.png)
+ ![ROC + Confusion Matrix - Dataset G](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_g.png)
+ ![ROC + Confusion Matrix - Dataset H](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_h.png)
+ ![ROC + Confusion Matrix - Dataset I](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_i.png)
+ ![ROC + Confusion Matrix - Dataset J](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_j.png)
+ ![ROC + Confusion Matrix - Dataset K](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_k.png)
+ ![ROC + Confusion Matrix - Dataset L](https://huggingface.co/parlange/cvt-gravit-a2/resolve/main/roc_confusion_matrix/CvT_roc_confusion_matrix_l.png)
+
+ ### 📋 Performance Summary
+
+ Average performance across 12 test datasets from the Common Test Sample (More et al. 2024):
+
+ | Metric | Value |
+ |-----------|----------|
+ | 🎯 Average Accuracy | 0.6977 |
+ | 📈 Average AUC-ROC | 0.7640 |
+ | ⚖️ Average F1-Score | 0.4306 |
+
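+ These averages can be recomputed from `evaluation_results.csv` in this repository (the rows where `Model` is `CvT`), for example:
+
+ ```python
+ import pandas as pd
+
+ results = pd.read_csv("evaluation_results.csv")
+ cvt = results[results["Model"] == "CvT"]
+ print(cvt[["Accuracy", "AUCROC", "F1"]].mean().round(4))
+ ```
+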
+ ## 📘 Citation
+
+ If you use this model in your research, please cite:
+
+ ```bibtex
+ @misc{parlange2025gravit,
+       title={GraViT: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery},
+       author={René Parlange and Juan C. Cuevas-Tello and Octavio Valenzuela and Omar de J. Cabrera-Rosas and Tomás Verdugo and Anupreeta More and Anton T. Jaelani},
+       year={2025},
+       eprint={2509.00226},
+       archivePrefix={arXiv},
+       primaryClass={cs.CV},
+       url={https://arxiv.org/abs/2509.00226},
+ }
+ ```
+
+ ---
+
+ ## Model Card Contact
+
+ For questions about this model, please contact the author through: https://github.com/parlange/
config.json ADDED
@@ -0,0 +1,76 @@
+ {
+   "architecture": "cvt_13_224",
+   "num_classes": 2,
+   "num_features": 1000,
+   "global_pool": "avg",
+   "crop_pct": 0.875,
+   "interpolation": "bicubic",
+   "mean": [
+     0.485,
+     0.456,
+     0.406
+   ],
+   "std": [
+     0.229,
+     0.224,
+     0.225
+   ],
+   "first_conv": "conv1",
+   "classifier": "fc",
+   "input_size": [
+     3,
+     224,
+     224
+   ],
+   "pool_size": [
+     7,
+     7
+   ],
+   "pretrained_cfg": {
+     "tag": "gravit_a2",
+     "custom_load": false,
+     "input_size": [
+       3,
+       224,
+       224
+     ],
+     "fixed_input_size": true,
+     "interpolation": "bicubic",
+     "crop_pct": 0.875,
+     "crop_mode": "center",
+     "mean": [
+       0.485,
+       0.456,
+       0.406
+     ],
+     "std": [
+       0.229,
+       0.224,
+       0.225
+     ],
+     "num_classes": 2,
+     "pool_size": [
+       7,
+       7
+     ],
+     "first_conv": "conv1",
+     "classifier": "fc"
+   },
+   "model_name": "cvt_gravit_a2",
+   "experiment": "a2",
+   "training_strategy": "half",
+   "dataset": "C21",
+   "hyperparameters": {
+     "batch_size": "192",
+     "learning_rate": "AdamW with ReduceLROnPlateau",
+     "epochs": "100",
+     "patience": "10",
+     "optimizer": "AdamW",
+     "scheduler": "ReduceLROnPlateau",
+     "image_size": "224x224",
+     "fine_tune_mode": "half",
+     "stochastic_depth_probability": "0.1"
+   },
+   "hf_hub_id": "parlange/cvt-gravit-a2",
+   "license": "apache-2.0"
+ }
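
The `pretrained_cfg` block above is what timm resolves when the model is pulled from the Hub; a short check of it on the loaded model, assuming a recent timm version:

```python
import timm

# Same loader as in the README's Quick Start; timm reads this config.json from the Hub.
model = timm.create_model('hf-hub:parlange/cvt-gravit-a2', pretrained=True)

# The resolved pretrained configuration (tag, input_size, mean/std, crop_pct, ...).
print(model.pretrained_cfg)
```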
confusion_matrices/CvT_Confusion_Matrix_a.png ADDED
confusion_matrices/CvT_Confusion_Matrix_b.png ADDED
confusion_matrices/CvT_Confusion_Matrix_c.png ADDED
confusion_matrices/CvT_Confusion_Matrix_d.png ADDED
confusion_matrices/CvT_Confusion_Matrix_e.png ADDED
confusion_matrices/CvT_Confusion_Matrix_f.png ADDED
confusion_matrices/CvT_Confusion_Matrix_g.png ADDED
confusion_matrices/CvT_Confusion_Matrix_h.png ADDED
confusion_matrices/CvT_Confusion_Matrix_i.png ADDED
confusion_matrices/CvT_Confusion_Matrix_j.png ADDED
confusion_matrices/CvT_Confusion_Matrix_k.png ADDED
confusion_matrices/CvT_Confusion_Matrix_l.png ADDED
cvt-gravit-a2.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0cd98b44ae73af3ae9975e9f116299918c40ddb0a3e4aa2de8be0b2071659493
+ size 125471131
evaluation_results.csv ADDED
@@ -0,0 +1,133 @@
1
+ Model,Dataset,Loss,Accuracy,AUCROC,F1
2
+ ViT,a,0.3987342550652614,0.8997170701037409,0.917220073664825,0.46921797004991683
3
+ ViT,b,0.37321944226074877,0.9182646966362779,0.9273655616942909,0.5202952029520295
4
+ ViT,c,0.821146962441052,0.7922037095253065,0.866000920810313,0.2990455991516437
5
+ ViT,d,0.223954888615208,0.9440427538509902,0.9518121546961327,0.6130434782608696
6
+ ViT,e,1.1019757730900652,0.7782656421514819,0.85117687126315,0.5826446280991735
7
+ ViT,f,0.43205039906387277,0.8869181318255751,0.9119053612426381,0.1618828932261768
8
+ ViT,g,0.1527516215024516,0.9611666666666666,0.9983996666666667,0.962461736748832
9
+ ViT,h,0.39022785605769605,0.8943333333333333,0.9959461111111112,0.9040556900726392
10
+ ViT,i,0.07361652948241681,0.9748333333333333,0.9992922222222222,0.9753469387755102
11
+ ViT,j,6.027309565424919,0.5033333333333333,0.49077061111111114,0.13872832369942195
12
+ ViT,k,5.948174474835396,0.517,0.5691439444444444,0.14209591474245115
13
+ ViT,l,2.1620761937842525,0.7761620221035376,0.7346207805818022,0.6140942656577628
14
+ MLP-Mixer,a,1.0354112619425808,0.7239861678717384,0.9444742173112339,0.2779605263157895
15
+ MLP-Mixer,b,0.882929647823281,0.7834014460861365,0.9520524861878453,0.3291139240506329
16
+ MLP-Mixer,c,1.6300886590238712,0.6205595724614901,0.9170009208103129,0.21877022653721684
17
+ MLP-Mixer,d,0.055560619117974934,0.9792518076076705,0.9959686924493554,0.8366336633663366
18
+ MLP-Mixer,e,1.1025473987755214,0.70801317233809,0.9327480511617346,0.5596026490066225
19
+ MLP-Mixer,f,0.9539809344458585,0.763147703508636,0.9512486274645963,0.09952885747938751
20
+ MLP-Mixer,g,0.4660766951590776,0.8855,0.9941798888888888,0.8969551522423879
21
+ MLP-Mixer,h,0.8621955468207598,0.7991666666666667,0.9892695555555555,0.8322894919972165
22
+ MLP-Mixer,i,0.027433219969272612,0.9893333333333333,0.9997324444444444,0.9894109861019192
23
+ MLP-Mixer,j,5.562473163604737,0.4046666666666667,0.2374668888888889,0.05552617662612375
24
+ MLP-Mixer,k,5.123829672321677,0.5085,0.47802688888888895,0.06647673314339982
25
+ MLP-Mixer,l,2.271000012816855,0.6846808735656497,0.6265659288601144,0.5226162837242815
26
+ CvT,a,0.7062503800231756,0.6988368437598239,0.8373821362799264,0.2336
27
+ CvT,b,0.8609082461227902,0.6444514303678088,0.8087320441988951,0.20520028109627547
28
+ CvT,c,0.8150388154171053,0.6516818610499843,0.8144069981583795,0.20857142857142857
29
+ CvT,d,0.046723183949472995,0.9833385727758567,0.9917753222836095,0.8463768115942029
30
+ CvT,e,1.055778265130245,0.5916575192096597,0.7447665178233557,0.4397590361445783
31
+ CvT,f,0.6479923738885578,0.7303074897374332,0.8562897926766284,0.07737148913619502
32
+ CvT,g,0.47873973870277403,0.8053333333333333,0.948009388888889,0.8337129840546698
33
+ CvT,h,0.45442124152183533,0.8091666666666667,0.9536063888888889,0.8364519354377946
34
+ CvT,i,0.0470859190672636,0.985,0.999245111111111,0.9848637739656912
35
+ CvT,j,3.978341913700104,0.31966666666666665,0.09071466666666667,0.006812652068126521
36
+ CvT,k,3.5466880963295697,0.49933333333333335,0.5394476666666667,0.009234828496042216
37
+ CvT,l,1.575411782030338,0.6541695309608164,0.5834616840778439,0.4856873230575653
38
+ Swin,a,0.7181912302146427,0.6636277900031436,0.8782845303867403,0.22351233671988388
39
+ Swin,b,0.47780195057523284,0.8000628733102798,0.9174990791896869,0.326271186440678
40
+ Swin,c,1.1387689553203542,0.5369380697893744,0.8261528545119704,0.1729365524985963
41
+ Swin,d,0.0309918804128743,0.9871109713926438,0.9966482504604052,0.8825214899713467
42
+ Swin,e,0.5848406514011806,0.7464324917672887,0.8943237720426852,0.5714285714285714
43
+ Swin,f,0.6053860638443279,0.7410735032143134,0.9040542417311522,0.0843604491920022
44
+ Swin,g,0.24402792798448353,0.8985,0.999229,0.9078529278256923
45
+ Swin,h,0.5944505902426317,0.759,0.9974725555555555,0.8058017727639001
46
+ Swin,i,0.007144115434028208,0.9976666666666667,0.9999902222222222,0.9976720984369803
47
+ Swin,j,3.9496381425857545,0.419,0.11956033333333334,0.06591639871382636
48
+ Swin,k,3.7127543271519245,0.5181666666666667,0.4697328333333333,0.07841887153331208
49
+ Swin,l,1.5895203038408208,0.6710908994764951,0.5955901653865907,0.5130734304055112
50
+ CaiT,a,0.11189156073487824,0.967934611757309,0.9444355432780847,0.7243243243243244
51
+ CaiT,b,0.14077727915177515,0.9566174159069475,0.9445782688766113,0.6600985221674877
52
+ CaiT,c,0.1453998264081071,0.9556743162527507,0.9236316758747698,0.6552567237163814
53
+ CaiT,d,0.0895651844381368,0.9757937755422823,0.9521049723756906,0.7768115942028986
54
+ CaiT,e,0.5067908598680527,0.8507135016465422,0.8893589646560206,0.6633663366336634
55
+ CaiT,f,0.09735689565035,0.968553946247386,0.9382155086735556,0.39762611275964393
56
+ CaiT,g,0.04386970533267595,0.9845,0.9998996666666666,0.984726556084743
57
+ CaiT,h,0.04632043000892736,0.984,0.999908,0.9842416283650689
58
+ CaiT,i,0.016718765972414985,0.9946666666666667,0.9999668888888888,0.9946914399469144
59
+ CaiT,j,4.091754978463054,0.5111666666666667,0.502447,0.09726069559864574
60
+ CaiT,k,4.064604013383389,0.5213333333333333,0.4420845,0.09912170639899624
61
+ CaiT,l,1.3512259357719865,0.8281423510126381,0.7136882113330858,0.669379450661241
62
+ DeiT,a,0.2868140126428901,0.9104055328513047,0.9108996316758747,0.5060658578856152
63
+ DeiT,b,0.21093111757876476,0.9380697893744105,0.9419907918968692,0.5971370143149284
64
+ DeiT,c,0.4344255732631803,0.8610499842816725,0.8816003683241251,0.3978201634877384
65
+ DeiT,d,0.10285686065156817,0.977051241747878,0.9568195211786372,0.8
66
+ DeiT,e,0.6603749339457532,0.8210757409440176,0.8680541890562324,0.6417582417582418
67
+ DeiT,f,0.23529704651809683,0.9209975989466347,0.9196866062244752,0.2225609756097561
68
+ DeiT,g,0.07966256512608379,0.9715,0.9994743333333334,0.9722086786933203
69
+ DeiT,h,0.1981518727680668,0.9306666666666666,0.9987041666666666,0.9349796811503595
70
+ DeiT,i,0.022365192129276693,0.9921666666666666,0.9998618888888888,0.9922043456626306
71
+ DeiT,j,4.922376271247864,0.4945,0.5104821111111111,0.07839562443026436
72
+ DeiT,k,4.865078900694847,0.5151666666666667,0.5016916666666666,0.08146510893590149
73
+ DeiT,l,1.6993422816543697,0.7937708212151657,0.7000134906492581,0.6261503067484663
74
+ DeiT3,a,0.3992725303153578,0.9091480666457089,0.9363876611418048,0.5126475548060708
75
+ DeiT3,b,0.3124380833470795,0.9305249921408362,0.952292817679558,0.579047619047619
76
+ DeiT3,c,0.4766911079170793,0.8915435397673688,0.9285046040515654,0.46841294298921415
77
+ DeiT3,d,0.11976070332006253,0.9723357434768941,0.9784143646408839,0.7755102040816326
78
+ DeiT3,e,0.9584465352031193,0.7969264544456641,0.8915991826231742,0.621676891615542
79
+ DeiT3,f,0.33781881941231834,0.921617225621563,0.9456139627538378,0.23100303951367782
80
+ DeiT3,g,0.1398290316515031,0.9671666666666666,0.9996258333333332,0.9681590431550025
81
+ DeiT3,h,0.22691049999523785,0.9465,0.9994123333333333,0.9491364284582475
82
+ DeiT3,i,0.03767789686251308,0.9893333333333333,0.9998997777777778,0.9894284770399736
83
+ DeiT3,j,6.353702080726624,0.486,0.2588749444444445,0.06545454545454546
84
+ DeiT3,k,6.251550879061222,0.5081666666666667,0.38331972222222227,0.06820334701610357
85
+ DeiT3,l,2.203556089656545,0.7932949077256624,0.6245051702293716,0.6248200403109704
86
+ Twins_SVT,a,0.42451784632109274,0.8126375353662371,0.8918922651933701,0.3377777777777778
87
+ Twins_SVT,b,0.3914457758111573,0.8142093681232316,0.8958011049723758,0.3396648044692737
88
+ Twins_SVT,c,0.44262768309920586,0.8047783715812638,0.8853812154696132,0.3286486486486486
89
+ Twins_SVT,d,0.07794447650992994,0.9820811065702609,0.9897808471454881,0.8421052631578947
90
+ Twins_SVT,e,0.5991019170841715,0.712403951701427,0.818761825474911,0.5371024734982333
91
+ Twins_SVT,f,0.3470329082674194,0.8442413445898846,0.9101541579685174,0.13131749460043196
92
+ Twins_SVT,g,0.2309408655166626,0.9008333333333334,0.9874507777777777,0.9088681268188084
93
+ Twins_SVT,h,0.2580758459568024,0.8958333333333334,0.9890552222222222,0.9047110840067083
94
+ Twins_SVT,i,0.06473293882608414,0.9898333333333333,0.9991408888888889,0.9898248540450375
95
+ Twins_SVT,j,2.941682351350784,0.41483333333333333,0.16385111111111111,0.02823138665928591
96
+ Twins_SVT,k,2.775474429190159,0.5038333333333334,0.41604216666666666,0.03312763884378045
97
+ Twins_SVT,l,1.1202095961319853,0.7359737718788008,0.5944821592359222,0.5594282184770141
98
+ Twins_PCPVT,a,0.4347801183106501,0.7944042753850991,0.8711482504604051,0.2952586206896552
99
+ Twins_PCPVT,b,0.339161620420017,0.8575919522162841,0.9065580110497238,0.3768913342503439
100
+ Twins_PCPVT,c,0.467156688444122,0.7742848160955674,0.8578029465930019,0.2762096774193548
101
+ Twins_PCPVT,d,0.23605295665344328,0.9245520276642565,0.9460681399631676,0.5330739299610895
102
+ Twins_PCPVT,e,0.4829401325446714,0.7694840834248079,0.8557481268447741,0.5661157024793388
103
+ Twins_PCPVT,f,0.3720619066015847,0.8374254511656727,0.8931208308558979,0.11546565528866413
104
+ Twins_PCPVT,g,0.2366939251422882,0.9141666666666667,0.9769166666666668,0.9182928764080597
105
+ Twins_PCPVT,h,0.3045526731014252,0.87,0.9672449999999999,0.8812423873325214
106
+ Twins_PCPVT,i,0.18202911043167114,0.9496666666666667,0.9892225555555555,0.9504105090311987
107
+ Twins_PCPVT,j,1.547880547761917,0.4866666666666667,0.43741700000000006,0.1760299625468165
108
+ Twins_PCPVT,k,1.4932157402038575,0.5221666666666667,0.47360899999999995,0.18666666666666668
109
+ Twins_PCPVT,l,0.7149772635186269,0.7421606472423458,0.6893006569431472,0.579510175922732
110
+ PiT,a,0.44727815774318297,0.8010059729644766,0.9252882136279927,0.3329820864067439
111
+ PiT,b,0.3666396583074745,0.8374724929267526,0.9387182320441989,0.3793517406962785
112
+ PiT,c,0.587441599189167,0.7419050613014775,0.9032854511970534,0.27792436235708
113
+ PiT,d,0.033625746908820454,0.9855391386356491,0.9966703499079189,0.8729281767955801
114
+ PiT,e,0.631099864955006,0.7387486278814489,0.8873495799591311,0.5703971119133574
115
+ PiT,f,0.3817344946660815,0.8324684377662458,0.9379145273921177,0.127470754336426
116
+ PiT,g,0.19083814918994904,0.9163333333333333,0.9971276666666666,0.9226025285229725
117
+ PiT,h,0.30789998161792753,0.8656666666666667,0.9943662222222222,0.8812960235640648
118
+ PiT,i,0.01428527906537056,0.9948333333333333,0.9999181111111111,0.9948462177888612
119
+ PiT,j,5.1374336256980895,0.427,0.18367044444444444,0.031549295774647886
120
+ PiT,k,4.960880736589432,0.5055,0.6351591666666667,0.03637544657356284
121
+ PiT,l,1.8333862431563666,0.7295225001321982,0.6306325748279862,0.5562592174893728
122
+ Ensemble,a,,0.9239232945614586,0.9640395948434622,0.56
123
+ Ensemble,b,,0.926752593524049,0.9696408839779006,0.5693160813308688
124
+ Ensemble,c,,0.8563344860106885,0.9442707182320442,0.40261437908496733
125
+ Ensemble,d,,0.9874253379440427,0.9976040515653776,0.8850574712643678
126
+ Ensemble,e,,0.8254665203073546,0.9312873685007189,0.6595289079229122
127
+ Ensemble,f,,0.9207652389435366,0.9667325628328263,0.23140495867768596
128
+ Ensemble,g,,0.9656666666666667,0.999739111111111,0.9668063164679342
129
+ Ensemble,h,,0.9283333333333333,0.9995118888888889,0.9331259720062208
130
+ Ensemble,i,,0.9978333333333333,0.9999951111111111,0.9978380176284717
131
+ Ensemble,j,,0.47883333333333333,0.20988066666666666,0.04809741248097413
132
+ Ensemble,k,,0.511,0.48139255555555555,0.05109961190168176
133
+ Ensemble,l,,0.79144413304426,0.6240045673759117,0.6211335254562921
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:74e1f64a38efd33254636ee2acbf341e532fe24ebd851c0f2646fc31f8e9d9c7
+ size 125239576
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0cd98b44ae73af3ae9975e9f116299918c40ddb0a3e4aa2de8be0b2071659493
+ size 125471131
roc_confusion_matrix/CvT_roc_confusion_matrix_a.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_b.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_c.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_d.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_e.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_f.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_g.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_h.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_i.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_j.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_k.png ADDED
roc_confusion_matrix/CvT_roc_confusion_matrix_l.png ADDED
roc_curves/CvT_ROC_a.png ADDED
roc_curves/CvT_ROC_b.png ADDED
roc_curves/CvT_ROC_c.png ADDED
roc_curves/CvT_ROC_d.png ADDED
roc_curves/CvT_ROC_e.png ADDED
roc_curves/CvT_ROC_f.png ADDED
roc_curves/CvT_ROC_g.png ADDED
roc_curves/CvT_ROC_h.png ADDED
roc_curves/CvT_ROC_i.png ADDED
roc_curves/CvT_ROC_j.png ADDED
roc_curves/CvT_ROC_k.png ADDED
roc_curves/CvT_ROC_l.png ADDED
training_curves/CvT_accuracy.png ADDED
training_curves/CvT_auc.png ADDED
training_curves/CvT_combined_metrics.png ADDED

Git LFS Details

  • SHA256: 0746a4bb1e49600ffa7ff7141830512f83f0cb67da9dcccf5868eed8518b85ed
  • Pointer size: 131 Bytes
  • Size of remote file: 179 kB
training_curves/CvT_f1.png ADDED
training_curves/CvT_loss.png ADDED
training_curves/CvT_metrics.csv ADDED
@@ -0,0 +1,85 @@
1
+ epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
2
+ 1,0.5113886439323425,0.4467386894226074,0.710175,0.778,0.8112764884375001,0.900808,0.7105006867274316,0.8028419182948491
3
+ 2,0.41086628541946413,0.34807376861572265,0.7780125,0.85,0.8843476721875,0.931412,0.7791910676762779,0.8584905660377359
4
+ 3,0.383760599732399,0.3299612154960632,0.791125,0.864,0.8991345040625001,0.9396399999999999,0.7910936640496075,0.870722433460076
5
+ 4,0.37204701553583147,0.31665582752227783,0.7965625,0.862,0.9050560440625,0.9442999999999999,0.7973503007060054,0.867816091954023
6
+ 5,0.3590931899309158,0.3303352015018463,0.8048125,0.859,0.9116219903125,0.9508979999999999,0.8062823327998809,0.8705234159779615
7
+ 6,0.3553567500591278,0.3238047833442688,0.80665,0.86,0.913331784375,0.9495199999999999,0.8090888894374367,0.8689138576779026
8
+ 7,0.3452604936957359,0.27979736733436583,0.8102,0.878,0.9177844987499999,0.953538,0.8100052553867715,0.8801571709233792
9
+ 8,0.3442600884795189,0.26511841917037965,0.8132625,0.897,0.918718833125,0.963936,0.8161783705964144,0.9018112488083889
10
+ 9,0.33971816600561144,0.26179747796058656,0.8160875,0.896,0.9212521184375,0.96306,0.8185126250477988,0.8998073217726397
11
+ 10,0.33541087886095045,0.25983864450454713,0.81575,0.892,0.9225681390625,0.9627720000000001,0.8175064999380959,0.8959537572254336
12
+ 11,0.33474508765935895,0.25052296960353854,0.818125,0.895,0.9231618746875,0.9628439999999999,0.8215555937109078,0.8969578017664377
13
+ 12,0.33274546703100205,0.24129684329032897,0.816125,0.909,0.923392115,0.9687640000000001,0.8189894913001747,0.9120772946859903
14
+ 13,0.3301350746512413,0.2579976317882538,0.8211,0.905,0.9254275390625,0.963674,0.8215283320031923,0.9078564500484966
15
+ 14,0.323131390953064,0.2274529696702957,0.8241,0.913,0.9282665303125,0.969628,0.8264558616777249,0.9137760158572844
16
+ 15,0.32429987013339995,0.25822671031951905,0.821375,0.895,0.927188639375,0.961548,0.8258676155195944,0.896551724137931
17
+ 16,0.32402454870939257,0.2632750120162964,0.8229125,0.896,0.9279407484375001,0.9618000000000001,0.8240845367737449,0.8990291262135922
18
+ 17,0.31963467900753023,0.26520303916931154,0.8249875,0.884,0.9298153743750001,0.958802,0.8269580155973848,0.8869395711500975
19
+ 18,0.31986406939029693,0.23152883648872374,0.825825,0.902,0.9296690750000001,0.968108,0.8264713940571372,0.903353057199211
20
+ 19,0.3165740448594093,0.2404348633289337,0.82685,0.9,0.9310080765625001,0.9683639999999999,0.8290594071624257,0.9034749034749034
21
+ 20,0.31792913516759874,0.23889201140403749,0.826075,0.904,0.930247801875,0.970868,0.8270564546200313,0.9080459770114943
22
+ 21,0.3103618373155594,0.23661772012710572,0.8305625,0.905,0.9334291596875,0.9704879999999999,0.8297303068748508,0.9082125603864735
23
+ 22,0.3114424193620682,0.24792745018005372,0.8273,0.896,0.9326040743749999,0.968236,0.8268758458222646,0.8998073217726397
24
+ 23,0.30957031584978106,0.2510049114227295,0.83125,0.892,0.9339269946875,0.9684740000000001,0.8309075878654275,0.8969465648854962
25
+ 24,0.30921384286880493,0.22247469800710679,0.8303625,0.916,0.933805045,0.972032,0.8332616628374144,0.91796875
26
+ 25,0.30654513928890226,0.23475138401985168,0.8310875,0.903,0.9349835209374999,0.968966,0.8323968992248062,0.9053658536585366
27
+ 26,0.30617363412380216,0.2248240877389908,0.83205,0.909,0.9353177175,0.9708199999999999,0.831709211152584,0.911219512195122
28
+ 27,0.30596079412698746,0.220391073346138,0.8321125,0.916,0.9356234659375,0.972506,0.8325833593019633,0.9181286549707602
29
+ 28,0.304587315928936,0.24013106894493103,0.8317625,0.902,0.9358144490625,0.970964,0.8290725289239406,0.9057692307692308
30
+ 29,0.3056128895163536,0.2235409462451935,0.8307,0.902,0.935246413125,0.9722200000000001,0.83092824686673,0.9046692607003891
31
+ 30,0.3029156920909882,0.25133012151718137,0.832925,0.896,0.9365160328125001,0.967198,0.8364455103888794,0.8996138996138996
32
+ 31,0.3036591767072678,0.22185246026515962,0.832675,0.901,0.93605763625,0.972906,0.8367560975609756,0.9036027263875365
33
+ 32,0.3062893540859222,0.2429929120540619,0.831425,0.902,0.935106978125,0.97146,0.8343731577913146,0.9061302681992337
34
+ 33,0.30293621964454653,0.22243023312091828,0.8339125,0.917,0.9368525046875,0.972802,0.8352694677593325,0.9196515004840271
35
+ 34,0.30469499473571776,0.2470554381608963,0.8328875,0.898,0.9357753428125001,0.96848,0.8319104796630414,0.9017341040462428
36
+ 35,0.3041131570100784,0.2533268423080444,0.8327875,0.895,0.9358014484375,0.969006,0.8334142787761049,0.8993288590604027
37
+ 36,0.3061536881327629,0.23192279720306397,0.831575,0.906,0.93507893125,0.971832,0.8319824425767514,0.9094412331406551
38
+ 37,0.30523560807704925,0.22442477118968965,0.8315625,0.91,0.9355373184375,0.9735259999999999,0.8325858191802606,0.9136276391554703
39
+ 38,0.3041664883971214,0.23788594770431518,0.8321375,0.9,0.9358716956249999,0.968984,0.8337398323655768,0.9029126213592233
40
+ 39,0.3034518471837044,0.22903583812713624,0.8338125,0.906,0.93634410125,0.9718220000000001,0.8362220826096062,0.9089147286821705
41
+ 40,0.3046425341248512,0.22394718778133393,0.832225,0.907,0.9356764315624999,0.9718400000000001,0.8342267124472,0.9092682926829269
42
+ 41,0.30207886862754824,0.24307810962200166,0.8341625,0.901,0.9370867003125,0.9681599999999999,0.8373643885994484,0.903976721629486
43
+ 42,0.3060490905404091,0.24345741152763367,0.8328375,0.893,0.9354049706249999,0.9691320000000001,0.8367254746352482,0.8966183574879227
44
+ 43,0.30383523888587954,0.21429471111297607,0.833225,0.907,0.936173700625,0.974174,0.8363346418056918,0.9099709583736689
45
+ 44,0.3025417940139771,0.24102875301241874,0.8329125,0.902,0.9367421778125,0.9679599999999998,0.8363351413563846,0.9044834307992202
46
+ 45,0.3033503027439117,0.21941902053356171,0.8309,0.908,0.9358563703125,0.973862,0.8336612236867992,0.9111969111969112
47
+ 46,0.30413150579929354,0.22507753300666808,0.832,0.908,0.9359792590625001,0.9733619999999998,0.8330642156253881,0.9108527131782945
48
+ 47,0.30789772633314133,0.2130628733634949,0.8305625,0.916,0.9344729425,0.9749960000000001,0.8321258282246579,0.9181286549707602
49
+ 48,0.30611159081459044,0.22083904254436493,0.8310125,0.912,0.9350651853124999,0.97367,0.8317046147717512,0.9147286821705426
50
+ 49,0.3030417980790138,0.23051351594924926,0.8349375,0.904,0.936951865,0.9714360000000001,0.8363307346215342,0.9069767441860465
51
+ 50,0.30296236768960955,0.22108420300483703,0.8324875,0.907,0.936323050625,0.9731599999999999,0.8350625853856662,0.9096209912536443
52
+ 51,0.3027757642865181,0.2364919991493225,0.8322,0.905,0.9362442653124999,0.9699580000000001,0.8354942280826451,0.9078564500484966
53
+ 52,0.3014978398799896,0.26365012884140016,0.8341875,0.894,0.9368652306249999,0.9669540000000001,0.8384149683895095,0.8992395437262357
54
+ 53,0.3024936760544777,0.21776628732681275,0.832,0.908,0.9365269325000001,0.971616,0.8363689490600954,0.9087301587301587
55
+ 54,0.304066121840477,0.22379997646808625,0.832675,0.903,0.9361766278125,0.972468,0.8360924719596414,0.9057337220602527
56
+ 55,0.30257794806957244,0.21681742715835572,0.8345875,0.91,0.936934685,0.97378,0.8368813559322034,0.9127906976744186
57
+ 56,0.3061160779476166,0.25061603808403016,0.83165,0.888,0.93529230625,0.9693900000000001,0.8346510828463389,0.89272030651341
58
+ 57,0.3052486520886421,0.2177787778377533,0.832925,0.912,0.9357276546874999,0.972484,0.8350650312199215,0.9137254901960784
59
+ 58,0.3062249651670456,0.23781948840618133,0.830925,0.897,0.934719448125,0.969358,0.8336858154633091,0.8999028182701652
60
+ 59,0.3045389884233475,0.24729155039787293,0.8326875,0.9,0.9357657553125001,0.970012,0.8355994448333885,0.9042145593869731
61
+ 60,0.3006735630750656,0.21392891335487366,0.833825,0.907,0.937098893125,0.9732660000000001,0.8355679794181674,0.9085545722713865
62
+ 61,0.30177775864601136,0.22752880692481994,0.8334,0.9,0.9367513209375,0.9717800000000001,0.8360437938245787,0.9029126213592233
63
+ 62,0.30314488623142244,0.21589253091812133,0.8337625,0.914,0.9364310568750001,0.974342,0.8363220144982831,0.9166666666666666
64
+ 63,0.3006242594361305,0.21348099899291992,0.8339,0.912,0.937338966875,0.9748680000000001,0.835906048556398,0.914396887159533
65
+ 64,0.3028922115802765,0.19859643387794496,0.8335,0.916,0.9367036909375,0.976148,0.8359929078014184,0.9168316831683169
66
+ 65,0.3027864682674408,0.22869542264938356,0.8342875,0.908,0.936913334375,0.973582,0.8364201720074529,0.9117082533589251
67
+ 66,0.3057905566692352,0.2093310366868973,0.8320125,0.918,0.935329254375,0.975524,0.8331823090577326,0.9200779727095516
68
+ 67,0.3062150678157806,0.2270612667798996,0.83185,0.901,0.935010686875,0.9709839999999998,0.8338356638173823,0.9030362389813908
69
+ 68,0.3039026636362076,0.2102983570098877,0.8321875,0.913,0.9359528021875001,0.975646,0.8347875312273102,0.9154518950437318
70
+ 69,0.30292134199142456,0.24109376418590545,0.8337375,0.901,0.9366356928124999,0.9685739999999999,0.8359217911552458,0.9043478260869565
71
+ 70,0.3028713608264923,0.2014880656003952,0.8321,0.921,0.93604536875,0.9758139999999998,0.8353800524548374,0.9220138203356367
72
+ 71,0.3022945895791054,0.2295145403146744,0.833475,0.904,0.936548405,0.9719800000000001,0.8375048790007806,0.9075144508670521
73
+ 72,0.3034215456366539,0.22526203274726866,0.8327375,0.907,0.936401005,0.9710859999999999,0.8358904546402247,0.9090909090909091
74
+ 73,0.30472388269901274,0.22892201817035676,0.8312875,0.903,0.9352636790625,0.97035,0.8340811584938597,0.9053658536585366
75
+ 74,0.3023205008864403,0.221133474111557,0.8336,0.901,0.9369013584375,0.9735699999999999,0.8354674445048698,0.9043478260869565
76
+ 75,0.3039404956102371,0.21915315127372742,0.8327,0.904,0.93604840875,0.9725799999999999,0.8357267349092962,0.90625
77
+ 76,0.3054527995944023,0.2209102919101715,0.8308875,0.909,0.9353886190625,0.972136,0.834197335686361,0.9110459433040078
78
+ 77,0.3065829254746437,0.21015667402744292,0.8296625,0.91,0.9346212246875,0.974342,0.8318629930780905,0.9117647058823529
79
+ 78,0.3022213262915611,0.21816637706756592,0.8336875,0.909,0.9365773715625,0.973156,0.8352260765105824,0.911219512195122
80
+ 79,0.30378366285562514,0.22191380941867828,0.833525,0.898,0.9363927190625,0.9726919999999999,0.8353851478295263,0.9011627906976745
81
+ 80,0.3026993294596672,0.2309602391719818,0.8319625,0.903,0.9363193887500001,0.9707359999999999,0.8340841489453612,0.9059165858389913
82
+ 81,0.3042393825173378,0.21906527280807495,0.8329375,0.91,0.9357335034375001,0.972632,0.8351932918182379,0.9122807017543859
83
+ 82,0.30104678318500516,0.25912450671195986,0.83455,0.894,0.937488745625,0.96786,0.8360826274334968,0.8990476190476191
84
+ 83,0.3030637550234795,0.24533470451831818,0.8341125,0.891,0.9367967803125,0.970632,0.834462198605446,0.8960915157292659
85
+ 84,0.3026913456439972,0.21429330670833588,0.8341875,0.913,0.9366150003125,0.9737359999999999,0.835546298706934,0.9149560117302052
training_metrics.csv ADDED
@@ -0,0 +1,85 @@
1
+ epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
2
+ 1,0.5113886439323425,0.4467386894226074,0.710175,0.778,0.8112764884375001,0.900808,0.7105006867274316,0.8028419182948491
3
+ 2,0.41086628541946413,0.34807376861572265,0.7780125,0.85,0.8843476721875,0.931412,0.7791910676762779,0.8584905660377359
4
+ 3,0.383760599732399,0.3299612154960632,0.791125,0.864,0.8991345040625001,0.9396399999999999,0.7910936640496075,0.870722433460076
5
+ 4,0.37204701553583147,0.31665582752227783,0.7965625,0.862,0.9050560440625,0.9442999999999999,0.7973503007060054,0.867816091954023
6
+ 5,0.3590931899309158,0.3303352015018463,0.8048125,0.859,0.9116219903125,0.9508979999999999,0.8062823327998809,0.8705234159779615
7
+ 6,0.3553567500591278,0.3238047833442688,0.80665,0.86,0.913331784375,0.9495199999999999,0.8090888894374367,0.8689138576779026
8
+ 7,0.3452604936957359,0.27979736733436583,0.8102,0.878,0.9177844987499999,0.953538,0.8100052553867715,0.8801571709233792
9
+ 8,0.3442600884795189,0.26511841917037965,0.8132625,0.897,0.918718833125,0.963936,0.8161783705964144,0.9018112488083889
10
+ 9,0.33971816600561144,0.26179747796058656,0.8160875,0.896,0.9212521184375,0.96306,0.8185126250477988,0.8998073217726397
11
+ 10,0.33541087886095045,0.25983864450454713,0.81575,0.892,0.9225681390625,0.9627720000000001,0.8175064999380959,0.8959537572254336
12
+ 11,0.33474508765935895,0.25052296960353854,0.818125,0.895,0.9231618746875,0.9628439999999999,0.8215555937109078,0.8969578017664377
13
+ 12,0.33274546703100205,0.24129684329032897,0.816125,0.909,0.923392115,0.9687640000000001,0.8189894913001747,0.9120772946859903
14
+ 13,0.3301350746512413,0.2579976317882538,0.8211,0.905,0.9254275390625,0.963674,0.8215283320031923,0.9078564500484966
15
+ 14,0.323131390953064,0.2274529696702957,0.8241,0.913,0.9282665303125,0.969628,0.8264558616777249,0.9137760158572844
16
+ 15,0.32429987013339995,0.25822671031951905,0.821375,0.895,0.927188639375,0.961548,0.8258676155195944,0.896551724137931
17
+ 16,0.32402454870939257,0.2632750120162964,0.8229125,0.896,0.9279407484375001,0.9618000000000001,0.8240845367737449,0.8990291262135922
18
+ 17,0.31963467900753023,0.26520303916931154,0.8249875,0.884,0.9298153743750001,0.958802,0.8269580155973848,0.8869395711500975
19
+ 18,0.31986406939029693,0.23152883648872374,0.825825,0.902,0.9296690750000001,0.968108,0.8264713940571372,0.903353057199211
20
+ 19,0.3165740448594093,0.2404348633289337,0.82685,0.9,0.9310080765625001,0.9683639999999999,0.8290594071624257,0.9034749034749034
21
+ 20,0.31792913516759874,0.23889201140403749,0.826075,0.904,0.930247801875,0.970868,0.8270564546200313,0.9080459770114943
22
+ 21,0.3103618373155594,0.23661772012710572,0.8305625,0.905,0.9334291596875,0.9704879999999999,0.8297303068748508,0.9082125603864735
23
+ 22,0.3114424193620682,0.24792745018005372,0.8273,0.896,0.9326040743749999,0.968236,0.8268758458222646,0.8998073217726397
24
+ 23,0.30957031584978106,0.2510049114227295,0.83125,0.892,0.9339269946875,0.9684740000000001,0.8309075878654275,0.8969465648854962
25
+ 24,0.30921384286880493,0.22247469800710679,0.8303625,0.916,0.933805045,0.972032,0.8332616628374144,0.91796875
26
+ 25,0.30654513928890226,0.23475138401985168,0.8310875,0.903,0.9349835209374999,0.968966,0.8323968992248062,0.9053658536585366
27
+ 26,0.30617363412380216,0.2248240877389908,0.83205,0.909,0.9353177175,0.9708199999999999,0.831709211152584,0.911219512195122
28
+ 27,0.30596079412698746,0.220391073346138,0.8321125,0.916,0.9356234659375,0.972506,0.8325833593019633,0.9181286549707602
29
+ 28,0.304587315928936,0.24013106894493103,0.8317625,0.902,0.9358144490625,0.970964,0.8290725289239406,0.9057692307692308
30
+ 29,0.3056128895163536,0.2235409462451935,0.8307,0.902,0.935246413125,0.9722200000000001,0.83092824686673,0.9046692607003891
31
+ 30,0.3029156920909882,0.25133012151718137,0.832925,0.896,0.9365160328125001,0.967198,0.8364455103888794,0.8996138996138996
32
+ 31,0.3036591767072678,0.22185246026515962,0.832675,0.901,0.93605763625,0.972906,0.8367560975609756,0.9036027263875365
33
+ 32,0.3062893540859222,0.2429929120540619,0.831425,0.902,0.935106978125,0.97146,0.8343731577913146,0.9061302681992337
34
+ 33,0.30293621964454653,0.22243023312091828,0.8339125,0.917,0.9368525046875,0.972802,0.8352694677593325,0.9196515004840271
35
+ 34,0.30469499473571776,0.2470554381608963,0.8328875,0.898,0.9357753428125001,0.96848,0.8319104796630414,0.9017341040462428
36
+ 35,0.3041131570100784,0.2533268423080444,0.8327875,0.895,0.9358014484375,0.969006,0.8334142787761049,0.8993288590604027
37
+ 36,0.3061536881327629,0.23192279720306397,0.831575,0.906,0.93507893125,0.971832,0.8319824425767514,0.9094412331406551
38
+ 37,0.30523560807704925,0.22442477118968965,0.8315625,0.91,0.9355373184375,0.9735259999999999,0.8325858191802606,0.9136276391554703
39
+ 38,0.3041664883971214,0.23788594770431518,0.8321375,0.9,0.9358716956249999,0.968984,0.8337398323655768,0.9029126213592233
40
+ 39,0.3034518471837044,0.22903583812713624,0.8338125,0.906,0.93634410125,0.9718220000000001,0.8362220826096062,0.9089147286821705
41
+ 40,0.3046425341248512,0.22394718778133393,0.832225,0.907,0.9356764315624999,0.9718400000000001,0.8342267124472,0.9092682926829269
42
+ 41,0.30207886862754824,0.24307810962200166,0.8341625,0.901,0.9370867003125,0.9681599999999999,0.8373643885994484,0.903976721629486
43
+ 42,0.3060490905404091,0.24345741152763367,0.8328375,0.893,0.9354049706249999,0.9691320000000001,0.8367254746352482,0.8966183574879227
44
+ 43,0.30383523888587954,0.21429471111297607,0.833225,0.907,0.936173700625,0.974174,0.8363346418056918,0.9099709583736689
45
+ 44,0.3025417940139771,0.24102875301241874,0.8329125,0.902,0.9367421778125,0.9679599999999998,0.8363351413563846,0.9044834307992202
46
+ 45,0.3033503027439117,0.21941902053356171,0.8309,0.908,0.9358563703125,0.973862,0.8336612236867992,0.9111969111969112
47
+ 46,0.30413150579929354,0.22507753300666808,0.832,0.908,0.9359792590625001,0.9733619999999998,0.8330642156253881,0.9108527131782945
48
+ 47,0.30789772633314133,0.2130628733634949,0.8305625,0.916,0.9344729425,0.9749960000000001,0.8321258282246579,0.9181286549707602
49
+ 48,0.30611159081459044,0.22083904254436493,0.8310125,0.912,0.9350651853124999,0.97367,0.8317046147717512,0.9147286821705426
50
+ 49,0.3030417980790138,0.23051351594924926,0.8349375,0.904,0.936951865,0.9714360000000001,0.8363307346215342,0.9069767441860465
51
+ 50,0.30296236768960955,0.22108420300483703,0.8324875,0.907,0.936323050625,0.9731599999999999,0.8350625853856662,0.9096209912536443
52
+ 51,0.3027757642865181,0.2364919991493225,0.8322,0.905,0.9362442653124999,0.9699580000000001,0.8354942280826451,0.9078564500484966
53
+ 52,0.3014978398799896,0.26365012884140016,0.8341875,0.894,0.9368652306249999,0.9669540000000001,0.8384149683895095,0.8992395437262357
54
+ 53,0.3024936760544777,0.21776628732681275,0.832,0.908,0.9365269325000001,0.971616,0.8363689490600954,0.9087301587301587
55
+ 54,0.304066121840477,0.22379997646808625,0.832675,0.903,0.9361766278125,0.972468,0.8360924719596414,0.9057337220602527
56
+ 55,0.30257794806957244,0.21681742715835572,0.8345875,0.91,0.936934685,0.97378,0.8368813559322034,0.9127906976744186
57
+ 56,0.3061160779476166,0.25061603808403016,0.83165,0.888,0.93529230625,0.9693900000000001,0.8346510828463389,0.89272030651341
58
+ 57,0.3052486520886421,0.2177787778377533,0.832925,0.912,0.9357276546874999,0.972484,0.8350650312199215,0.9137254901960784
59
+ 58,0.3062249651670456,0.23781948840618133,0.830925,0.897,0.934719448125,0.969358,0.8336858154633091,0.8999028182701652
60
+ 59,0.3045389884233475,0.24729155039787293,0.8326875,0.9,0.9357657553125001,0.970012,0.8355994448333885,0.9042145593869731
61
+ 60,0.3006735630750656,0.21392891335487366,0.833825,0.907,0.937098893125,0.9732660000000001,0.8355679794181674,0.9085545722713865
62
+ 61,0.30177775864601136,0.22752880692481994,0.8334,0.9,0.9367513209375,0.9717800000000001,0.8360437938245787,0.9029126213592233
63
+ 62,0.30314488623142244,0.21589253091812133,0.8337625,0.914,0.9364310568750001,0.974342,0.8363220144982831,0.9166666666666666
64
+ 63,0.3006242594361305,0.21348099899291992,0.8339,0.912,0.937338966875,0.9748680000000001,0.835906048556398,0.914396887159533
65
+ 64,0.3028922115802765,0.19859643387794496,0.8335,0.916,0.9367036909375,0.976148,0.8359929078014184,0.9168316831683169
66
+ 65,0.3027864682674408,0.22869542264938356,0.8342875,0.908,0.936913334375,0.973582,0.8364201720074529,0.9117082533589251
67
+ 66,0.3057905566692352,0.2093310366868973,0.8320125,0.918,0.935329254375,0.975524,0.8331823090577326,0.9200779727095516
68
+ 67,0.3062150678157806,0.2270612667798996,0.83185,0.901,0.935010686875,0.9709839999999998,0.8338356638173823,0.9030362389813908
69
+ 68,0.3039026636362076,0.2102983570098877,0.8321875,0.913,0.9359528021875001,0.975646,0.8347875312273102,0.9154518950437318
70
+ 69,0.30292134199142456,0.24109376418590545,0.8337375,0.901,0.9366356928124999,0.9685739999999999,0.8359217911552458,0.9043478260869565
71
+ 70,0.3028713608264923,0.2014880656003952,0.8321,0.921,0.93604536875,0.9758139999999998,0.8353800524548374,0.9220138203356367
72
+ 71,0.3022945895791054,0.2295145403146744,0.833475,0.904,0.936548405,0.9719800000000001,0.8375048790007806,0.9075144508670521
73
+ 72,0.3034215456366539,0.22526203274726866,0.8327375,0.907,0.936401005,0.9710859999999999,0.8358904546402247,0.9090909090909091
74
+ 73,0.30472388269901274,0.22892201817035676,0.8312875,0.903,0.9352636790625,0.97035,0.8340811584938597,0.9053658536585366
75
+ 74,0.3023205008864403,0.221133474111557,0.8336,0.901,0.9369013584375,0.9735699999999999,0.8354674445048698,0.9043478260869565
76
+ 75,0.3039404956102371,0.21915315127372742,0.8327,0.904,0.93604840875,0.9725799999999999,0.8357267349092962,0.90625
77
+ 76,0.3054527995944023,0.2209102919101715,0.8308875,0.909,0.9353886190625,0.972136,0.834197335686361,0.9110459433040078
78
+ 77,0.3065829254746437,0.21015667402744292,0.8296625,0.91,0.9346212246875,0.974342,0.8318629930780905,0.9117647058823529
79
+ 78,0.3022213262915611,0.21816637706756592,0.8336875,0.909,0.9365773715625,0.973156,0.8352260765105824,0.911219512195122
80
+ 79,0.30378366285562514,0.22191380941867828,0.833525,0.898,0.9363927190625,0.9726919999999999,0.8353851478295263,0.9011627906976745
81
+ 80,0.3026993294596672,0.2309602391719818,0.8319625,0.903,0.9363193887500001,0.9707359999999999,0.8340841489453612,0.9059165858389913
82
+ 81,0.3042393825173378,0.21906527280807495,0.8329375,0.91,0.9357335034375001,0.972632,0.8351932918182379,0.9122807017543859
83
+ 82,0.30104678318500516,0.25912450671195986,0.83455,0.894,0.937488745625,0.96786,0.8360826274334968,0.8990476190476191
84
+ 83,0.3030637550234795,0.24533470451831818,0.8341125,0.891,0.9367967803125,0.970632,0.834462198605446,0.8960915157292659
85
+ 84,0.3026913456439972,0.21429330670833588,0.8341875,0.913,0.9366150003125,0.9737359999999999,0.835546298706934,0.9149560117302052