Upload MLP-Mixer model from experiment c2
- .gitattributes +2 -0
- README.md +166 -0
- config.json +76 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_a.png +0 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_b.png +0 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_c.png +0 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_d.png +0 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_e.png +0 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_f.png +0 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_g.png +0 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_h.png +0 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_i.png +0 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_j.png +0 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_k.png +0 -0
- confusion_matrices/MLP-Mixer_Confusion_Matrix_l.png +0 -0
- evaluation_results.csv +133 -0
- mlp-mixer-gravit-c2.pth +3 -0
- model.safetensors +3 -0
- pytorch_model.bin +3 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_a.png +0 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_b.png +0 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_c.png +0 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_d.png +0 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_e.png +0 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_f.png +0 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_g.png +0 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_h.png +0 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_i.png +0 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_j.png +0 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_k.png +0 -0
- roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_l.png +0 -0
- roc_curves/MLP-Mixer_ROC_a.png +0 -0
- roc_curves/MLP-Mixer_ROC_b.png +0 -0
- roc_curves/MLP-Mixer_ROC_c.png +0 -0
- roc_curves/MLP-Mixer_ROC_d.png +0 -0
- roc_curves/MLP-Mixer_ROC_e.png +0 -0
- roc_curves/MLP-Mixer_ROC_f.png +0 -0
- roc_curves/MLP-Mixer_ROC_g.png +0 -0
- roc_curves/MLP-Mixer_ROC_h.png +0 -0
- roc_curves/MLP-Mixer_ROC_i.png +0 -0
- roc_curves/MLP-Mixer_ROC_j.png +0 -0
- roc_curves/MLP-Mixer_ROC_k.png +0 -0
- roc_curves/MLP-Mixer_ROC_l.png +0 -0
- training_curves/MLP-Mixer_accuracy.png +0 -0
- training_curves/MLP-Mixer_auc.png +0 -0
- training_curves/MLP-Mixer_combined_metrics.png +3 -0
- training_curves/MLP-Mixer_f1.png +0 -0
- training_curves/MLP-Mixer_loss.png +0 -0
- training_curves/MLP-Mixer_metrics.csv +54 -0
- training_metrics.csv +54 -0
.gitattributes
CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+training_curves/MLP-Mixer_combined_metrics.png filter=lfs diff=lfs merge=lfs -text
+training_notebook_c2.ipynb filter=lfs diff=lfs merge=lfs -text
README.md
ADDED
@@ -0,0 +1,166 @@
---
license: apache-2.0
tags:
- image-classification
- pytorch
- timm
- mlp-mixer
- vision-transformer
- transformer
- gravitational-lensing
- strong-lensing
- astronomy
- astrophysics
datasets:
- parlange/gravit-c21-j24
metrics:
- accuracy
- auc
- f1
paper:
- title: "GraViT: A Gravitational Lens Discovery Toolkit with Vision Transformers"
  url: "https://arxiv.org/abs/2509.00226"
  authors: "Parlange et al."
model-index:
- name: MLP-Mixer-c2
  results:
  - task:
      type: image-classification
      name: Strong Gravitational Lens Discovery
    dataset:
      type: common-test-sample
      name: Common Test Sample (More et al. 2024)
    metrics:
    - type: accuracy
      value: 0.7143
      name: Average Accuracy
    - type: auc
      value: 0.8680
      name: Average AUC-ROC
    - type: f1
      value: 0.5146
      name: Average F1-Score
---

# 🌌 mlp-mixer-gravit-c2

🔭 This model is part of **GraViT**: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery.

🔗 **GitHub Repository**: [https://github.com/parlange/gravit](https://github.com/parlange/gravit)

## 🛰️ Model Details

- **🤖 Model Type**: MLP-Mixer
- **🧪 Experiment**: C2 - C21+J24-half
- **🌌 Dataset**: C21+J24
- **🪐 Fine-tuning Strategy**: half

## 💻 Quick Start

```python
import torch
import timm

# Load the fine-tuned model directly from the Hugging Face Hub
model = timm.create_model(
    'hf-hub:parlange/mlp-mixer-gravit-c2',
    pretrained=True
)
model.eval()

# Example inference on a random input
dummy_input = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    output = model(dummy_input)
    predictions = torch.softmax(output, dim=1)
    print(f"Lens probability: {predictions[0][1]:.4f}")
```

## ⚡️ Training Configuration

**Training Dataset:** C21+J24 (Cañameras et al. 2021 + Jaelani et al. 2024)
**Fine-tuning Strategy:** half

| 🔧 Parameter | 📝 Value |
|--------------|----------|
| Batch Size | 192 |
| Learning Rate | AdamW with ReduceLROnPlateau |
| Epochs | 100 |
| Patience | 10 |
| Optimizer | AdamW |
| Scheduler | ReduceLROnPlateau |
| Image Size | 224x224 |
| Fine Tune Mode | half |
| Stochastic Depth Probability | 0.1 |

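The table above pairs AdamW with a ReduceLROnPlateau schedule and a patience of 10. A minimal PyTorch sketch of how these pieces fit together; the learning-rate value and the tiny stand-in model are assumptions, since the card does not state them:

```python
import torch

# Stand-in for the MLP-Mixer backbone (assumption, for illustration only)
model = torch.nn.Linear(8, 2)

# lr value is an assumption; the card only names the optimizer/scheduler
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=10
)

for epoch in range(3):            # the card trains for up to 100 epochs
    val_loss = 1.0 / (epoch + 1)  # placeholder for real validation loss
    scheduler.step(val_loss)      # LR drops only when val loss plateaus
```

Because the placeholder loss keeps improving, the scheduler never reduces the learning rate in this toy loop.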
## 📈 Training Curves

![Combined Training Metrics](training_curves/MLP-Mixer_combined_metrics.png)

## 🏁 Final Epoch Training Metrics

| Metric | Training | Validation |
|:-------:|:--------:|:----------:|
| 📉 Loss | 0.0650 | 0.0401 |
| 🎯 Accuracy | 0.9743 | 0.9854 |
| 📊 AUC-ROC | 0.9973 | 0.9991 |
| ⚖️ F1 Score | 0.9742 | 0.9855 |

## ☑️ Evaluation Results

### ROC Curves and Confusion Matrices

Performance across all test datasets (a through l) in the Common Test Sample (More et al. 2024):

![ROC Curve and Confusion Matrix - Dataset a](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_a.png)
![ROC Curve and Confusion Matrix - Dataset b](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_b.png)
![ROC Curve and Confusion Matrix - Dataset c](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_c.png)
![ROC Curve and Confusion Matrix - Dataset d](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_d.png)
![ROC Curve and Confusion Matrix - Dataset e](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_e.png)
![ROC Curve and Confusion Matrix - Dataset f](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_f.png)
![ROC Curve and Confusion Matrix - Dataset g](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_g.png)
![ROC Curve and Confusion Matrix - Dataset h](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_h.png)
![ROC Curve and Confusion Matrix - Dataset i](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_i.png)
![ROC Curve and Confusion Matrix - Dataset j](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_j.png)
![ROC Curve and Confusion Matrix - Dataset k](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_k.png)
![ROC Curve and Confusion Matrix - Dataset l](roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_l.png)

### 📋 Performance Summary

Average performance across 12 test datasets from the Common Test Sample (More et al. 2024):

| Metric | Value |
|-----------|----------|
| 🎯 Average Accuracy | 0.7143 |
| 📈 Average AUC-ROC | 0.8680 |
| ⚖️ Average F1-Score | 0.5146 |

## 📘 Citation

If you use this model in your research, please cite:

```bibtex
@misc{parlange2025gravit,
      title={GraViT: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery},
      author={René Parlange and Juan C. Cuevas-Tello and Octavio Valenzuela and Omar de J. Cabrera-Rosas and Tomás Verdugo and Anupreeta More and Anton T. Jaelani},
      year={2025},
      eprint={2509.00226},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2509.00226},
}
```

---

## Model Card Contact

For questions about this model, please contact the author through: https://github.com/parlange/
config.json
ADDED
@@ -0,0 +1,76 @@
{
  "architecture": "vit_base_patch16_224",
  "num_classes": 2,
  "num_features": 1000,
  "global_pool": "avg",
  "crop_pct": 0.875,
  "interpolation": "bicubic",
  "mean": [0.485, 0.456, 0.406],
  "std": [0.229, 0.224, 0.225],
  "first_conv": "conv1",
  "classifier": "fc",
  "input_size": [3, 224, 224],
  "pool_size": [7, 7],
  "pretrained_cfg": {
    "tag": "gravit_c2",
    "custom_load": false,
    "input_size": [3, 224, 224],
    "fixed_input_size": true,
    "interpolation": "bicubic",
    "crop_pct": 0.875,
    "crop_mode": "center",
    "mean": [0.485, 0.456, 0.406],
    "std": [0.229, 0.224, 0.225],
    "num_classes": 2,
    "pool_size": [7, 7],
    "first_conv": "conv1",
    "classifier": "fc"
  },
  "model_name": "mlp-mixer_gravit_c2",
  "experiment": "c2",
  "training_strategy": "half",
  "dataset": "C21+J24",
  "hyperparameters": {
    "batch_size": "192",
    "learning_rate": "AdamW with ReduceLROnPlateau",
    "epochs": "100",
    "patience": "10",
    "optimizer": "AdamW",
    "scheduler": "ReduceLROnPlateau",
    "image_size": "224x224",
    "fine_tune_mode": "half",
    "stochastic_depth_probability": "0.1"
  },
  "hf_hub_id": "parlange/mlp-mixer-gravit-c2",
  "license": "apache-2.0"
}
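The `crop_pct` and `input_size` fields above determine the evaluation-time resize: timm resizes images to roughly `input_size / crop_pct` before center-cropping. A small stdlib sketch of that arithmetic, using values copied from the config above:

```python
import json

# Fragment copied from the config.json above (ImageNet mean/std)
cfg = json.loads("""{
    "input_size": [3, 224, 224],
    "crop_pct": 0.875,
    "interpolation": "bicubic",
    "mean": [0.485, 0.456, 0.406],
    "std": [0.229, 0.224, 0.225]
}""")

crop_size = cfg["input_size"][-1]
resize_size = int(crop_size / cfg["crop_pct"])  # 224 / 0.875 = 256
print(f"resize to {resize_size}, center-crop to {crop_size}")
```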
confusion_matrices/MLP-Mixer_Confusion_Matrix_a.png
ADDED
confusion_matrices/MLP-Mixer_Confusion_Matrix_b.png
ADDED
confusion_matrices/MLP-Mixer_Confusion_Matrix_c.png
ADDED
confusion_matrices/MLP-Mixer_Confusion_Matrix_d.png
ADDED
confusion_matrices/MLP-Mixer_Confusion_Matrix_e.png
ADDED
confusion_matrices/MLP-Mixer_Confusion_Matrix_f.png
ADDED
confusion_matrices/MLP-Mixer_Confusion_Matrix_g.png
ADDED
confusion_matrices/MLP-Mixer_Confusion_Matrix_h.png
ADDED
confusion_matrices/MLP-Mixer_Confusion_Matrix_i.png
ADDED
confusion_matrices/MLP-Mixer_Confusion_Matrix_j.png
ADDED
confusion_matrices/MLP-Mixer_Confusion_Matrix_k.png
ADDED
confusion_matrices/MLP-Mixer_Confusion_Matrix_l.png
ADDED
evaluation_results.csv
ADDED
@@ -0,0 +1,133 @@
Model,Dataset,Loss,Accuracy,AUCROC,F1
ViT,a,0.35447569389258865,0.8949115044247787,0.9020846228498507,0.7480106100795756
ViT,b,0.2228443425890036,0.9264382269726501,0.9263609576427256,0.5465116279069767
ViT,c,0.46229862214933587,0.8349575605155611,0.8684438305709024,0.34944237918215615
ViT,d,0.11673173789463115,0.9537881169443572,0.9704917127071824,0.6573426573426573
ViT,e,0.3652098159562351,0.8825466520307355,0.920767426019829,0.7249357326478149
ViT,f,0.24608832064126923,0.9108519842016175,0.9203345361214339,0.22926829268292684
ViT,g,0.10483654439449311,0.9635,0.997473,0.9644999189495866
ViT,h,0.23178722894191742,0.915,0.9939590555555555,0.9210526315789473
ViT,i,0.04857917896906535,0.978,0.9990572222222223,0.9782966129562644
ViT,j,2.494326035181681,0.6106666666666667,0.5831323333333334,0.42349457058242845
ViT,k,2.4380686638752618,0.6251666666666666,0.7805802777777779,0.43278688524590164
ViT,l,1.0272723743838732,0.8127329565949261,0.7993805230717175,0.7184308053873272
MLP-Mixer,a,1.230455079964832,0.6227876106194691,0.8958911227772556,0.49028400597907323
MLP-Mixer,b,1.0728926989350893,0.7004086765168186,0.9182900552486188,0.25604996096799376
MLP-Mixer,c,1.374837134027586,0.5576862621817039,0.8979152854511969,0.18904899135446687
MLP-Mixer,d,0.09552026474693218,0.9603898145237346,0.9868913443830571,0.7224669603524229
MLP-Mixer,e,0.9593323631422711,0.7069154774972558,0.9188677817301143,0.5512605042016807
MLP-Mixer,f,0.9257462782946794,0.7154410381794245,0.9306221006103087,0.09779367918902802
MLP-Mixer,g,0.5643243643840155,0.8425,0.991425611111111,0.8635773061931572
MLP-Mixer,h,0.7244052359660467,0.7668333333333334,0.9891666111111111,0.8104592873594364
MLP-Mixer,i,0.04615406060218811,0.9803333333333333,0.9994367777777778,0.980655737704918
MLP-Mixer,j,3.0292422666549683,0.45216666666666666,0.392282,0.28309705561613957
MLP-Mixer,k,2.5110719747940697,0.59,0.7661271111111111,0.3453964874933475
MLP-Mixer,l,1.4846716919555334,0.6762053625105207,0.7295511702036557,0.5855010004617516
CvT,a,0.7465745627352621,0.6493362831858407,0.7317079694031161,0.4389380530973451
CvT,b,0.7336456650122649,0.6765168186104998,0.7552670349907918,0.1942051683633516
CvT,c,0.8642418710588097,0.5919522162841874,0.6964806629834255,0.16041397153945666
CvT,d,0.06205783033066015,0.9761081420936812,0.9876427255985267,0.7654320987654321
CvT,e,0.6019917449757506,0.7178924259055982,0.7936123514720351,0.4910891089108911
CvT,f,0.5685286294680824,0.7414895617829603,0.8061353821076506,0.08274941608274941
CvT,g,0.4509977758725484,0.8055,0.9201512777777776,0.8277999114652501
CvT,h,0.5202355206807454,0.7606666666666667,0.9072719444444444,0.7961964235026966
CvT,i,0.09494428576032321,0.9643333333333334,0.9977035555555557,0.9632554945054945
CvT,j,2.988422914981842,0.3456666666666667,0.14668444444444442,0.022896963663514187
CvT,k,2.6323694267769655,0.5045,0.6181494444444444,0.0300163132137031
CvT,l,1.337245315202257,0.645425033064807,0.6032419706344807,0.5021944632005402
Swin,a,0.47572549887463056,0.8407079646017699,0.905882487792577,0.6742081447963801
Swin,b,0.24361524523634911,0.9163784973278843,0.9362615101289135,0.5283687943262412
Swin,c,0.4370936370240709,0.8535051870480981,0.9087605893186003,0.3900523560209424
Swin,d,0.038348094671021904,0.9880540710468406,0.9911620626151013,0.8869047619047619
Swin,e,0.3579506372581067,0.8781558726673985,0.9260273972602739,0.7286063569682152
Swin,f,0.24650774364781286,0.9156479217603912,0.9413092437445593,0.24937238493723848
Swin,g,0.11494702147444089,0.9593333333333334,0.9989898888888888,0.9607969151670951
Swin,h,0.2175228010714054,0.926,0.9979807777777777,0.9308841843088418
Swin,i,0.006121216081082821,0.9973333333333333,0.9999798888888889,0.9973315543695798
Swin,j,2.5422419211069744,0.5825,0.4893003333333333,0.3679031037093111
Swin,k,2.433416116627554,0.6205,0.7913794999999999,0.39036144578313253
Swin,l,1.035569912688268,0.8089455332451605,0.7797953948083542,0.7088143668682426
CaiT,a,0.3509529214517205,0.9081858407079646,0.8966973093999068,0.7726027397260274
CaiT,b,0.1907231829655279,0.9380697893744105,0.9234548802946593,0.5887265135699373
CaiT,c,0.3048490960337163,0.90883370009431,0.8791160220994475,0.493006993006993
CaiT,d,0.06549901952829443,0.9849104055328513,0.969243093922652,0.8545454545454545
CaiT,e,0.31167979835318943,0.9187705817782656,0.9264058124574283,0.7921348314606742
CaiT,f,0.1541684599891403,0.9499717886025955,0.9222261921687871,0.3464373464373464
CaiT,g,0.07805611325552066,0.9708333333333333,0.9986172777777778,0.9714937286202965
CaiT,h,0.13856186520308256,0.9553333333333334,0.997130611111111,0.9569961489088575
CaiT,i,0.011666435472667217,0.9956666666666667,0.9999013333333333,0.9956594323873121
CaiT,j,1.8389671653707822,0.6116666666666667,0.7423962222222222,0.4151606425702811
CaiT,k,1.7725774958133698,0.6365,0.8888650555555555,0.4312907431551499
CaiT,l,0.7395369254032035,0.8362991463268006,0.8693810723675515,0.7436693965922997
DeiT,a,0.48058320357736234,0.8263274336283186,0.8941450218931248,0.6594360086767896
DeiT,b,0.23002449519573911,0.9251807607670544,0.9313581952117864,0.5608856088560885
DeiT,c,0.49494195908204974,0.8154668343288274,0.8907605893186004,0.34118967452300786
DeiT,d,0.05036040664735698,0.9849104055328513,0.9769023941068141,0.8636363636363636
DeiT,e,0.338863200106291,0.8792535675082327,0.9161961704382048,0.7342995169082126
DeiT,f,0.26403015722496653,0.9037050968591311,0.9291450866890099,0.2289156626506024
DeiT,g,0.10851164469867945,0.9641666666666666,0.9990410000000001,0.9653393519264872
DeiT,h,0.2489620513096452,0.906,0.9981344444444444,0.9139194139194139
DeiT,i,0.013259729760388533,0.9958333333333333,0.9998315555555556,0.9958423415932147
DeiT,j,1.2026229511300723,0.7143333333333334,0.7246498888888889,0.6356292517006803
DeiT,k,1.1073710439900557,0.746,0.8698901111111111,0.6623836951705804
DeiT,l,0.5658274294531473,0.8476012985451485,0.867833726587774,0.7854785478547854
DeiT3,a,0.39277621998196155,0.8661504424778761,0.9195532732705195,0.7125890736342043
DeiT3,b,0.338128161960636,0.8824269097767997,0.9331012891344382,0.44510385756676557
DeiT3,c,0.323060417608134,0.8883998742533794,0.922292817679558,0.4580152671755725
DeiT3,d,0.12409640010358478,0.9553599497013517,0.9608121546961326,0.6787330316742082
DeiT3,e,0.24973662732461413,0.9209659714599341,0.9483084840687203,0.8064516129032258
DeiT3,f,0.2540075041596123,0.9116042881324055,0.9380772021883802,0.24193548387096775
DeiT3,g,0.1656125110021482,0.9416666666666667,0.9990236666666666,0.944760101010101
DeiT3,h,0.15762409150910875,0.9448333333333333,0.9990646111111111,0.9476017096723128
DeiT3,i,0.05214000094247361,0.9803333333333333,0.9997376666666667,0.9806684141546527
DeiT3,j,1.1591287109454473,0.696,0.7744774999999999,0.6248457424928013
DeiT3,k,1.0456561943689981,0.7346666666666667,0.845634,0.6561555075593952
DeiT3,l,0.5223108836063022,0.854033906456655,0.8898184372191467,0.7933968686181075
Twins_SVT,a,0.4211153812640536,0.8307522123893806,0.8825833123189902,0.6433566433566433
Twins_SVT,b,0.3625493723054758,0.8550770198050928,0.8962191528545118,0.37449118046132973
Twins_SVT,c,0.47319920195681764,0.7868594781515247,0.8548139963167587,0.2893081761006289
Twins_SVT,d,0.1203458983801289,0.9783087079534738,0.9818324125230202,0.8
Twins_SVT,e,0.5213294555274637,0.7486278814489572,0.8316203738742148,0.5465346534653466
Twins_SVT,f,0.3335461875583885,0.8666541282678202,0.9034523383543173,0.16292798110979928
Twins_SVT,g,0.2639119902451833,0.9085,0.9744078888888889,0.912676952441546
Twins_SVT,h,0.32257486327489215,0.8723333333333333,0.9662636666666669,0.8822263222632226
Twins_SVT,i,0.13550377811988196,0.9738333333333333,0.9972788888888889,0.9733672603901612
Twins_SVT,j,1.2430085968176523,0.49,0.43771377777777776,0.1896186440677966
Twins_SVT,k,1.1146003757913907,0.5553333333333333,0.7234002222222222,0.2115839243498818
Twins_SVT,l,0.6286477774643219,0.7480461704941685,0.7275090480198628,0.6162439337057046
Twins_PCPVT,a,0.45601994748664115,0.7699115044247787,0.8394007473464615,0.5458515283842795
Twins_PCPVT,b,0.3125818614145001,0.8773970449544168,0.9010699815837937,0.390625
Twins_PCPVT,c,0.5049686531944119,0.7500785916378497,0.8135911602209945,0.23923444976076555
Twins_PCPVT,d,0.3149096430453517,0.8918579063187677,0.9015690607734806,0.4208754208754209
Twins_PCPVT,e,0.42039827045572575,0.8079034028540066,0.8655339438431847,0.5882352941176471
Twins_PCPVT,f,0.3770137148085496,0.8412638706037239,0.8693597175042401,0.12899896800825594
Twins_PCPVT,g,0.2785677030881246,0.9015,0.9626754444444443,0.9027480664801711
Twins_PCPVT,h,0.3805647597312927,0.834,0.928301,0.8463437210737427
Twins_PCPVT,i,0.2798018006483714,0.9091666666666667,0.9656723333333334,0.9096335599403084
Twins_PCPVT,j,0.614702238559723,0.6835,0.7995154444444446,0.6018033130635353
Twins_PCPVT,k,0.6159363424777985,0.6911666666666667,0.7903985,0.6076646199449502
Twins_PCPVT,l,0.45535326129802217,0.7889864133702056,0.8498913163479216,0.7103004291845494
PiT,a,0.3937257931823224,0.8296460176991151,0.8874127904755356,0.641860465116279
PiT,b,0.2796248870145521,0.8777114115058158,0.91848802946593,0.4150375939849624
PiT,c,0.5313189482209218,0.7613957874882112,0.8498581952117863,0.26666666666666666
PiT,d,0.049343678185640734,0.9798805407104684,0.9911620626151012,0.8117647058823529
PiT,e,0.3259278782832505,0.8518111964873765,0.9145841216983274,0.6715328467153284
PiT,f,0.2841162405192056,0.8750235094978371,0.9172267022129574,0.17196261682242991
PiT,g,0.1590204114516576,0.9338333333333333,0.9916004444444445,0.9369340746624305
PiT,h,0.2924602138201396,0.8721666666666666,0.981646111111111,0.8849212303075769
PiT,i,0.03693298858900865,0.988,0.999485,0.9879396984924623
PiT,j,2.9977854507366817,0.461,0.277717,0.06477732793522267
PiT,k,2.8756980224698783,0.5151666666666667,0.7229978888888889,0.07149696776252792
PiT,l,1.2244331041709067,0.7434170975111218,0.6790239785353327,0.599849990624414
Ensemble,a,,0.9070796460176991,0.941851401847734,0.79
Ensemble,b,,0.9374410562716127,0.9600349907918969,0.6135922330097088
Ensemble,c,,0.895001571832757,0.9307624309392265,0.48615384615384616
Ensemble,d,,0.9911977365608299,0.9944677716390424,0.9186046511627907
Ensemble,e,,0.9264544456641054,0.955384848255506,0.825065274151436
Ensemble,f,,0.941696445363927,0.9599335198386041,0.33760683760683763
Ensemble,g,,0.9701666666666666,0.9990522222222222,0.9710027539283979
Ensemble,h,,0.9476666666666667,0.9979163333333333,0.9502219403931516
Ensemble,i,,0.9986666666666667,0.9999886666666667,0.9986671109630123
Ensemble,j,,0.5698333333333333,0.6426453333333333,0.31556616282153277
Ensemble,k,,0.5983333333333334,0.8897323333333333,0.33055555555555555
Ensemble,l,,0.8179632078874595,0.832089495815299,0.712386018237082
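The model card's Performance Summary can be reproduced from the MLP-Mixer rows of this file. A quick stdlib check, with the rows copied verbatim from the CSV above (Loss column omitted):

```python
import csv
import io
from statistics import mean

# MLP-Mixer rows from evaluation_results.csv (Loss column dropped)
ROWS = """Model,Dataset,Accuracy,AUCROC,F1
MLP-Mixer,a,0.6227876106194691,0.8958911227772556,0.49028400597907323
MLP-Mixer,b,0.7004086765168186,0.9182900552486188,0.25604996096799376
MLP-Mixer,c,0.5576862621817039,0.8979152854511969,0.18904899135446687
MLP-Mixer,d,0.9603898145237346,0.9868913443830571,0.7224669603524229
MLP-Mixer,e,0.7069154774972558,0.9188677817301143,0.5512605042016807
MLP-Mixer,f,0.7154410381794245,0.9306221006103087,0.09779367918902802
MLP-Mixer,g,0.8425,0.991425611111111,0.8635773061931572
MLP-Mixer,h,0.7668333333333334,0.9891666111111111,0.8104592873594364
MLP-Mixer,i,0.9803333333333333,0.9994367777777778,0.980655737704918
MLP-Mixer,j,0.45216666666666666,0.392282,0.28309705561613957
MLP-Mixer,k,0.59,0.7661271111111111,0.3453964874933475
MLP-Mixer,l,0.6762053625105207,0.7295511702036557,0.5855010004617516"""

rows = list(csv.DictReader(io.StringIO(ROWS)))
averages = {}
for metric in ("Accuracy", "AUCROC", "F1"):
    averages[metric] = mean(float(r[metric]) for r in rows)
    print(f"Average {metric}: {averages[metric]:.4f}")
```

The three printed averages match the card's summary table (0.7143, 0.8680, 0.5146).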
mlp-mixer-gravit-c2.pth
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:563e9b9cbd996faaa6d334a672f02901adc4ccbb3963d4bd3a3444325e3a7ed2
size 236510588
model.safetensors
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:737b37969b009897b8f53149aefc30525d4cfe9cf53ae31da21a53083819787b
size 236466584
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:563e9b9cbd996faaa6d334a672f02901adc4ccbb3963d4bd3a3444325e3a7ed2
size 236510588
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_a.png
ADDED
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_b.png
ADDED
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_c.png
ADDED
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_d.png
ADDED
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_e.png
ADDED
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_f.png
ADDED
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_g.png
ADDED
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_h.png
ADDED
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_i.png
ADDED
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_j.png
ADDED
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_k.png
ADDED
roc_confusion_matrix/MLP-Mixer_roc_confusion_matrix_l.png
ADDED
roc_curves/MLP-Mixer_ROC_a.png
ADDED
roc_curves/MLP-Mixer_ROC_b.png
ADDED
roc_curves/MLP-Mixer_ROC_c.png
ADDED
roc_curves/MLP-Mixer_ROC_d.png
ADDED
roc_curves/MLP-Mixer_ROC_e.png
ADDED
roc_curves/MLP-Mixer_ROC_f.png
ADDED
roc_curves/MLP-Mixer_ROC_g.png
ADDED
roc_curves/MLP-Mixer_ROC_h.png
ADDED
roc_curves/MLP-Mixer_ROC_i.png
ADDED
roc_curves/MLP-Mixer_ROC_j.png
ADDED
roc_curves/MLP-Mixer_ROC_k.png
ADDED
roc_curves/MLP-Mixer_ROC_l.png
ADDED
training_curves/MLP-Mixer_accuracy.png
ADDED
training_curves/MLP-Mixer_auc.png
ADDED
training_curves/MLP-Mixer_combined_metrics.png
ADDED (Git LFS)
training_curves/MLP-Mixer_f1.png
ADDED
training_curves/MLP-Mixer_loss.png
ADDED
training_curves/MLP-Mixer_metrics.csv
ADDED
@@ -0,0 +1,54 @@
epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
1,0.29145442796129817,0.2269907153643255,0.8623319766049868,0.9081632653061225,0.9466478216786088,0.9874839565147174,0.8610343875155365,0.9142857142857143
2,0.20204957357614822,0.10167550577974875,0.9114991278174915,0.9635568513119533,0.9747509783167152,0.9931969247507415,0.9106310226919491,0.9633431085043989
3,0.17887086618263434,0.10648491796182126,0.9244108492663407,0.9577259475218659,0.9801585840676545,0.9932181744001223,0.9235849382801424,0.956973293768546
4,0.16577725969266438,0.13664596892741263,0.9303109074118412,0.9467930029154519,0.9829345377749036,0.9910804596724153,0.9295518982089759,0.9483368719037509
5,0.15496329781000961,0.11067834930934295,0.9354670451824743,0.9599125364431487,0.9851292290649301,0.9946939625496178,0.934942459376751,0.9605734767025089
6,0.14839284566777405,0.12546293252845547,0.9377330095427028,0.9497084548104956,0.9863448689669855,0.9933637344983808,0.9371970677015955,0.9510985116938342
7,0.13993086738980623,0.08488189991639584,0.9415637719328248,0.9642857142857143,0.9878184945854998,0.9957883194927284,0.9411430343116991,0.9644153957879448
8,0.13663571901683397,0.08697146258748654,0.9430259602558402,0.9635568513119533,0.9883418879111432,0.995857380853216,0.9425707415036932,0.9637155297532656
9,0.13304687339368979,0.07450944367720157,0.9444710469610426,0.9686588921282799,0.9889594265582636,0.9965947436867292,0.9440808734887887,0.9686817188638019
10,0.1299397553877151,0.07181748729362085,0.9463436741115709,0.9708454810495627,0.9894615320635669,0.9973161692832069,0.9459624708283458,0.9711399711399712
11,0.12500852142336508,0.09628865459483149,0.9483702158224168,0.9628279883381924,0.9902504191294349,0.9953410143732628,0.9480083351989943,0.9632299927901946
12,0.12014142812203286,0.07017097205640449,0.9506190785648322,0.9730320699708455,0.9909755883820177,0.9969517377963264,0.9502879425664333,0.9728937728937729
13,0.11813922882855793,0.10989063135388989,0.9515083626911106,0.956997084548105,0.9912938098195209,0.9963078734200885,0.9512125878577757,0.958303886925795
14,0.11611644936989314,0.060274891091005624,0.9523035879194172,0.9752186588921283,0.9916136222684293,0.9982798408826256,0.9520419568394807,0.9754335260115607
15,0.11372668762419379,0.058446260141996186,0.9529277969695933,0.9788629737609329,0.9919547533927702,0.9980407823270915,0.9526439391984309,0.978909090909091
16,0.11348895668651156,0.11504732571954977,0.9536033108732086,0.9577259475218659,0.9918929740397044,0.9955853853411419,0.9532821325251412,0.9588652482269504
17,0.1095430688082603,0.061709550733382086,0.95509970243185,0.9737609329446064,0.9924817879117946,0.997915409395745,0.9548289418221545,0.9738372093023255
18,0.10819893803116329,0.06419725399043509,0.9558179703799979,0.9759475218658892,0.9926413665019607,0.9971982337291434,0.9555278220080045,0.9759650400582666
19,0.10534134136542372,0.048838651649458414,0.9571689981872286,0.9803206997084548,0.9930328581091843,0.9986166478253109,0.9569129398811214,0.9802197802197802
20,0.10514715186596389,0.05290277775834551,0.9566217464172111,0.9788629737609329,0.9930816764223723,0.9981013438278268,0.9563879265136992,0.9788475565280816
21,0.10394048473272813,0.059950689206839304,0.9577761056195916,0.978134110787172,0.9932008848529255,0.9972800448792596,0.95754011246969,0.9781021897810219
22,0.10062678119293637,0.0480658602479943,0.9588193043061873,0.9825072886297376,0.9936592004014908,0.9985443990174162,0.9586012447134065,0.982532751091703
23,0.09962694599266829,0.08769501168986799,0.9593665560762048,0.9642857142857143,0.9937719400838347,0.9965224948788345,0.9591282060103555,0.9651741293532339
24,0.09795052473298477,0.05549548572857943,0.9604012039538941,0.9795918367346939,0.9939670729532274,0.9975392906017051,0.9601973373213349,0.9796215429403202
25,0.09846853324300911,0.05438389996224172,0.9600591715976331,0.9803206997084548,0.9938774731279374,0.9976667884979898,0.9598521638231123,0.9803063457330415
26,0.09525587915044993,0.04732373579196176,0.9610681670486028,0.9810495626822158,0.9943149256620569,0.998217154416952,0.9608610062839017,0.9808823529411764
27,0.09438318994134388,0.05725980864745448,0.9613246913157985,0.9774052478134111,0.9944237603472904,0.9977039753844061,0.9611329283068806,0.9776496034607065
28,0.0953016054101448,0.04482401199856061,0.9608800492526594,0.9803206997084548,0.9943009766742883,0.9988992681620753,0.9606658011709984,0.9803921568627451
29,0.09261291457813667,0.05096099536550983,0.9630947087594487,0.9795918367346939,0.9946161501843357,0.9987462706865338,0.9629209621993127,0.9797979797979798
30,0.09029058309192105,0.049803554582743534,0.9633170297910182,0.9832361516034985,0.9948369229044387,0.9986070854830895,0.9631101021566402,0.9831748354059985
31,0.09143641128905206,0.04432327675282868,0.9630776071416356,0.9839650145772595,0.9947710997794625,0.9988036447398617,0.9629076040270762,0.983941605839416
32,0.09062160272046113,0.04714706765912723,0.9637445702363444,0.9817784256559767,0.9947959025943932,0.9986081479655585,0.9635494575402761,0.9818971759594497
33,0.08836965283034487,0.03254114539069246,0.9642148647262031,0.9868804664723032,0.9950900234262274,0.9992987615704341,0.9640467006297196,0.9868995633187773
34,0.08933821861992217,0.04119756786999871,0.9635393508225878,0.9846938775510204,0.9949931605168868,0.9989810793121914,0.9633676975945017,0.9847494553376906
35,0.08831159190945338,0.033931745015323686,0.9639241372233813,0.989067055393586,0.9950557021808398,0.9993030115003102,0.9637147059076484,0.9890909090909091
36,0.08774512386364935,0.057267868484074454,0.9647535656873141,0.9817784256559767,0.995151173832597,0.9986262101675323,0.9645541319116003,0.9819233550253073
37,0.08685428993263906,0.036162022510733534,0.964573998700277,0.9861516034985423,0.9952607266664529,0.9991000773487237,0.9643940630989111,0.9862218999274837
38,0.08388243771738575,0.041764807689163845,0.9658993740807881,0.9832361516034985,0.9955869784708045,0.998943892425775,0.9657635383400295,0.9831006612784717
39,0.08456420776779766,0.0455608130500546,0.966472278277525,0.9846938775510204,0.9955322102429058,0.9989577046978724,0.9663199305955213,0.9846827133479212
40,0.07450469075414327,0.040536254166514116,0.9697728905154428,0.9868804664723032,0.9965213971631838,0.999136201752671,0.9696678479187939,0.9868995633187773
41,0.06978176950217801,0.047243893678699224,0.97191914355098,0.9846938775510204,0.9969264979134604,0.9982044046273237,0.9718097069376964,0.9847715736040609
42,0.06850260023473491,0.04075182299876352,0.9721927694359886,0.9839650145772595,0.9970457331045915,0.9991330143052639,0.9721030779260886,0.9840348330914369
43,0.06811601509771,0.03997669918531363,0.9724150904675583,0.9868804664723032,0.997076006044856,0.9990894525240335,0.9723251664264635,0.9868995633187773
44,0.06806967132065916,0.03781464079031096,0.9726716147347539,0.9861516034985423,0.9970822718221248,0.9991606388494589,0.9725858194232386,0.9862218999274837
|
46 |
+
45,0.0670293473860013,0.03742438893624019,0.9728169784861648,0.9854227405247813,0.9971530266437589,0.9992541373067344,0.9727051833535104,0.9854439592430859
|
47 |
+
46,0.06626332356828091,0.03850126091483166,0.9731846632691452,0.9854227405247813,0.9972171325585393,0.9992127004904419,0.9731050925370063,0.9854439592430859
|
48 |
+
47,0.06726663148398683,0.03823723487621146,0.9732445189314909,0.9861516034985423,0.9971340963864588,0.9992084505605658,0.9731400169967294,0.9861818181818182
|
49 |
+
48,0.06450882592521447,0.040102039715825,0.9743475732804323,0.9868804664723032,0.9973629263802207,0.9991574514020518,0.9742559983523839,0.9869186046511628
|
50 |
+
49,0.06477270891262592,0.04033362669849893,0.9742877176180866,0.9868804664723032,0.9973337512085793,0.9991468265773613,0.9741970361344465,0.9869186046511628
|
51 |
+
50,0.06408690062026325,0.04006394326589111,0.9743646748982454,0.9861516034985423,0.9973965519968166,0.9991510765072376,0.9742669779578383,0.9862018881626725
|
52 |
+
51,0.06347990205634599,0.04069524004637283,0.9741509046755823,0.9861516034985423,0.9974827519126241,0.9991362017526711,0.9740441498450205,0.9862018881626725
|
53 |
+
52,0.06411877819348259,0.04053805566477011,0.9745100386496562,0.9854227405247813,0.9974009598880651,0.9991362017526711,0.9744390043130429,0.9854651162790697
|
54 |
+
53,0.06501926076397624,0.04011943247766606,0.9742706160002736,0.9854227405247813,0.9973301320963308,0.9991436391299543,0.9741896192346952,0.9854651162790697
|
training_metrics.csv
ADDED
@@ -0,0 +1,54 @@
epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
1,0.29145442796129817,0.2269907153643255,0.8623319766049868,0.9081632653061225,0.9466478216786088,0.9874839565147174,0.8610343875155365,0.9142857142857143
2,0.20204957357614822,0.10167550577974875,0.9114991278174915,0.9635568513119533,0.9747509783167152,0.9931969247507415,0.9106310226919491,0.9633431085043989
3,0.17887086618263434,0.10648491796182126,0.9244108492663407,0.9577259475218659,0.9801585840676545,0.9932181744001223,0.9235849382801424,0.956973293768546
4,0.16577725969266438,0.13664596892741263,0.9303109074118412,0.9467930029154519,0.9829345377749036,0.9910804596724153,0.9295518982089759,0.9483368719037509
5,0.15496329781000961,0.11067834930934295,0.9354670451824743,0.9599125364431487,0.9851292290649301,0.9946939625496178,0.934942459376751,0.9605734767025089
6,0.14839284566777405,0.12546293252845547,0.9377330095427028,0.9497084548104956,0.9863448689669855,0.9933637344983808,0.9371970677015955,0.9510985116938342
7,0.13993086738980623,0.08488189991639584,0.9415637719328248,0.9642857142857143,0.9878184945854998,0.9957883194927284,0.9411430343116991,0.9644153957879448
8,0.13663571901683397,0.08697146258748654,0.9430259602558402,0.9635568513119533,0.9883418879111432,0.995857380853216,0.9425707415036932,0.9637155297532656
9,0.13304687339368979,0.07450944367720157,0.9444710469610426,0.9686588921282799,0.9889594265582636,0.9965947436867292,0.9440808734887887,0.9686817188638019
10,0.1299397553877151,0.07181748729362085,0.9463436741115709,0.9708454810495627,0.9894615320635669,0.9973161692832069,0.9459624708283458,0.9711399711399712
11,0.12500852142336508,0.09628865459483149,0.9483702158224168,0.9628279883381924,0.9902504191294349,0.9953410143732628,0.9480083351989943,0.9632299927901946
12,0.12014142812203286,0.07017097205640449,0.9506190785648322,0.9730320699708455,0.9909755883820177,0.9969517377963264,0.9502879425664333,0.9728937728937729
13,0.11813922882855793,0.10989063135388989,0.9515083626911106,0.956997084548105,0.9912938098195209,0.9963078734200885,0.9512125878577757,0.958303886925795
14,0.11611644936989314,0.060274891091005624,0.9523035879194172,0.9752186588921283,0.9916136222684293,0.9982798408826256,0.9520419568394807,0.9754335260115607
15,0.11372668762419379,0.058446260141996186,0.9529277969695933,0.9788629737609329,0.9919547533927702,0.9980407823270915,0.9526439391984309,0.978909090909091
16,0.11348895668651156,0.11504732571954977,0.9536033108732086,0.9577259475218659,0.9918929740397044,0.9955853853411419,0.9532821325251412,0.9588652482269504
17,0.1095430688082603,0.061709550733382086,0.95509970243185,0.9737609329446064,0.9924817879117946,0.997915409395745,0.9548289418221545,0.9738372093023255
18,0.10819893803116329,0.06419725399043509,0.9558179703799979,0.9759475218658892,0.9926413665019607,0.9971982337291434,0.9555278220080045,0.9759650400582666
19,0.10534134136542372,0.048838651649458414,0.9571689981872286,0.9803206997084548,0.9930328581091843,0.9986166478253109,0.9569129398811214,0.9802197802197802
20,0.10514715186596389,0.05290277775834551,0.9566217464172111,0.9788629737609329,0.9930816764223723,0.9981013438278268,0.9563879265136992,0.9788475565280816
21,0.10394048473272813,0.059950689206839304,0.9577761056195916,0.978134110787172,0.9932008848529255,0.9972800448792596,0.95754011246969,0.9781021897810219
22,0.10062678119293637,0.0480658602479943,0.9588193043061873,0.9825072886297376,0.9936592004014908,0.9985443990174162,0.9586012447134065,0.982532751091703
23,0.09962694599266829,0.08769501168986799,0.9593665560762048,0.9642857142857143,0.9937719400838347,0.9965224948788345,0.9591282060103555,0.9651741293532339
24,0.09795052473298477,0.05549548572857943,0.9604012039538941,0.9795918367346939,0.9939670729532274,0.9975392906017051,0.9601973373213349,0.9796215429403202
25,0.09846853324300911,0.05438389996224172,0.9600591715976331,0.9803206997084548,0.9938774731279374,0.9976667884979898,0.9598521638231123,0.9803063457330415
26,0.09525587915044993,0.04732373579196176,0.9610681670486028,0.9810495626822158,0.9943149256620569,0.998217154416952,0.9608610062839017,0.9808823529411764
27,0.09438318994134388,0.05725980864745448,0.9613246913157985,0.9774052478134111,0.9944237603472904,0.9977039753844061,0.9611329283068806,0.9776496034607065
28,0.0953016054101448,0.04482401199856061,0.9608800492526594,0.9803206997084548,0.9943009766742883,0.9988992681620753,0.9606658011709984,0.9803921568627451
29,0.09261291457813667,0.05096099536550983,0.9630947087594487,0.9795918367346939,0.9946161501843357,0.9987462706865338,0.9629209621993127,0.9797979797979798
30,0.09029058309192105,0.049803554582743534,0.9633170297910182,0.9832361516034985,0.9948369229044387,0.9986070854830895,0.9631101021566402,0.9831748354059985
31,0.09143641128905206,0.04432327675282868,0.9630776071416356,0.9839650145772595,0.9947710997794625,0.9988036447398617,0.9629076040270762,0.983941605839416
32,0.09062160272046113,0.04714706765912723,0.9637445702363444,0.9817784256559767,0.9947959025943932,0.9986081479655585,0.9635494575402761,0.9818971759594497
33,0.08836965283034487,0.03254114539069246,0.9642148647262031,0.9868804664723032,0.9950900234262274,0.9992987615704341,0.9640467006297196,0.9868995633187773
34,0.08933821861992217,0.04119756786999871,0.9635393508225878,0.9846938775510204,0.9949931605168868,0.9989810793121914,0.9633676975945017,0.9847494553376906
35,0.08831159190945338,0.033931745015323686,0.9639241372233813,0.989067055393586,0.9950557021808398,0.9993030115003102,0.9637147059076484,0.9890909090909091
36,0.08774512386364935,0.057267868484074454,0.9647535656873141,0.9817784256559767,0.995151173832597,0.9986262101675323,0.9645541319116003,0.9819233550253073
37,0.08685428993263906,0.036162022510733534,0.964573998700277,0.9861516034985423,0.9952607266664529,0.9991000773487237,0.9643940630989111,0.9862218999274837
38,0.08388243771738575,0.041764807689163845,0.9658993740807881,0.9832361516034985,0.9955869784708045,0.998943892425775,0.9657635383400295,0.9831006612784717
39,0.08456420776779766,0.0455608130500546,0.966472278277525,0.9846938775510204,0.9955322102429058,0.9989577046978724,0.9663199305955213,0.9846827133479212
40,0.07450469075414327,0.040536254166514116,0.9697728905154428,0.9868804664723032,0.9965213971631838,0.999136201752671,0.9696678479187939,0.9868995633187773
41,0.06978176950217801,0.047243893678699224,0.97191914355098,0.9846938775510204,0.9969264979134604,0.9982044046273237,0.9718097069376964,0.9847715736040609
42,0.06850260023473491,0.04075182299876352,0.9721927694359886,0.9839650145772595,0.9970457331045915,0.9991330143052639,0.9721030779260886,0.9840348330914369
43,0.06811601509771,0.03997669918531363,0.9724150904675583,0.9868804664723032,0.997076006044856,0.9990894525240335,0.9723251664264635,0.9868995633187773
44,0.06806967132065916,0.03781464079031096,0.9726716147347539,0.9861516034985423,0.9970822718221248,0.9991606388494589,0.9725858194232386,0.9862218999274837
45,0.0670293473860013,0.03742438893624019,0.9728169784861648,0.9854227405247813,0.9971530266437589,0.9992541373067344,0.9727051833535104,0.9854439592430859
46,0.06626332356828091,0.03850126091483166,0.9731846632691452,0.9854227405247813,0.9972171325585393,0.9992127004904419,0.9731050925370063,0.9854439592430859
47,0.06726663148398683,0.03823723487621146,0.9732445189314909,0.9861516034985423,0.9971340963864588,0.9992084505605658,0.9731400169967294,0.9861818181818182
48,0.06450882592521447,0.040102039715825,0.9743475732804323,0.9868804664723032,0.9973629263802207,0.9991574514020518,0.9742559983523839,0.9869186046511628
49,0.06477270891262592,0.04033362669849893,0.9742877176180866,0.9868804664723032,0.9973337512085793,0.9991468265773613,0.9741970361344465,0.9869186046511628
50,0.06408690062026325,0.04006394326589111,0.9743646748982454,0.9861516034985423,0.9973965519968166,0.9991510765072376,0.9742669779578383,0.9862018881626725
51,0.06347990205634599,0.04069524004637283,0.9741509046755823,0.9861516034985423,0.9974827519126241,0.9991362017526711,0.9740441498450205,0.9862018881626725
52,0.06411877819348259,0.04053805566477011,0.9745100386496562,0.9854227405247813,0.9974009598880651,0.9991362017526711,0.9744390043130429,0.9854651162790697
53,0.06501926076397624,0.04011943247766606,0.9742706160002736,0.9854227405247813,0.9973301320963308,0.9991436391299543,0.9741896192346952,0.9854651162790697
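The per-epoch metrics in `training_metrics.csv` can be inspected programmatically, for example to find the checkpoint with the lowest validation loss. A minimal standard-library sketch; the inline `sample` string excerpts three rows of the table above for self-containment, and in practice you would pass an open handle to the real CSV file to `csv.DictReader` instead:

```python
import csv
import io

# Three rows excerpted from training_metrics.csv above; replace this inline
# sample with open("training_metrics.csv") to process the full file.
sample = """epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
33,0.08836965283034487,0.03254114539069246,0.9642148647262031,0.9868804664723032,0.9950900234262274,0.9992987615704341,0.9640467006297196,0.9868995633187773
45,0.0670293473860013,0.03742438893624019,0.9728169784861648,0.9854227405247813,0.9971530266437589,0.9992541373067344,0.9727051833535104,0.9854439592430859
53,0.06501926076397624,0.04011943247766606,0.9742706160002736,0.9854227405247813,0.9973301320963308,0.9991436391299543,0.9741896192346952,0.9854651162790697
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Select the epoch that minimizes validation loss (values parse as floats).
best = min(rows, key=lambda r: float(r["val_loss"]))
print(best["epoch"], best["val_loss"])  # → 33 0.03254114539069246
```

On the full table this picks epoch 33 (val_loss ≈ 0.0325, val_accuracy ≈ 0.9869), since later epochs keep improving training loss while validation loss plateaus slightly higher.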