Upload CvT model from experiment s2
This view is limited to 50 files because it contains too many changes.
- .gitattributes +2 -0
- README.md +165 -0
- config.json +76 -0
- confusion_matrices/CvT_Confusion_Matrix_a.png +0 -0
- confusion_matrices/CvT_Confusion_Matrix_b.png +0 -0
- confusion_matrices/CvT_Confusion_Matrix_c.png +0 -0
- confusion_matrices/CvT_Confusion_Matrix_d.png +0 -0
- confusion_matrices/CvT_Confusion_Matrix_e.png +0 -0
- confusion_matrices/CvT_Confusion_Matrix_f.png +0 -0
- confusion_matrices/CvT_Confusion_Matrix_g.png +0 -0
- confusion_matrices/CvT_Confusion_Matrix_h.png +0 -0
- confusion_matrices/CvT_Confusion_Matrix_i.png +0 -0
- confusion_matrices/CvT_Confusion_Matrix_j.png +0 -0
- confusion_matrices/CvT_Confusion_Matrix_k.png +0 -0
- confusion_matrices/CvT_Confusion_Matrix_l.png +0 -0
- cvt-gravit-s2.pth +3 -0
- evaluation_results.csv +133 -0
- model.safetensors +3 -0
- pytorch_model.bin +3 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_a.png +0 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_b.png +0 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_c.png +0 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_d.png +0 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_e.png +0 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_f.png +0 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_g.png +0 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_h.png +0 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_i.png +0 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_j.png +0 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_k.png +0 -0
- roc_confusion_matrix/CvT_roc_confusion_matrix_l.png +0 -0
- roc_curves/CvT_ROC_a.png +0 -0
- roc_curves/CvT_ROC_b.png +0 -0
- roc_curves/CvT_ROC_c.png +0 -0
- roc_curves/CvT_ROC_d.png +0 -0
- roc_curves/CvT_ROC_e.png +0 -0
- roc_curves/CvT_ROC_f.png +0 -0
- roc_curves/CvT_ROC_g.png +0 -0
- roc_curves/CvT_ROC_h.png +0 -0
- roc_curves/CvT_ROC_i.png +0 -0
- roc_curves/CvT_ROC_j.png +0 -0
- roc_curves/CvT_ROC_k.png +0 -0
- roc_curves/CvT_ROC_l.png +0 -0
- training_curves/CvT_accuracy.png +0 -0
- training_curves/CvT_auc.png +0 -0
- training_curves/CvT_combined_metrics.png +3 -0
- training_curves/CvT_f1.png +0 -0
- training_curves/CvT_loss.png +0 -0
- training_curves/CvT_metrics.csv +45 -0
- training_metrics.csv +45 -0
.gitattributes
CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+training_curves/CvT_combined_metrics.png filter=lfs diff=lfs merge=lfs -text
+training_notebook_s2.ipynb filter=lfs diff=lfs merge=lfs -text
README.md
ADDED
@@ -0,0 +1,165 @@
---
license: apache-2.0
tags:
- vision-transformer
- image-classification
- pytorch
- timm
- cvt
- gravitational-lensing
- strong-lensing
- astronomy
- astrophysics
datasets:
- parlange/gravit-c21
metrics:
- accuracy
- auc
- f1
paper:
- title: "GraViT: A Gravitational Lens Discovery Toolkit with Vision Transformers"
  url: "https://arxiv.org/abs/2509.00226"
  authors: "Parlange et al."
model-index:
- name: CvT-s2
  results:
  - task:
      type: image-classification
      name: Strong Gravitational Lens Discovery
    dataset:
      type: common-test-sample
      name: Common Test Sample (More et al. 2024)
    metrics:
    - type: accuracy
      value: 0.6852
      name: Average Accuracy
    - type: auc
      value: 0.7569
      name: Average AUC-ROC
    - type: f1
      value: 0.4201
      name: Average F1-Score
---

# 🌌 cvt-gravit-s2

🔭 This model is part of **GraViT**: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery

🔗 **GitHub Repository**: [https://github.com/parlange/gravit](https://github.com/parlange/gravit)

## 🛰️ Model Details

- **🤖 Model Type**: CvT
- **🧪 Experiment**: S2 - C21-half-18660
- **🌌 Dataset**: C21
- **🪐 Fine-tuning Strategy**: half
- **🎲 Random Seed**: 18660

## 💻 Quick Start

```python
import torch
import timm

# Load the model directly from the Hub
model = timm.create_model(
    'hf-hub:parlange/cvt-gravit-s2',
    pretrained=True
)
model.eval()

# Example inference
dummy_input = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    output = model(dummy_input)
    predictions = torch.softmax(output, dim=1)
    print(f"Lens probability: {predictions[0][1]:.4f}")
```
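
The snippet above runs on a random tensor. For real survey cutouts, the input should be resized and normalized the way the model expects (224×224, bicubic, ImageNet mean/std, as listed in `config.json`). A minimal sketch using timm's data-config helpers, assuming a recent timm release and a placeholder image path:

```python
from PIL import Image
import timm
import torch

# Build the preprocessing pipeline from the model's pretrained config
# (224x224 input, bicubic interpolation, ImageNet mean/std).
model = timm.create_model('hf-hub:parlange/cvt-gravit-s2', pretrained=True)
model.eval()

data_config = timm.data.resolve_model_data_config(model)
transform = timm.data.create_transform(**data_config, is_training=False)

# 'cutout.png' is a placeholder path to a survey image cutout
image = Image.open('cutout.png').convert('RGB')
batch = transform(image).unsqueeze(0)  # shape: (1, 3, 224, 224)

with torch.no_grad():
    probs = torch.softmax(model(batch), dim=1)
print(f"Lens probability: {probs[0, 1]:.4f}")
```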

## ⚡️ Training Configuration

**Training Dataset:** C21 (Cañameras et al. 2021)
**Fine-tuning Strategy:** half

| 🔧 Parameter | 📝 Value |
|--------------|----------|
| Batch Size | 192 |
| Learning Rate | AdamW with ReduceLROnPlateau |
| Epochs | 100 |
| Patience | 10 |
| Optimizer | AdamW |
| Scheduler | ReduceLROnPlateau |
| Image Size | 224x224 |
| Fine Tune Mode | half |
| Stochastic Depth Probability | 0.1 |
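
A minimal sketch of the optimizer and scheduler setup implied by this table (AdamW plus ReduceLROnPlateau with patience 10, up to 100 epochs). The initial learning rate is not stated in this card, so the value below is only a placeholder, and random tensors stand in for the C21 dataloader:

```python
import timm
import torch
import torch.nn.functional as F

# Placeholder learning rate: the actual value is not listed in this card.
model = timm.create_model('hf-hub:parlange/cvt-gravit-s2', pretrained=True)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.1, patience=10
)

for epoch in range(100):  # Epochs: 100
    model.train()
    images = torch.randn(8, 3, 224, 224)   # stand-in batch (real batch size: 192)
    labels = torch.randint(0, 2, (8,))
    loss = F.cross_entropy(model(images), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = F.cross_entropy(model(torch.randn(8, 3, 224, 224)),
                                   torch.randint(0, 2, (8,)))
    scheduler.step(val_loss)  # reduce LR when validation loss stops improving
```
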
## 📈 Training Curves

![CvT combined training metrics](training_curves/CvT_combined_metrics.png)

## 🏁 Final Epoch Training Metrics

| Metric | Training | Validation |
|:---------:|:-----------:|:-------------:|
| 📉 Loss | 0.3223 | 0.2497 |
| 🎯 Accuracy | 0.8233 | 0.8970 |
| 📊 AUC-ROC | 0.9285 | 0.9638 |
| ⚖️ F1 Score | 0.8232 | 0.8987 |

## ☑️ Evaluation Results

### ROC Curves and Confusion Matrices

Performance across all test datasets (a through l) in the Common Test Sample (More et al. 2024):

![CvT ROC and confusion matrix, test set a](roc_confusion_matrix/CvT_roc_confusion_matrix_a.png)
![CvT ROC and confusion matrix, test set b](roc_confusion_matrix/CvT_roc_confusion_matrix_b.png)
![CvT ROC and confusion matrix, test set c](roc_confusion_matrix/CvT_roc_confusion_matrix_c.png)
![CvT ROC and confusion matrix, test set d](roc_confusion_matrix/CvT_roc_confusion_matrix_d.png)
![CvT ROC and confusion matrix, test set e](roc_confusion_matrix/CvT_roc_confusion_matrix_e.png)
![CvT ROC and confusion matrix, test set f](roc_confusion_matrix/CvT_roc_confusion_matrix_f.png)
![CvT ROC and confusion matrix, test set g](roc_confusion_matrix/CvT_roc_confusion_matrix_g.png)
![CvT ROC and confusion matrix, test set h](roc_confusion_matrix/CvT_roc_confusion_matrix_h.png)
![CvT ROC and confusion matrix, test set i](roc_confusion_matrix/CvT_roc_confusion_matrix_i.png)
![CvT ROC and confusion matrix, test set j](roc_confusion_matrix/CvT_roc_confusion_matrix_j.png)
![CvT ROC and confusion matrix, test set k](roc_confusion_matrix/CvT_roc_confusion_matrix_k.png)
![CvT ROC and confusion matrix, test set l](roc_confusion_matrix/CvT_roc_confusion_matrix_l.png)

### 📋 Performance Summary

Average performance across 12 test datasets from the Common Test Sample (More et al. 2024):

| Metric | Value |
|-----------|----------|
| 🎯 Average Accuracy | 0.6852 |
| 📈 Average AUC-ROC | 0.7569 |
| ⚖️ Average F1-Score | 0.4201 |

## 📘 Citation

If you use this model in your research, please cite:

```bibtex
@misc{parlange2025gravit,
      title={GraViT: Transfer Learning with Vision Transformers and MLP-Mixer for Strong Gravitational Lens Discovery},
      author={René Parlange and Juan C. Cuevas-Tello and Octavio Valenzuela and Omar de J. Cabrera-Rosas and Tomás Verdugo and Anupreeta More and Anton T. Jaelani},
      year={2025},
      eprint={2509.00226},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2509.00226},
}
```

---

## Model Card Contact

For questions about this model, please contact the author through: https://github.com/parlange/
config.json
ADDED
@@ -0,0 +1,76 @@
{
    "architecture": "cvt_13_224",
    "num_classes": 2,
    "num_features": 1000,
    "global_pool": "avg",
    "crop_pct": 0.875,
    "interpolation": "bicubic",
    "mean": [
        0.485,
        0.456,
        0.406
    ],
    "std": [
        0.229,
        0.224,
        0.225
    ],
    "first_conv": "conv1",
    "classifier": "fc",
    "input_size": [
        3,
        224,
        224
    ],
    "pool_size": [
        7,
        7
    ],
    "pretrained_cfg": {
        "tag": "gravit_s2",
        "custom_load": false,
        "input_size": [
            3,
            224,
            224
        ],
        "fixed_input_size": true,
        "interpolation": "bicubic",
        "crop_pct": 0.875,
        "crop_mode": "center",
        "mean": [
            0.485,
            0.456,
            0.406
        ],
        "std": [
            0.229,
            0.224,
            0.225
        ],
        "num_classes": 2,
        "pool_size": [
            7,
            7
        ],
        "first_conv": "conv1",
        "classifier": "fc"
    },
    "model_name": "cvt_gravit_s2",
    "experiment": "s2",
    "training_strategy": "half",
    "dataset": "C21",
    "hyperparameters": {
        "batch_size": "192",
        "learning_rate": "AdamW with ReduceLROnPlateau",
        "epochs": "100",
        "patience": "10",
        "optimizer": "AdamW",
        "scheduler": "ReduceLROnPlateau",
        "image_size": "224x224",
        "fine_tune_mode": "half",
        "stochastic_depth_probability": "0.1"
    },
    "hf_hub_id": "parlange/cvt-gravit-s2",
    "license": "apache-2.0"
}
confusion_matrices/CvT_Confusion_Matrix_a.png
ADDED
|
confusion_matrices/CvT_Confusion_Matrix_b.png
ADDED
|
confusion_matrices/CvT_Confusion_Matrix_c.png
ADDED
|
confusion_matrices/CvT_Confusion_Matrix_d.png
ADDED
|
confusion_matrices/CvT_Confusion_Matrix_e.png
ADDED
|
confusion_matrices/CvT_Confusion_Matrix_f.png
ADDED
|
confusion_matrices/CvT_Confusion_Matrix_g.png
ADDED
|
confusion_matrices/CvT_Confusion_Matrix_h.png
ADDED
|
confusion_matrices/CvT_Confusion_Matrix_i.png
ADDED
|
confusion_matrices/CvT_Confusion_Matrix_j.png
ADDED
|
confusion_matrices/CvT_Confusion_Matrix_k.png
ADDED
|
confusion_matrices/CvT_Confusion_Matrix_l.png
ADDED
|
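
The `confusion_matrices/` folder above holds one figure per Common Test Sample dataset (a–l). A hedged sketch of how such a figure can be regenerated with scikit-learn; the labels and scores below are placeholders for real model outputs, not the project's actual evaluation code:

```python
import matplotlib.pyplot as plt
import numpy as np
from sklearn.metrics import ConfusionMatrixDisplay, confusion_matrix

# Placeholder predictions; in practice y_true/y_score come from running the
# model over one of the Common Test Sample datasets (a-l).
y_true = np.random.randint(0, 2, size=200)
y_score = np.random.rand(200)            # P(lens) from the softmax output
y_pred = (y_score >= 0.5).astype(int)    # default 0.5 decision threshold

cm = confusion_matrix(y_true, y_pred)
ConfusionMatrixDisplay(cm, display_labels=["non-lens", "lens"]).plot(cmap="Blues")
plt.title("CvT Confusion Matrix (test set a)")
plt.savefig("CvT_Confusion_Matrix_a.png", dpi=150, bbox_inches="tight")
```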
cvt-gravit-s2.pth
ADDED
|
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:935c2d8b222977ead0026a69503106db45585d3b04150c95876d200528c37df5
size 125471131
evaluation_results.csv
ADDED
|
@@ -0,0 +1,133 @@
Model,Dataset,Loss,Accuracy,AUCROC,F1
ViT,a,0.26081677189425306,0.9402703552342031,0.948634438305709,0.602510460251046
ViT,b,0.22714076965991073,0.9490726186733731,0.954792817679558,0.64
ViT,c,0.5409317808831798,0.8802263439170073,0.9180930018416207,0.4304932735426009
ViT,d,0.2093577366257868,0.9440427538509902,0.9635985267034991,0.6180257510729614
ViT,e,0.626060025425041,0.8693743139407245,0.9101869371073942,0.7076167076167076
ViT,f,0.28432284784108347,0.9316861590891488,0.9442099621115129,0.24615384615384617
ViT,g,0.0920504999384284,0.9766666666666667,0.9990343333333334,0.9770867430441899
ViT,h,0.2584120511338115,0.9401666666666667,0.9969072222222223,0.9432769789856218
ViT,i,0.08262255262583494,0.974,0.9991285555555557,0.9745347698334965
ViT,j,4.850409872978926,0.5285,0.5466985,0.17304881613563286
ViT,k,4.840981937706471,0.5258333333333334,0.6174096111111111,0.17224323537969158
ViT,l,1.707986980357526,0.8095817249220031,0.7775850352542502,0.6554396708448952
MLP-Mixer,a,0.3746531411129437,0.8808550770198051,0.9587697974217311,0.4669479606188467
MLP-Mixer,b,0.4263077380896289,0.8786545111600126,0.9555377532228361,0.4623955431754875
MLP-Mixer,c,0.7193417321657543,0.8101226029550456,0.9323139963167587,0.3547008547008547
MLP-Mixer,d,0.06999178051348941,0.9739075762338887,0.9913977900552486,0.8
MLP-Mixer,e,0.5372455838482937,0.8397365532381997,0.938428063271021,0.694560669456067
MLP-Mixer,f,0.40886121419995697,0.8808767717450237,0.9582961898851193,0.17754010695187167
MLP-Mixer,g,0.220160967502743,0.936,0.9966087777777778,0.9396036489462095
MLP-Mixer,h,0.3755178214646876,0.8996666666666666,0.9936971666666665,0.9084549878345499
MLP-Mixer,i,0.03125413155928254,0.9865,0.9996044444444444,0.9866226259289843
MLP-Mixer,j,4.689226661682129,0.45866666666666667,0.35093755555555556,0.07040641099026904
MLP-Mixer,k,4.50031984564662,0.5091666666666667,0.5708096111111112,0.0770918207458477
MLP-Mixer,l,1.7004726856614425,0.7658505631642959,0.7001407272343229,0.5967213114754099
CvT,a,0.7275567101967556,0.6862621817038667,0.8178130755064457,0.223950233281493
CvT,b,0.8674648388381725,0.6309336686576549,0.7877642725598526,0.19699042407660738
CvT,c,0.8444347186157517,0.6369066331342346,0.7903591160220995,0.1995841995841996
CvT,d,0.06786359480738227,0.9761081420936812,0.984121546961326,0.7912087912087912
CvT,e,1.079856214224965,0.544456641053787,0.714894422159994,0.4096728307254623
CvT,f,0.6667535994093697,0.7157462628766168,0.8375527856501153,0.07276402223345124
CvT,g,0.4902210609912872,0.7958333333333333,0.9368645,0.8262164846077458
CvT,h,0.4780112521648407,0.799,0.9404210555555556,0.8284495021337127
CvT,i,0.06629910692572594,0.9788333333333333,0.998182,0.9786590488993446
CvT,j,3.285769058704376,0.31566666666666665,0.11980044444444445,0.014875239923224568
CvT,k,2.8618470991551876,0.49866666666666665,0.5598825555555555,0.020195439739413682
CvT,l,1.3706902678378083,0.6442811062344667,0.5954304635509091,0.4785675529028757
Swin,a,0.17904385026568925,0.9320968248978309,0.9377541436464087,0.5573770491803278
Swin,b,0.16908444278234958,0.9302106255894372,0.9438968692449355,0.5506072874493927
Swin,c,0.3163154192856744,0.8805407104684062,0.9053370165745855,0.4171779141104294
Swin,d,0.05074489927894287,0.9833385727758567,0.9865469613259669,0.8369230769230769
Swin,e,0.4723492567023645,0.7859495060373216,0.8619995459017633,0.582441113490364
Swin,f,0.17172798980294332,0.9313763457516846,0.9387167824732112,0.23488773747841105
Swin,g,0.07050645374506712,0.9703333333333334,0.9997747777777777,0.971178756476684
Swin,h,0.14856340130418538,0.944,0.9992891111111111,0.9469529523208083
Swin,i,0.007766767971217632,0.9985,0.9999895555555555,0.9985017479607124
Swin,j,2.8674492346346376,0.485,0.43114961111111116,0.05330882352941176
Swin,k,2.804709552191198,0.5131666666666667,0.6342635555555556,0.056219709208400644
Swin,l,1.0054080692499547,0.7990587488763153,0.7133887608594007,0.6290511518937915
CaiT,a,0.4621231952897036,0.821754165356806,0.9297688766114179,0.36363636363636365
CaiT,b,0.258825070179322,0.9122917321596982,0.9613351749539596,0.5373134328358209
CaiT,c,0.8353647122552507,0.685004715498271,0.8978766114180479,0.24434389140271492
CaiT,d,0.031897671438732525,0.9902546369066332,0.9952596685082873,0.9126760563380282
CaiT,e,0.4524560969171618,0.8463227222832053,0.9379777491864074,0.6982758620689655
CaiT,f,0.4045827676295477,0.8495081713267756,0.9455966026222479,0.14292015880017644
CaiT,g,0.12979762570466846,0.9553333333333334,0.9989632222222222,0.9571337172104927
CaiT,h,0.43545972697343677,0.8348333333333333,0.9963943888888888,0.857921146953405
CaiT,i,0.009488287813030183,0.9966666666666667,0.9999612222222222,0.9966688874083944
CaiT,j,3.0854302213191986,0.4766666666666667,0.32313933333333333,0.07100591715976332
CaiT,k,2.9651209139451384,0.518,0.598769,0.07662835249042145
CaiT,l,1.2156231775795154,0.7445402146898631,0.6535540584867622,0.5754459970120397
DeiT,a,0.21973140770554805,0.9163784973278843,0.9421279926335173,0.5333333333333333
DeiT,b,0.18026995446669986,0.9349261238604213,0.9484125230202579,0.5949119373776908
DeiT,c,0.24070323096322999,0.9154353976736875,0.9384106813996317,0.5305410122164049
DeiT,d,0.045521379231525945,0.9877397044954417,0.9928232044198896,0.8862973760932945
DeiT,e,0.6630487046833227,0.7760702524698134,0.8773329296904565,0.5984251968503937
DeiT,f,0.1809546241546691,0.9326930524359074,0.9509643553098133,0.2591645353793691
DeiT,g,0.08131901465170085,0.9685,0.9991698888888889,0.9693530079455165
DeiT,h,0.11335872827284038,0.9581666666666667,0.9988661111111111,0.9597046074811366
DeiT,i,0.009879813833162188,0.9965,0.9999472222222223,0.9964994165694282
DeiT,j,3.730827052116394,0.49016666666666664,0.3371602222222222,0.07218683651804671
DeiT,k,3.6593878800719977,0.5181666666666667,0.7405100555555556,0.07606263982102908
DeiT,l,1.2843114430264382,0.8011210406641637,0.708836413391112,0.634179554518043
DeiT3,a,0.10351562584202569,0.9666771455517132,0.967061694290976,0.7309644670050761
DeiT3,b,0.10786095303265189,0.9651053127947187,0.9586629834254143,0.7218045112781954
DeiT3,c,0.12137072830127493,0.9622760138321282,0.9632596685082873,0.7058823529411765
DeiT3,d,0.05938508253192066,0.9852247720842502,0.9764714548802946,0.8597014925373134
DeiT3,e,0.2852305471569987,0.9198682766190999,0.9255581624158026,0.7977839335180056
DeiT3,f,0.06806116103929771,0.9760669196808922,0.9640239483015283,0.4824120603015075
DeiT3,g,0.03457943443208933,0.9858333333333333,0.9997122222222222,0.9859805376876134
DeiT3,h,0.04174186686426401,0.9843333333333333,0.99965,0.9845191040843215
DeiT3,i,0.00887914503365755,0.9965,0.9999582222222222,0.9964994165694282
DeiT3,j,3.737933710604906,0.519,0.5266332222222222,0.11526670754138565
DeiT3,k,3.71223342192173,0.5296666666666666,0.6497483333333334,0.11757348342714197
DeiT3,l,1.2237240775665559,0.8343821056527947,0.8068982843173076,0.6795580110497238
Twins_SVT,a,0.42038975148709157,0.8192392329456146,0.8869475138121546,0.33983926521239954
Twins_SVT,b,0.4005945102762254,0.8091795033008488,0.8857955801104973,0.327796234772979
Twins_SVT,c,0.4493913765093771,0.8054071046840616,0.87690423572744,0.32349726775956283
Twins_SVT,d,0.10364232180450968,0.9761081420936812,0.9843406998158379,0.7956989247311828
Twins_SVT,e,0.5988346448190649,0.70801317233809,0.8065163096949973,0.5266903914590747
Twins_SVT,f,0.3545602475236788,0.8442413445898846,0.9026489390789582,0.1283051582141309
Twins_SVT,g,0.2474187276363373,0.8956666666666667,0.9830134444444444,0.9040171726464274
Twins_SVT,h,0.273289222240448,0.8936666666666667,0.9851210555555556,0.9023569023569024
Twins_SVT,i,0.0899845923781395,0.9841666666666666,0.9985422222222222,0.9841428809881488
Twins_SVT,j,2.6232716159820555,0.4155,0.1803668888888889,0.03680307607800055
Twins_SVT,k,2.4658374714255333,0.504,0.3982547777777778,0.043086816720257236
Twins_SVT,l,1.027051387313659,0.7358151340489663,0.5966640795291248,0.5587352057940294
Twins_PCPVT,a,0.430052495628586,0.8000628733102798,0.8719401473296501,0.3041575492341357
Twins_PCPVT,b,0.3465015141235437,0.8560201194592896,0.9034383057090241,0.37771739130434784
Twins_PCPVT,c,0.4657778265396779,0.7752279157497642,0.857121546961326,0.2799597180261833
Twins_PCPVT,d,0.2879813692922751,0.9056900345803206,0.9308342541436465,0.4809688581314879
Twins_PCPVT,e,0.4774637080311906,0.7716794731064764,0.8551956406569287,0.5720164609053497
Twins_PCPVT,f,0.3843120121954025,0.8335527844473705,0.8887894780242434,0.11454470539761022
Twins_PCPVT,g,0.25515789008140566,0.9081666666666667,0.9717176666666667,0.9122751154274797
Twins_PCPVT,h,0.31839424538612365,0.8653333333333333,0.9604102222222224,0.8764148057509942
Twins_PCPVT,i,0.2241324300765991,0.9345,0.9829576666666666,0.9358157765801078
Twins_PCPVT,j,1.4245353891849517,0.49833333333333335,0.46813805555555554,0.21245421245421245
Twins_PCPVT,k,1.3935099244117737,0.5246666666666666,0.4594932222222222,0.22161572052401746
Twins_PCPVT,l,0.6877527087257096,0.7420548886891227,0.697346002300591,0.5830056419900838
PiT,a,0.45982032410006446,0.7887456774599183,0.924538674033149,0.3225806451612903
PiT,b,0.4223166468832261,0.8010059729644766,0.9304309392265193,0.3357817418677859
PiT,c,0.6344863635794392,0.7136120716755737,0.8947255985267035,0.25995125913891143
PiT,d,0.03659489378333092,0.9845960389814524,0.9966464088397791,0.8672086720867209
PiT,e,0.6467190528151494,0.7200878155872668,0.8809127374555362,0.5565217391304348
PiT,f,0.41419300931409125,0.8113236774843157,0.93339286411791,0.11611030478955008
PiT,g,0.22269588689506054,0.8966666666666666,0.9961491111111112,0.9061175045427013
PiT,h,0.3351811931580305,0.8503333333333334,0.9926552222222224,0.8695146759662888
PiT,i,0.018199062839150428,0.994,0.999871,0.9940199335548173
PiT,j,4.0110640263557436,0.4066666666666667,0.19481644444444443,0.028384279475982533
PiT,k,3.806567204385996,0.504,0.6367970000000001,0.033766233766233764
PiT,l,1.4887937803340863,0.7148749405108138,0.6321520235401981,0.5430508474576271
Ensemble,a,,0.9481295190191764,0.9716399631675874,0.656964656964657
Ensemble,b,,0.945300220056586,0.973926335174954,0.6448979591836734
Ensemble,c,,0.9022320025149324,0.9555580110497237,0.5039872408293461
Ensemble,d,,0.9905690034580321,0.9966666666666667,0.9132947976878613
Ensemble,e,,0.8737650933040615,0.9450768182850223,0.7331786542923434
Ensemble,f,,0.9455503059406708,0.9727634725471218,0.310107948969578
Ensemble,g,,0.9748333333333333,0.9997657777777778,0.9754511461550968
Ensemble,h,,0.952,0.9993787777777778,0.9541984732824428
Ensemble,i,,0.9988333333333334,0.999993888888889,0.9988346928583319
Ensemble,j,,0.4875,0.25661944444444446,0.04710257204834211
Ensemble,k,,0.5115,0.4922346666666667,0.04930262731106066
Ensemble,l,,0.8082068637301042,0.6421131491191425,0.6407132243684993
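
The Performance Summary averages quoted in the model card can be reproduced from this CSV. A small pandas sketch (the path assumes a local copy of `evaluation_results.csv`):

```python
import pandas as pd

# Average per-model metrics over the 12 Common Test Sample datasets (a-l).
df = pd.read_csv("evaluation_results.csv")
summary = df.groupby("Model")[["Accuracy", "AUCROC", "F1"]].mean().round(4)

# For the CvT row this gives roughly the values quoted in the model card:
# Accuracy ~0.6852, AUC-ROC ~0.7569, F1 ~0.4201.
print(summary.loc["CvT"])
```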
model.safetensors
ADDED
|
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:46c041bb0d95fabb8d6d9c3311a162287f1f02997a0caa74a3389c496f376a33
size 125239576
pytorch_model.bin
ADDED
|
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:935c2d8b222977ead0026a69503106db45585d3b04150c95876d200528c37df5
size 125471131
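
The repository ships the weights in three formats: `cvt-gravit-s2.pth`, `pytorch_model.bin` (same SHA-256 as the `.pth`, so apparently the same state dict) and `model.safetensors`. A hedged sketch of loading the safetensors weights from a local clone instead of pulling them from the Hub; whether `pretrained=False` builds the architecture from the Hub config without downloading weights depends on the timm version:

```python
import timm
from safetensors.torch import load_file

# Build the architecture, then load the locally cloned weights.
# "model.safetensors" assumes the current directory is a clone of
# parlange/cvt-gravit-s2.
model = timm.create_model('hf-hub:parlange/cvt-gravit-s2', pretrained=False)
state_dict = load_file("model.safetensors")
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
model.eval()
```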
roc_confusion_matrix/CvT_roc_confusion_matrix_a.png
ADDED
|
roc_confusion_matrix/CvT_roc_confusion_matrix_b.png
ADDED
|
roc_confusion_matrix/CvT_roc_confusion_matrix_c.png
ADDED
|
roc_confusion_matrix/CvT_roc_confusion_matrix_d.png
ADDED
|
roc_confusion_matrix/CvT_roc_confusion_matrix_e.png
ADDED
|
roc_confusion_matrix/CvT_roc_confusion_matrix_f.png
ADDED
|
roc_confusion_matrix/CvT_roc_confusion_matrix_g.png
ADDED
|
roc_confusion_matrix/CvT_roc_confusion_matrix_h.png
ADDED
|
roc_confusion_matrix/CvT_roc_confusion_matrix_i.png
ADDED
|
roc_confusion_matrix/CvT_roc_confusion_matrix_j.png
ADDED
|
roc_confusion_matrix/CvT_roc_confusion_matrix_k.png
ADDED
|
roc_confusion_matrix/CvT_roc_confusion_matrix_l.png
ADDED
|
roc_curves/CvT_ROC_a.png
ADDED
|
roc_curves/CvT_ROC_b.png
ADDED
|
roc_curves/CvT_ROC_c.png
ADDED
|
roc_curves/CvT_ROC_d.png
ADDED
|
roc_curves/CvT_ROC_e.png
ADDED
|
roc_curves/CvT_ROC_f.png
ADDED
|
roc_curves/CvT_ROC_g.png
ADDED
|
roc_curves/CvT_ROC_h.png
ADDED
|
roc_curves/CvT_ROC_i.png
ADDED
|
roc_curves/CvT_ROC_j.png
ADDED
|
roc_curves/CvT_ROC_k.png
ADDED
|
roc_curves/CvT_ROC_l.png
ADDED
|
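
The `roc_curves/` folder holds one ROC plot per test set. A sketch of producing such a curve with scikit-learn from true labels and predicted lens probabilities; the arrays below are placeholders for real model outputs:

```python
import matplotlib.pyplot as plt
import numpy as np
from sklearn.metrics import auc, roc_curve

# Placeholder labels/scores; in practice these come from evaluating the model
# on one of the test sets a-l.
y_true = np.random.randint(0, 2, size=500)
y_score = np.clip(y_true * 0.6 + np.random.rand(500) * 0.5, 0, 1)

fpr, tpr, _ = roc_curve(y_true, y_score)
roc_auc = auc(fpr, tpr)

plt.plot(fpr, tpr, label=f"CvT (AUC = {roc_auc:.3f})")
plt.plot([0, 1], [0, 1], linestyle="--", color="gray")  # chance line
plt.xlabel("False Positive Rate")
plt.ylabel("True Positive Rate")
plt.legend()
plt.savefig("CvT_ROC_a.png", dpi=150, bbox_inches="tight")
```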
training_curves/CvT_accuracy.png
ADDED
|
training_curves/CvT_auc.png
ADDED
|
training_curves/CvT_combined_metrics.png
ADDED
|
training_curves/CvT_f1.png
ADDED
|
training_curves/CvT_loss.png
ADDED
|
training_curves/CvT_metrics.csv
ADDED
|
@@ -0,0 +1,45 @@
epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
1,0.5810069377974298,0.5488423731327057,0.6605037513397642,0.707,0.7442277688172969,0.817224,0.6562483043030007,0.7404782993799823
2,0.44997175542871287,0.46011225509643555,0.7554662379421222,0.768,0.8611716101524545,0.888134,0.7574936224489796,0.7932263814616756
3,0.4186917702293089,0.37896743416786194,0.7713022508038585,0.829,0.8799081450196384,0.9152499999999999,0.7756132187081,0.8372978116079924
4,0.4023784536257434,0.35902694511413574,0.7809753483386924,0.847,0.8885943087735744,0.934978,0.7848041280539174,0.8589861751152074
5,0.392730695105059,0.3683323709964752,0.7828242229367631,0.83,0.8931283723860955,0.9374279999999999,0.7838378450460062,0.8473967684021544
6,0.38261979235713506,0.330795640707016,0.7921221864951768,0.863,0.8992497176874148,0.9402020000000001,0.795895816890292,0.8706326723323891
7,0.3742205956357852,0.34796459555625914,0.7979903536977492,0.846,0.9044855032631085,0.9398340000000001,0.80232832533627,0.8576709796672828
8,0.36936824618811775,0.35536236000061033,0.7991157556270096,0.843,0.906306558727336,0.9348200000000001,0.8024557982661853,0.8536812674743709
9,0.3621403247980443,0.2982906231880188,0.804983922829582,0.875,0.9106292713061279,0.951416,0.8083728278041075,0.8812915479582146
10,0.36013111201896547,0.3295904839038849,0.8042068595927117,0.863,0.9113218303160637,0.937846,0.8059899636247776,0.8678881388621023
11,0.3544689358622315,0.3333071744441986,0.8099142550911039,0.859,0.9148217553581952,0.9414099999999999,0.8105032588951812,0.8671065032987747
12,0.35769600682319935,0.2905820226669312,0.8037781350482315,0.879,0.9119931885985922,0.955086,0.8015070609600737,0.8855250709555346
13,0.34610839990174275,0.29659646236896514,0.8141747052518756,0.881,0.9185031961920254,0.951722,0.8152883206818484,0.8861244019138756
14,0.3476408286876617,0.27957506382465364,0.8106377277599143,0.887,0.9169516185730089,0.955234,0.8111589129679608,0.8903976721629486
15,0.3429360215111944,0.282425630569458,0.8120578778135048,0.883,0.9190525928529827,0.9546779999999999,0.8127602776294715,0.8865179437439379
16,0.3383608748100195,0.27267832922935487,0.8131028938906752,0.882,0.9209398108419521,0.9545380000000001,0.8142328281886702,0.8805668016194332
17,0.34067110904160036,0.2762571161985397,0.8102625937834941,0.878,0.9197575227831714,0.9536240000000001,0.8129045895315349,0.8784860557768924
18,0.3386077652406846,0.27553155159950254,0.8138799571275456,0.88,0.9208016528640799,0.957288,0.815147966787311,0.884393063583815
19,0.33532541957698836,0.2967168786525726,0.817524115755627,0.878,0.9228272442270954,0.954898,0.8190177527373232,0.8838095238095238
20,0.33647846801871273,0.2837833144664764,0.8193997856377278,0.887,0.9231991415170094,0.9583780000000001,0.8198342689120556,0.8914505283381364
21,0.33475628357224907,0.28021864235401156,0.817443729903537,0.882,0.9232771867306766,0.9580420000000001,0.8183151550708019,0.8874045801526718
22,0.3317486240932796,0.275274400472641,0.8180600214362272,0.889,0.9242578926902005,0.9577439999999999,0.8156193993374247,0.892128279883382
23,0.3271151032286825,0.2775604016780853,0.8197481243301179,0.88,0.9259947986131932,0.956974,0.8167079916078581,0.884393063583815
24,0.3243635906475533,0.24242133009433747,0.8238745980707396,0.91,0.9281524766539268,0.9651259999999999,0.8226628895184136,0.9105367793240556
25,0.32931717041987696,0.2727523832321167,0.8185959271168275,0.878,0.9251302141669797,0.9598240000000001,0.8172739541160594,0.882466281310212
26,0.32261106762855385,0.25530711317062377,0.8246248660235799,0.893,0.9286778063823897,0.9614119999999999,0.82436602710318,0.8949950932286556
27,0.32507631826630745,0.271099805355072,0.8219453376205788,0.887,0.9270628850565601,0.959052,0.8241365621278285,0.8906098741529526
28,0.32730844655220914,0.24925938618183135,0.8191318327974276,0.896,0.9257626991840218,0.963342,0.822284239903112,0.8974358974358975
29,0.3232409065559363,0.27019736969470975,0.8252947481243301,0.889,0.9286461445808045,0.958164,0.8293908310655222,0.8919182083739046
30,0.31966163585040347,0.27033305168151855,0.82497320471597,0.89,0.9298085013136295,0.957322,0.8245783650230959,0.8923679060665362
31,0.3227184886526065,0.25729180419445036,0.8219453376205788,0.889,0.9282415386639005,0.96024,0.8253751346805771,0.8906403940886699
32,0.32393497426026885,0.26746924781799314,0.8223740621650589,0.887,0.9276935377701498,0.959782,0.8256122905321863,0.8899707887049659
33,0.3178728149443194,0.2603160631656647,0.8279474812433012,0.888,0.9308012536631709,0.9620060000000001,0.8303073548455298,0.8916827852998066
34,0.31903316974639895,0.2615310963392258,0.8244372990353698,0.893,0.9296174793707905,0.959944,0.8256426632604184,0.8945812807881773
35,0.3216844116376527,0.2946858558654785,0.8233922829581993,0.884,0.9286307107269594,0.9549899999999999,0.8241790487368954,0.8888888888888888
36,0.32555744155426886,0.24786870670318603,0.8229635584137192,0.904,0.9272817232612933,0.9630860000000001,0.8230062417959227,0.904572564612326
37,0.3198907693482672,0.25034555172920225,0.8254555198285102,0.904,0.929600132798921,0.96349,0.825174449812131,0.9058823529411765
38,0.32117122620631644,0.2737807540893555,0.8222668810289389,0.878,0.9286592406681762,0.958818,0.8230020013342229,0.8826923076923077
39,0.3230648726704036,0.2672017843723297,0.8208735262593784,0.89,0.9275104553302799,0.960638,0.8218805787215901,0.8932038834951457
40,0.31986991155569194,0.2768093681335449,0.8238745980707396,0.881,0.9293863463524524,0.957876,0.8238981915606162,0.8848015488867377
41,0.31857546941643744,0.2584452579021454,0.8280278670953912,0.891,0.930821914464169,0.95993,0.8273431615194232,0.8921859545004945
42,0.3232113939581193,0.26418426156044006,0.8235262593783494,0.883,0.9284273738668725,0.962722,0.822441496818721,0.8882521489971347
43,0.31843074629161133,0.2844834225177765,0.8257502679528403,0.876,0.9304397198011692,0.956348,0.8254649883249685,0.8807692307692307
44,0.32233591386359606,0.24970791351795196,0.8233118971061093,0.897,0.9284944467994426,0.963794,0.8231981981981982,0.8987217305801377
training_metrics.csv
ADDED
|
@@ -0,0 +1,45 @@
epoch,train_loss,val_loss,train_accuracy,val_accuracy,train_auc,val_auc,train_f1,val_f1
1,0.5810069377974298,0.5488423731327057,0.6605037513397642,0.707,0.7442277688172969,0.817224,0.6562483043030007,0.7404782993799823
2,0.44997175542871287,0.46011225509643555,0.7554662379421222,0.768,0.8611716101524545,0.888134,0.7574936224489796,0.7932263814616756
3,0.4186917702293089,0.37896743416786194,0.7713022508038585,0.829,0.8799081450196384,0.9152499999999999,0.7756132187081,0.8372978116079924
4,0.4023784536257434,0.35902694511413574,0.7809753483386924,0.847,0.8885943087735744,0.934978,0.7848041280539174,0.8589861751152074
5,0.392730695105059,0.3683323709964752,0.7828242229367631,0.83,0.8931283723860955,0.9374279999999999,0.7838378450460062,0.8473967684021544
6,0.38261979235713506,0.330795640707016,0.7921221864951768,0.863,0.8992497176874148,0.9402020000000001,0.795895816890292,0.8706326723323891
7,0.3742205956357852,0.34796459555625914,0.7979903536977492,0.846,0.9044855032631085,0.9398340000000001,0.80232832533627,0.8576709796672828
8,0.36936824618811775,0.35536236000061033,0.7991157556270096,0.843,0.906306558727336,0.9348200000000001,0.8024557982661853,0.8536812674743709
9,0.3621403247980443,0.2982906231880188,0.804983922829582,0.875,0.9106292713061279,0.951416,0.8083728278041075,0.8812915479582146
10,0.36013111201896547,0.3295904839038849,0.8042068595927117,0.863,0.9113218303160637,0.937846,0.8059899636247776,0.8678881388621023
11,0.3544689358622315,0.3333071744441986,0.8099142550911039,0.859,0.9148217553581952,0.9414099999999999,0.8105032588951812,0.8671065032987747
12,0.35769600682319935,0.2905820226669312,0.8037781350482315,0.879,0.9119931885985922,0.955086,0.8015070609600737,0.8855250709555346
13,0.34610839990174275,0.29659646236896514,0.8141747052518756,0.881,0.9185031961920254,0.951722,0.8152883206818484,0.8861244019138756
14,0.3476408286876617,0.27957506382465364,0.8106377277599143,0.887,0.9169516185730089,0.955234,0.8111589129679608,0.8903976721629486
15,0.3429360215111944,0.282425630569458,0.8120578778135048,0.883,0.9190525928529827,0.9546779999999999,0.8127602776294715,0.8865179437439379
16,0.3383608748100195,0.27267832922935487,0.8131028938906752,0.882,0.9209398108419521,0.9545380000000001,0.8142328281886702,0.8805668016194332
17,0.34067110904160036,0.2762571161985397,0.8102625937834941,0.878,0.9197575227831714,0.9536240000000001,0.8129045895315349,0.8784860557768924
18,0.3386077652406846,0.27553155159950254,0.8138799571275456,0.88,0.9208016528640799,0.957288,0.815147966787311,0.884393063583815
19,0.33532541957698836,0.2967168786525726,0.817524115755627,0.878,0.9228272442270954,0.954898,0.8190177527373232,0.8838095238095238
20,0.33647846801871273,0.2837833144664764,0.8193997856377278,0.887,0.9231991415170094,0.9583780000000001,0.8198342689120556,0.8914505283381364
21,0.33475628357224907,0.28021864235401156,0.817443729903537,0.882,0.9232771867306766,0.9580420000000001,0.8183151550708019,0.8874045801526718
22,0.3317486240932796,0.275274400472641,0.8180600214362272,0.889,0.9242578926902005,0.9577439999999999,0.8156193993374247,0.892128279883382
23,0.3271151032286825,0.2775604016780853,0.8197481243301179,0.88,0.9259947986131932,0.956974,0.8167079916078581,0.884393063583815
24,0.3243635906475533,0.24242133009433747,0.8238745980707396,0.91,0.9281524766539268,0.9651259999999999,0.8226628895184136,0.9105367793240556
25,0.32931717041987696,0.2727523832321167,0.8185959271168275,0.878,0.9251302141669797,0.9598240000000001,0.8172739541160594,0.882466281310212
26,0.32261106762855385,0.25530711317062377,0.8246248660235799,0.893,0.9286778063823897,0.9614119999999999,0.82436602710318,0.8949950932286556
27,0.32507631826630745,0.271099805355072,0.8219453376205788,0.887,0.9270628850565601,0.959052,0.8241365621278285,0.8906098741529526
28,0.32730844655220914,0.24925938618183135,0.8191318327974276,0.896,0.9257626991840218,0.963342,0.822284239903112,0.8974358974358975
29,0.3232409065559363,0.27019736969470975,0.8252947481243301,0.889,0.9286461445808045,0.958164,0.8293908310655222,0.8919182083739046
30,0.31966163585040347,0.27033305168151855,0.82497320471597,0.89,0.9298085013136295,0.957322,0.8245783650230959,0.8923679060665362
31,0.3227184886526065,0.25729180419445036,0.8219453376205788,0.889,0.9282415386639005,0.96024,0.8253751346805771,0.8906403940886699
32,0.32393497426026885,0.26746924781799314,0.8223740621650589,0.887,0.9276935377701498,0.959782,0.8256122905321863,0.8899707887049659
33,0.3178728149443194,0.2603160631656647,0.8279474812433012,0.888,0.9308012536631709,0.9620060000000001,0.8303073548455298,0.8916827852998066
34,0.31903316974639895,0.2615310963392258,0.8244372990353698,0.893,0.9296174793707905,0.959944,0.8256426632604184,0.8945812807881773
35,0.3216844116376527,0.2946858558654785,0.8233922829581993,0.884,0.9286307107269594,0.9549899999999999,0.8241790487368954,0.8888888888888888
36,0.32555744155426886,0.24786870670318603,0.8229635584137192,0.904,0.9272817232612933,0.9630860000000001,0.8230062417959227,0.904572564612326
37,0.3198907693482672,0.25034555172920225,0.8254555198285102,0.904,0.929600132798921,0.96349,0.825174449812131,0.9058823529411765
38,0.32117122620631644,0.2737807540893555,0.8222668810289389,0.878,0.9286592406681762,0.958818,0.8230020013342229,0.8826923076923077
39,0.3230648726704036,0.2672017843723297,0.8208735262593784,0.89,0.9275104553302799,0.960638,0.8218805787215901,0.8932038834951457
40,0.31986991155569194,0.2768093681335449,0.8238745980707396,0.881,0.9293863463524524,0.957876,0.8238981915606162,0.8848015488867377
41,0.31857546941643744,0.2584452579021454,0.8280278670953912,0.891,0.930821914464169,0.95993,0.8273431615194232,0.8921859545004945
42,0.3232113939581193,0.26418426156044006,0.8235262593783494,0.883,0.9284273738668725,0.962722,0.822441496818721,0.8882521489971347
43,0.31843074629161133,0.2844834225177765,0.8257502679528403,0.876,0.9304397198011692,0.956348,0.8254649883249685,0.8807692307692307
44,0.32233591386359606,0.24970791351795196,0.8233118971061093,0.897,0.9284944467994426,0.963794,0.8231981981981982,0.8987217305801377