joheras committed
Commit 0ccc2a0 · 1 Parent(s): 18e9dfa

update model card README.md

Files changed (1):
1. README.md (+36 -42)
README.md CHANGED
@@ -7,11 +7,6 @@ metrics:
 model-index:
 - name: marimari-r2r-mlsum-clara-med
   results: []
-license: cc-by-nc-sa-4.0
-datasets:
-- lcampillos/CLARA-MeD
-language:
-- es
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -19,14 +14,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 # marimari-r2r-mlsum-clara-med
 
-This model is a fine-tuned version of [IIC/marimari-r2r-mlsum](https://huggingface.co/IIC/marimari-r2r-mlsum) on the [CLARA-MeD](https://huggingface.co/lcampillos/CLARA-MeD) dataset.
+This model is a fine-tuned version of [IIC/marimari-r2r-mlsum](https://huggingface.co/IIC/marimari-r2r-mlsum) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 3.9276
-- Rouge1: 43.1543
-- Rouge2: 24.9453
-- Rougel: 37.4907
-- Rougelsum: 37.6959
-- SARI: 47.7046
+- Loss: 4.0687
+- Rouge1: 41.588
+- Rouge2: 23.5096
+- Rougel: 35.9281
+- Rougelsum: 36.128
 
 ## Model description
 
@@ -57,36 +51,36 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
 |:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
-| No log | 1.0 | 190 | 2.3405 | 42.5983 | 25.1109 | 37.6327 | 37.7257 |
-| No log | 2.0 | 380 | 2.2954 | 41.9792 | 23.9766 | 36.3881 | 36.5226 |
-| 1.976 | 3.0 | 570 | 2.4302 | 42.0317 | 23.9135 | 36.186 | 36.3812 |
-| 1.976 | 4.0 | 760 | 2.7029 | 41.7418 | 23.7318 | 36.2048 | 36.3403 |
-| 0.6481 | 5.0 | 950 | 2.9547 | 41.2054 | 22.9037 | 35.1364 | 35.3168 |
-| 0.6481 | 6.0 | 1140 | 3.1709 | 41.0444 | 23.2019 | 35.7206 | 35.8829 |
-| 0.6481 | 7.0 | 1330 | 3.2556 | 41.3295 | 22.6827 | 35.2777 | 35.4701 |
-| 0.1485 | 8.0 | 1520 | 3.3117 | 41.068 | 22.965 | 35.5507 | 35.6491 |
-| 0.1485 | 9.0 | 1710 | 3.4171 | 41.5945 | 23.6423 | 35.8899 | 36.0442 |
-| 0.0725 | 10.0 | 1900 | 3.4981 | 41.1163 | 23.0651 | 35.6205 | 35.6596 |
-| 0.0725 | 11.0 | 2090 | 3.5086 | 40.9784 | 22.9125 | 35.182 | 35.5205 |
-| 0.0725 | 12.0 | 2280 | 3.5503 | 41.6038 | 23.3975 | 36.0071 | 36.2095 |
-| 0.0425 | 13.0 | 2470 | 3.6113 | 42.0039 | 24.0294 | 36.4882 | 36.6313 |
-| 0.0425 | 14.0 | 2660 | 3.6253 | 41.3012 | 23.1452 | 35.5444 | 35.761 |
-| 0.0291 | 15.0 | 2850 | 3.6247 | 42.1477 | 24.3389 | 36.4346 | 36.6004 |
-| 0.0291 | 16.0 | 3040 | 3.6683 | 42.6205 | 24.3544 | 36.776 | 36.9848 |
-| 0.0291 | 17.0 | 3230 | 3.7544 | 41.9877 | 24.069 | 36.6296 | 36.9115 |
-| 0.0166 | 18.0 | 3420 | 3.7562 | 41.8586 | 23.6088 | 36.271 | 36.4634 |
-| 0.0166 | 19.0 | 3610 | 3.7687 | 43.2161 | 25.0204 | 37.5484 | 37.759 |
-| 0.0088 | 20.0 | 3800 | 3.7907 | 42.8482 | 24.8476 | 37.1841 | 37.4456 |
-| 0.0088 | 21.0 | 3990 | 3.8260 | 42.3613 | 24.3827 | 36.6921 | 36.8898 |
-| 0.0088 | 22.0 | 4180 | 3.8367 | 42.6367 | 24.6803 | 37.0963 | 37.3301 |
-| 0.0039 | 23.0 | 4370 | 3.8613 | 42.8326 | 25.0972 | 37.4584 | 37.6063 |
-| 0.0039 | 24.0 | 4560 | 3.8716 | 43.043 | 24.7042 | 37.4917 | 37.6845 |
-| 0.0028 | 25.0 | 4750 | 3.8881 | 42.9107 | 25.0261 | 37.3744 | 37.6019 |
-| 0.0028 | 26.0 | 4940 | 3.9005 | 42.8922 | 24.8232 | 37.4217 | 37.5928 |
-| 0.0028 | 27.0 | 5130 | 3.9054 | 43.1217 | 25.1892 | 37.6801 | 37.8118 |
-| 0.0017 | 28.0 | 5320 | 3.9159 | 43.3466 | 25.1834 | 37.7026 | 37.9333 |
-| 0.0017 | 29.0 | 5510 | 3.9240 | 43.1974 | 25.0535 | 37.6958 | 37.9008 |
-| 0.0012 | 30.0 | 5700 | 3.9276 | 43.1543 | 24.9453 | 37.4907 | 37.6959 |
+| No log | 1.0 | 190 | 2.3804 | 41.7652 | 24.3329 | 36.7515 | 36.8694 |
+| No log | 2.0 | 380 | 2.3417 | 41.586 | 23.679 | 35.8653 | 36.065 |
+| 1.9588 | 3.0 | 570 | 2.4777 | 40.954 | 23.0312 | 35.4667 | 35.7321 |
+| 1.9588 | 4.0 | 760 | 2.8230 | 41.387 | 23.4641 | 35.9558 | 36.1302 |
+| 0.6288 | 5.0 | 950 | 3.0836 | 40.7233 | 22.4476 | 34.852 | 35.1656 |
+| 0.6288 | 6.0 | 1140 | 3.2078 | 40.8334 | 22.4327 | 35.0436 | 35.2623 |
+| 0.6288 | 7.0 | 1330 | 3.3649 | 40.6737 | 22.4294 | 34.5433 | 34.9343 |
+| 0.1473 | 8.0 | 1520 | 3.4503 | 40.8818 | 22.6808 | 34.6777 | 34.9179 |
+| 0.1473 | 9.0 | 1710 | 3.5140 | 40.4208 | 22.2582 | 34.5103 | 34.8161 |
+| 0.0706 | 10.0 | 1900 | 3.5805 | 40.6348 | 22.4714 | 34.6782 | 34.9531 |
+| 0.0706 | 11.0 | 2090 | 3.6325 | 40.932 | 22.4958 | 34.7695 | 35.0314 |
+| 0.0706 | 12.0 | 2280 | 3.6405 | 40.619 | 22.406 | 34.9997 | 35.3007 |
+| 0.0401 | 13.0 | 2470 | 3.7279 | 40.7365 | 22.2549 | 34.6789 | 34.9794 |
+| 0.0401 | 14.0 | 2660 | 3.7440 | 41.1684 | 23.1526 | 35.4117 | 35.7039 |
+| 0.0277 | 15.0 | 2850 | 3.8185 | 41.3103 | 23.52 | 35.5945 | 35.9176 |
+| 0.0277 | 16.0 | 3040 | 3.8215 | 40.5096 | 22.3435 | 34.7064 | 35.0037 |
+| 0.0277 | 17.0 | 3230 | 3.8925 | 41.3644 | 23.2827 | 35.3861 | 35.6649 |
+| 0.0172 | 18.0 | 3420 | 3.8594 | 41.6572 | 23.574 | 35.5946 | 35.8383 |
+| 0.0172 | 19.0 | 3610 | 3.9191 | 41.4862 | 23.2408 | 35.5638 | 35.7895 |
+| 0.0087 | 20.0 | 3800 | 3.8776 | 41.8812 | 23.6052 | 35.9983 | 36.2555 |
+| 0.0087 | 21.0 | 3990 | 3.9526 | 42.0435 | 23.6758 | 36.0881 | 36.3736 |
+| 0.0087 | 22.0 | 4180 | 3.9847 | 41.7187 | 23.5729 | 36.0514 | 36.3019 |
+| 0.0036 | 23.0 | 4370 | 3.9939 | 41.6098 | 23.3451 | 35.779 | 35.9889 |
+| 0.0036 | 24.0 | 4560 | 4.0194 | 41.1443 | 23.1271 | 35.6529 | 35.808 |
+| 0.002 | 25.0 | 4750 | 4.0231 | 41.5422 | 23.5603 | 35.8412 | 36.0677 |
+| 0.002 | 26.0 | 4940 | 4.0439 | 41.5561 | 23.5496 | 35.8154 | 36.0846 |
+| 0.002 | 27.0 | 5130 | 4.0554 | 41.6566 | 23.4052 | 35.8392 | 36.0672 |
+| 0.0014 | 28.0 | 5320 | 4.0610 | 41.6654 | 23.5138 | 35.9715 | 36.1973 |
+| 0.0014 | 29.0 | 5510 | 4.0658 | 41.5467 | 23.464 | 35.7852 | 36.0226 |
+| 0.0011 | 30.0 | 5700 | 4.0687 | 41.588 | 23.5096 | 35.9281 | 36.128 |
 
 
 ### Framework versions
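For readers of the card, the Rouge1/Rouge2/Rougel/Rougelsum columns are n-gram overlap scores, conventionally reported scaled by 100. As background only, here is a minimal sketch of ROUGE-N computed as an F1 over n-gram counts. This is an illustrative toy implementation, not the metric code the training pipeline used (which typically applies its own tokenization and stemming), so it will not reproduce the table's numbers exactly:

```python
from collections import Counter

def rouge_n(candidate: str, reference: str, n: int = 1) -> float:
    """ROUGE-N as F1 over n-gram counts (toy version, whitespace tokenization)."""
    def ngrams(text: str) -> Counter:
        tokens = text.lower().split()
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

    cand, ref = ngrams(candidate), ngrams(reference)
    if not cand or not ref:
        return 0.0
    # Clipped overlap: each n-gram counts at most as often as it appears in both.
    overlap = sum((cand & ref).values())
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# An identical candidate scores 1.0; a half-overlapping one scores 0.5.
print(rouge_n("the cat sat on the mat", "the cat sat on the mat"))  # 1.0
print(rouge_n("the cat", "the dog"))  # 0.5
```

Multiplying such a score by 100 gives values on the same scale as the table above (e.g. Rouge1 41.588 at epoch 30).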