Update README.md
README.md
CHANGED
@@ -16,6 +16,13 @@ This model was trained as part of the "Extractive QA Biomedicine" project develo
 
 Taking into account the existence of masked language models trained on Spanish biomedical corpora, the objective of this project is to use them to generate extractive QA models for Biomedicine and compare their effectiveness with general masked language models.
 
+The models trained during the [Hackathon](https://somosnlp.org/hackathon) were:
+
+[hackathon-pln-es/roberta-base-bne-squad2-es](https://huggingface.co/hackathon-pln-es/roberta-base-bne-squad2-es)
+[hackathon-pln-es/roberta-base-biomedical-clinical-es-squad2-es](https://huggingface.co/hackathon-pln-es/roberta-base-biomedical-clinical-es-squad2-es)
+[hackathon-pln-es/roberta-base-biomedical-es-squad2-es](https://huggingface.co/hackathon-pln-es/roberta-base-biomedical-es-squad2-es)
+[hackathon-pln-es/biomedtra-small-es-squad2-es](https://huggingface.co/hackathon-pln-es/biomedtra-small-es-squad2-es)
+
 ## Description
 
 This model is a fine-tuned version of [PlanTL-GOB-ES/roberta-base-biomedical-clinical-es](https://huggingface.co/PlanTL-GOB-ES/roberta-base-biomedical-clinical-es) on the [squad_es (v2)](https://huggingface.co/datasets/squad_es) training dataset.
@@ -37,19 +44,6 @@ The hyperparameters were chosen based on those used in [PlanTL-GOB-ES/roberta-ba
 
 Evaluated on the [hackathon-pln-es/biomed_squad_es_v2](https://huggingface.co/datasets/hackathon-pln-es/biomed_squad_es_v2) dev set.
 
-```
-eval_exact = 66.8426
-eval_f1 = 75.2346
-
-eval_HasAns_exact = 53.0249
-eval_HasAns_f1 = 70.0031
-eval_HasAns_total = 562
-eval_NoAns_exact = 80.3478
-eval_NoAns_f1 = 80.3478
-eval_NoAns_total = 575
-
-```
-
 |Model                                                         |Base Model Domain|exact  |f1     |HasAns_exact|HasAns_f1|NoAns_exact|NoAns_f1|
 |--------------------------------------------------------------|-----------------|-------|-------|------------|---------|-----------|--------|
 |hackathon-pln-es/roberta-base-bne-squad2-es                   |General          |67.6341|75.6988|53.7367     |70.0526  |81.2174    |81.2174 |
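
The "Description" hunk above describes an extractive QA checkpoint fine-tuned on squad_es (v2). As a minimal usage sketch, not taken from the card itself: assuming this README belongs to the hackathon-pln-es/roberta-base-biomedical-clinical-es-squad2-es checkpoint listed above, it should load with the standard Transformers `question-answering` pipeline; the question and context below are invented examples.

```python
from transformers import pipeline

# Sketch only: the repo id comes from the model list in the diff above,
# and the question/context pair is an invented example.
qa = pipeline(
    "question-answering",
    model="hackathon-pln-es/roberta-base-biomedical-clinical-es-squad2-es",
)

result = qa(
    question="¿Para qué se utiliza la metformina?",
    context=(
        "La metformina es un fármaco de primera línea para el tratamiento "
        "de la diabetes mellitus tipo 2."
    ),
    handle_impossible_answer=True,  # SQuAD v2 style: allow an empty (no-answer) prediction
)
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': '...'}
```

Because squad_es (v2) contains unanswerable questions, `handle_impossible_answer=True` lets the pipeline return an empty answer when the no-answer score wins.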
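The removed `eval_*` block and the comparison table report the standard SQuAD v2 metrics (overall exact/f1 plus the HasAns/NoAns breakdown). As a hedged sketch of how such numbers are obtained, the `squad_v2` metric from the `evaluate` library computes the same fields; the ids, texts, and probabilities below are toy values, not drawn from the biomed_squad_es_v2 dev set.

```python
import evaluate

# Sketch of the SQuAD v2 metric computation; the two examples are toy data.
squad_v2 = evaluate.load("squad_v2")

predictions = [
    # Answerable question: the model proposes a span.
    {"id": "q1", "prediction_text": "la metformina", "no_answer_probability": 0.1},
    # Unanswerable question: empty prediction with a high no-answer probability.
    {"id": "q2", "prediction_text": "", "no_answer_probability": 0.9},
]
references = [
    {"id": "q1", "answers": {"text": ["la metformina"], "answer_start": [0]}},
    {"id": "q2", "answers": {"text": [], "answer_start": []}},  # gold: no answer
]

results = squad_v2.compute(predictions=predictions, references=references)
# results contains exact, f1, HasAns_exact, HasAns_f1, NoAns_exact, NoAns_f1, total, ...
print(results["exact"], results["f1"])
```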