model update
README.md CHANGED
@@ -51,21 +51,6 @@ model-index:
     - name: MoverScore (Question Generation)
       type: moverscore_question_generation
       value: 58.34
-    - name: BLEU4 (Question & Answer Generation (with Gold Answer))
-      type: bleu4_question_answer_generation_with_gold_answer
-      value: 1.73
-    - name: ROUGE-L (Question & Answer Generation (with Gold Answer))
-      type: rouge_l_question_answer_generation_with_gold_answer
-      value: 14.86
-    - name: METEOR (Question & Answer Generation (with Gold Answer))
-      type: meteor_question_answer_generation_with_gold_answer
-      value: 21.82
-    - name: BERTScore (Question & Answer Generation (with Gold Answer))
-      type: bertscore_question_answer_generation_with_gold_answer
-      value: 68.93
-    - name: MoverScore (Question & Answer Generation (with Gold Answer))
-      type: moverscore_question_answer_generation_with_gold_answer
-      value: 51.59
     - name: QAAlignedF1Score-BERTScore (Question & Answer Generation (with Gold Answer))
       type: qa_aligned_f1_score_bertscore_question_answer_generation_with_gold_answer
       value: 79.06
@@ -167,20 +152,12 @@ question = pipe("extract answers: <hl> En la diáspora somalí, múltiples event
 
 | | Score | Type | Dataset |
 |:--------------------------------|--------:|:--------|:-----------------------------------------------------------------|
-| BERTScore | 68.93 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
-| Bleu_1 | 10.52 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
-| Bleu_2 | 5.19 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
-| Bleu_3 | 2.82 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
-| Bleu_4 | 1.73 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
-| METEOR | 21.82 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
-| MoverScore | 51.59 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
 | QAAlignedF1Score (BERTScore) | 79.06 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
 | QAAlignedF1Score (MoverScore) | 54.49 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
 | QAAlignedPrecision (BERTScore) | 76.46 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
 | QAAlignedPrecision (MoverScore) | 52.96 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
 | QAAlignedRecall (BERTScore) | 81.94 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
 | QAAlignedRecall (MoverScore) | 56.21 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
-| ROUGE_L | 14.86 | default | [lmqg/qg_esquad](https://huggingface.co/datasets/lmqg/qg_esquad) |
 
 
 - ***Metric (Answer Extraction)***: [raw metric file](https://huggingface.co/lmqg/mt5-small-esquad-qg-ae/raw/main/eval/metric.first.answer.paragraph_sentence.answer.lmqg_qg_esquad.default.json)
|