Commit 1469b15
Parent(s): a41091a

Add AveniBench split of ECTSum.

Files changed:
- README.md (+53, -0)
- data/test.jsonl (added)
README.md CHANGED

@@ -1,3 +1,56 @@
 ---
 license: gpl-3.0
+language:
+- en
+configs:
+- config_name: aveni-bench-ectsum
+  data_files:
+  - split: test
+    path:
+    - data/test.jsonl
+  default: true
+tags:
+- aveni-bench
+- summarisation
+- finnlp
 ---
+# AveniBench: ECTSum
+
+The ECTSum split used in AveniBench.
+
+## License
+
+This dataset is made available under the GPL-3.0 license.
+
+## Citation
+
+AveniBench
+```bibtex
+TBD
+```
+
+ECTSum
+```bibtex
+@inproceedings{mukherjee-etal-2022-ectsum,
+    title = "{ECTS}um: A New Benchmark Dataset For Bullet Point Summarization of Long Earnings Call Transcripts",
+    author = "Mukherjee, Rajdeep and
+      Bohra, Abhinav and
+      Banerjee, Akash and
+      Sharma, Soumya and
+      Hegde, Manjunath and
+      Shaikh, Afreen and
+      Shrivastava, Shivani and
+      Dasgupta, Koustuv and
+      Ganguly, Niloy and
+      Ghosh, Saptarshi and
+      Goyal, Pawan",
+    booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
+    month = dec,
+    year = "2022",
+    address = "Abu Dhabi, United Arab Emirates",
+    publisher = "Association for Computational Linguistics",
+    url = "https://aclanthology.org/2022.emnlp-main.748/",
+    doi = "10.18653/v1/2022.emnlp-main.748",
+    pages = "10893--10906",
+}
+```
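If the `configs` entry above is consumed with the Hugging Face `datasets` library, loading the split might look like the sketch below; the repository id is a placeholder (it is not given in this commit), while the config name `aveni-bench-ectsum` and the `test` split come from the YAML front matter.

```python
from datasets import load_dataset

# Placeholder repo id -- substitute the actual Hub path of this dataset.
ds = load_dataset(
    "ORG/avenibench-ectsum",        # hypothetical repository id
    name="aveni-bench-ectsum",      # config_name declared in the README front matter
    split="test",                   # the only split listed under data_files
)
print(len(ds), "examples")
```

Because the config sets `default: true`, the `name` argument could also be omitted and the same configuration would be selected.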
data/test.jsonl ADDED

The diff for this file is too large to render. See raw diff.
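As a minimal alternative sketch, the added file can also be inspected directly as newline-delimited JSON without the `datasets` library; the record fields are not visible in this diff, so none are assumed here.

```python
import json

# data/test.jsonl is newline-delimited JSON: one record per line.
with open("data/test.jsonl", encoding="utf-8") as f:
    records = [json.loads(line) for line in f if line.strip()]

print(len(records), "test examples")
print(sorted(records[0].keys()))  # inspect whichever fields the export provides
```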