Update README.md
According to https://github.com/facebookresearch/fairseq, "fairseq(-py) is MIT-licensed. The license applies to the pre-trained models as well." Since mBART is explicitly described as a fine-tuned fairseq model (see the fairseq repository), it inherits the MIT license.
README.md CHANGED
````diff
@@ -56,6 +56,7 @@ language:
 tags:
 - mbart-50
 pipeline_tag: translation
+license: mit
 ---
 
 # mBART-50 many to many multilingual machine translation
@@ -115,4 +116,4 @@ Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX),
 archivePrefix={arXiv},
 primaryClass={cs.CL}
 }
-```
+```
````
|