Model info
This model was distilled from the Tatoeba-MT teacher Tatoeba-MT-models/kor-eng/opusTCv20210807-sepvoc_transformer-big_2022-07-28, which was trained on the Tatoeba dataset.
We used OpusDistillery to train a new student model with the tiny architecture and a regular transformer decoder. For training data, we used Tatoeba. The configuration file fed into OpusDistillery can be found here.
How to run
>>> from transformers import pipeline
>>> pipe = pipeline("translation", model="odegiber/ko-en", max_length=256)
>>> pipe("2017๋
๋ง, ์๋ฏธ๋
ธํ๋ ์ผํ ํ
๋ ๋น์ ผ ์ฑ๋์ธ QVC์ ์ถ์ฐํ๋ค.")
[{'translation_text': 'At the end of 2017, Siminof appeared on the shopping television channel QVC.'}]
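The pipeline also accepts a list of sentences and returns one dictionary per input. A minimal sketch; the Korean sentences below are illustrative inputs, not examples taken from this card:

>>> # returns a list with one {'translation_text': ...} dict per input sentence
>>> pipe(["나는 학생입니다.", "오늘 날씨가 좋다."])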
Benchmarks
| Test set | BLEU | chrF |
|---|---|---|
| flores200 | 20.3 | 50.3 |
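The scores above could in principle be reproduced with sacrebleu on the FLORES-200 devtest split. The sketch below assumes the facebook/flores dataset on the Hugging Face Hub with the kor_Hang-eng_Latn configuration and the devtest split; those names, the batch size, and sacrebleu's default BLEU/chrF settings are assumptions, not details taken from this card.

from datasets import load_dataset
from transformers import pipeline
import sacrebleu

pipe = pipeline("translation", model="odegiber/ko-en", max_length=256)

# Assumed dataset location and config; the card only names "flores200" as the test set.
flores = load_dataset("facebook/flores", "kor_Hang-eng_Latn", split="devtest")
sources = flores["sentence_kor_Hang"]
references = flores["sentence_eng_Latn"]

# Translate the Korean side and collect the English hypotheses.
hypotheses = [out["translation_text"] for out in pipe(sources, batch_size=16)]

# Corpus-level BLEU and chrF with sacrebleu defaults.
print(sacrebleu.corpus_bleu(hypotheses, [references]).score)
print(sacrebleu.corpus_chrf(hypotheses, [references]).score)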