Update README.md (#3)
Update README.md (ba0a81b5dd01f9bb196de8531f24307dbbc4c7f3)
Co-authored-by: Pengzhi Gao <[email protected]>
README.md CHANGED
@@ -40,26 +40,18 @@ language:
- zh
---

-# Model Card for GemmaX2-28

-## Model

-GemmaX2-28-2B-Pretrain is a language model that results from continual pretraining of Gemma2-2B on a mix of 56 billion tokens of monolingual and parallel data in 28 different languages — Arabic, Bengali, Czech, German, English, Spanish, Persian, French, Hebrew, Hindi, Indonesian, Italian, Japanese, Khmer, Korean, Lao, Malay, Burmese, Dutch, polish, Portuguese, Russian, Thai, Tagalog, Turkish, Urdu, Vietnamese, Chinese.

-GemmaX2-28-2B-v0.1 is the model version of GemmaX2-28-2B-Pretrain after SFT.

- **Developed by:** Xiaomi
-- **Model type:**
-- **
-- **License:** gemma

-### Model Source

-- paper: [Multilingual Machine Translation with Open Large Language Models at Practical Scale: An Empirical Study](https://arxiv.org/pdf/2502.02481)

[image]
@@ -99,4 +91,4 @@ print(tokenizer.decode(outputs[0], skip_special_tokens=True))

## Limitations

-GemmaX2-28-2B-v0.1 supports
- zh
---

+## Model Description

+GemmaX2-28-2B-v0.1 is an LLM-based translation model. It was finetuned from GemmaX2-28-2B-Pretrain, a language model developed through continual pretraining of Gemma2-2B on a mix of 56 billion tokens of monolingual and parallel data across 28 languages. Please find more details in our paper: [Multilingual Machine Translation with Open Large Language Models at Practical Scale: An Empirical Study](https://arxiv.org/pdf/2502.02481).

- **Developed by:** Xiaomi
+- **Model type:** GemmaX2-28-2B-Pretrain is obtained by continually pretraining Gemma2-2B on a large amount of monolingual and parallel data. Subsequently, GemmaX2-28-2B-v0.1 is derived through supervised finetuning on a small set of high-quality translation instruction data.
+- **Languages:** Arabic, Bengali, Czech, German, English, Spanish, Persian, French, Hebrew, Hindi, Indonesian, Italian, Japanese, Khmer, Korean, Lao, Malay, Burmese, Dutch, Polish, Portuguese, Russian, Thai, Tagalog, Turkish, Urdu, Vietnamese, Chinese.

+## Model Performance

[image]

## Limitations

+GemmaX2-28-2B-v0.1 supports only the 28 languages listed above and does not guarantee strong translation performance for other languages. We will continue to enhance the translation performance of GemmaX2-28-2B, and future models will be released in due course.
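
The second hunk's context shows only the closing line of the README's existing usage snippet (`print(tokenizer.decode(outputs[0], skip_special_tokens=True))`). For orientation, here is a minimal sketch of how such a transformers-based translation call typically looks; the repository id, prompt wording, and generation settings below are assumptions for illustration, not content taken from this commit.

```python
# Minimal sketch, not part of this commit: load GemmaX2-28-2B-v0.1 with
# transformers and run a single translation-style generation.
# The repo id and prompt format are assumptions; follow the README's own
# usage snippet for the exact values.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ModelSpace/GemmaX2-28-2B-v0.1"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Assumed instruction-style translation prompt (Chinese -> English).
text = "Translate this from Chinese to English:\nChinese: 我爱机器翻译\nEnglish:"
inputs = tokenizer(text, return_tensors="pt")

outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The final `print` call is the line visible in the second hunk's header above.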