Improve model card: Add pipeline tag, license, project page, and enhance links

#3
by nielsr HF Staff - opened
Files changed (1)
  1. README.md +37 -28
README.md CHANGED
@@ -1,28 +1,21 @@
  ---
  library_name: transformers
- tags:
- - translation
  ---

- <p align="center">
- <img src="https://dscache.tencent-cloud.cn/upload/uploader/hunyuan-64b418fd052c033b228e04bc77bbc4b54fd7f5bc.png" width="400"/> <br>
- </p><p></p>
-
  <p align="center">
- 🤗&nbsp;<a href="https://huggingface.co/collections/tencent/hunyuan-mt-68b42f76d473f82798882597"><b>Hugging Face</b></a>&nbsp;&nbsp;|&nbsp;&nbsp;
- 🤖&nbsp;<a href="https://modelscope.cn/collections/Hunyuan-MT-2ca6b8e1b4934f"><b>ModelScope</b></a>&nbsp;&nbsp;|&nbsp;&nbsp;
- 🪡&nbsp;<a href="https://github.com/Tencent/AngelSlim/tree/main"><b>AngelSlim</b></a>
- </p>
-
- <p align="center">
- 🖥️&nbsp;<a href="https://hunyuan.tencent.com"><b>Official Website</b></a>&nbsp;&nbsp;|&nbsp;&nbsp;
- 🕹️&nbsp;<a href="https://hunyuan.tencent.com/modelSquare/home/list"><b>Demo</b></a>&nbsp;&nbsp;&nbsp;&nbsp;
  </p>
-
  <p align="center">
- <a href="https://github.com/Tencent-Hunyuan/Hunyuan-MT"><b>GITHUB</b></a>
  </p>
@@ -41,6 +34,16 @@ Hunyuan-MT-Chimera-7B-fp8 was produced by [AngelSlim](https://github.com/Tencent
  * 2025.9.1 We have open-sourced **Hunyuan-MT-7B**, **Hunyuan-MT-Chimera-7B** on Hugging Face.
  <br>

  &nbsp;
@@ -101,8 +104,6 @@ First, please install transformers, recommends v4.56.0
  pip install transformers==4.56.0
  ```

- The following code snippet shows how to use the transformers library to load and apply the model.
-
  *!!! If you want to load the fp8 model with transformers, you need to rename "ignored_layers" in config.json to "ignore" and upgrade compressed-tensors to v0.11.0.*

  We use tencent/Hunyuan-MT-7B as an example:
@@ -116,7 +117,9 @@ model_name_or_path = "tencent/Hunyuan-MT-7B"
  tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
  model = AutoModelForCausalLM.from_pretrained(model_name_or_path, device_map="auto")  # You may want to use bfloat16 and/or move to GPU here
  messages = [
- {"role": "user", "content": "Translate the following segment into Chinese, without additional explanation.\n\nIt’s on the house."},
  ]
  tokenized_chat = tokenizer.apply_chat_template(
  messages,
@@ -173,7 +176,7 @@ Supported languages:
  | Telugu | te | 泰卢固语 |
  | Marathi | mr | 马拉地语 |
  | Hebrew | he | 希伯来语 |
- | Bengali | bn | 孟加拉语 |
  | Tamil | ta | 泰米尔语 |
  | Ukrainian | uk | 乌克兰语 |
  | Tibetan | bo | 藏语 |
182
  | Uyghur | ug | 维吾尔语 |
183
  | Cantonese | yue | 粤语 |
184
 
185
- Citing Hunyuan-MT:
186
-
187
  ```bibtex
188
- @misc{hunyuanmt2025,
189
- title={Hunyuan-MT Technical Report},
190
- author={Mao Zheng, Zheng Li, Bingxin Qu, Mingyang Song, Yang Du, Mingrui Sun, Di Wang, Tao Chen, Jiaqi Zhu, Xingwu Sun, Yufei Wang, Can Xu, Chen Li, Kai Wang, Decheng Wu},
191
- howpublished={\url{https://github.com/Tencent-Hunyuan/Hunyuan-MT}},
192
- year={2025}
 
 
 
193
  }
194
- ```
 
 
 
 
 
  ---
  library_name: transformers
+ pipeline_tag: translation
+ license: apache-2.0
+ project_page: https://hunyuan.tencent.com
  ---

+ # Hunyuan-MT-Chimera-7B-fp8: Multilingual Translation Model

+ This model is presented in the paper [Hunyuan-MT Technical Report](https://huggingface.co/papers/2509.05209).

  <p align="center">
+ <img src="https://dscache.tencent-cloud.cn/upload/uploader/hunyuan-64b418fd052c033b228e04bc77bbc4b54fd7f5bc.png" width="400"/> <br>
  </p>
  <p align="center">
+ 📚 <a href="https://huggingface.co/papers/2509.05209"><b>Paper</b></a> &nbsp;&nbsp;|&nbsp;&nbsp; 💻 <a href="https://github.com/Tencent-Hunyuan/Hunyuan-MT"><b>GitHub</b></a> &nbsp;&nbsp;|&nbsp;&nbsp; 🏠 <a href="https://hunyuan.tencent.com"><b>Official Website</b></a> &nbsp;&nbsp;|&nbsp;&nbsp; 🕹️ <a href="https://hunyuan.tencent.com/modelSquare/home/list"><b>Demo</b></a>
+ <br>
+ 🤗 <a href="https://huggingface.co/collections/tencent/hunyuan-mt-68b42f76d473f82798882597"><b>Hugging Face Collection</b></a> &nbsp;&nbsp;|&nbsp;&nbsp; 🤖 <a href="https://modelscope.cn/collections/Hunyuan-MT-2ca6b8e1b4934f"><b>ModelScope</b></a> &nbsp;&nbsp;|&nbsp;&nbsp; 🪡 <a href="https://github.com/Tencent/AngelSlim/tree/main"><b>AngelSlim</b></a>
  </p>
  * 2025.9.1 We have open-sourced **Hunyuan-MT-7B**, **Hunyuan-MT-Chimera-7B** on Hugging Face.
  <br>

+ &nbsp;
+
+ ## Performance
+
+ <div align='center'>
+ <img src="https://github.com/Tencent-Hunyuan/Hunyuan-MT/raw/main/imgs/overall_performance.png" width="80%" />
+ </div>
+ You can refer to our technical report for more experimental results and analysis.
+
+ [**Technical Report**](https://www.arxiv.org/pdf/2509.05209)

  &nbsp;
 
  pip install transformers==4.56.0
  ```

  *!!! If you want to load the fp8 model with transformers, you need to rename "ignored_layers" in config.json to "ignore" and upgrade compressed-tensors to v0.11.0.*
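The rename described in the note can be scripted. A minimal sketch, assuming a locally downloaded checkpoint whose `config.json` contains an `ignored_layers` entry; the helper name and the recursive search are illustrative, not part of the repo:

```python
import json
from pathlib import Path


def patch_fp8_config(config_path: str) -> bool:
    """Rename every "ignored_layers" key in config.json to "ignore".

    Returns True if anything was renamed, False otherwise.
    """
    path = Path(config_path)
    config = json.loads(path.read_text())

    # The key usually sits under the quantization section of fp8
    # checkpoints, so walk the whole JSON tree to be safe.
    def rename(node) -> bool:
        changed = False
        if isinstance(node, dict):
            if "ignored_layers" in node:
                node["ignore"] = node.pop("ignored_layers")
                changed = True
            for value in node.values():
                changed = rename(value) or changed
        elif isinstance(node, list):
            for item in node:
                changed = rename(item) or changed
        return changed

    if rename(config):
        path.write_text(json.dumps(config, indent=2))
        return True
    return False
```

After patching, upgrade the quantization dependency as the note says, e.g. `pip install -U compressed-tensors==0.11.0`.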
  We use tencent/Hunyuan-MT-7B as an example:
  tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
  model = AutoModelForCausalLM.from_pretrained(model_name_or_path, device_map="auto")  # You may want to use bfloat16 and/or move to GPU here
  messages = [
+ {"role": "user", "content": "Translate the following segment into Chinese, without additional explanation.\n\nIt’s on the house."},
  ]
  tokenized_chat = tokenizer.apply_chat_template(
  messages,
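The user message in the snippet follows a fixed instruction template. A small helper makes it reusable; the helper name and the ability to vary the target language are assumptions, since the card itself only shows the Chinese example:

```python
def build_translation_message(text: str, target_language: str = "Chinese") -> dict:
    """Build the single-turn chat message used in the snippet above:
    an instruction naming the target language, a blank line, then the
    source segment, with no extra explanation requested."""
    prompt = (
        f"Translate the following segment into {target_language}, "
        f"without additional explanation.\n\n{text}"
    )
    return {"role": "user", "content": prompt}


# Reproduces the message list from the example above.
messages = [build_translation_message("It's on the house.")]
```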
 
  | Telugu | te | 泰卢固语 |
  | Marathi | mr | 马拉地语 |
  | Hebrew | he | 希伯来语 |
+ | Bengali | bn | 孟加拉语 |
  | Tamil | ta | 泰米尔语 |
  | Ukrainian | uk | 乌克兰语 |
  | Tibetan | bo | 藏语 |
 
  | Uyghur | ug | 维吾尔语 |
  | Cantonese | yue | 粤语 |

+ ## Citation

  ```bibtex
+ @misc{hunyuan_mt,
+ title={Hunyuan-MT Technical Report},
+ author={Mao Zheng and Zheng Li and Bingxin Qu and Mingyang Song and Yang Du and Mingrui Sun and Di Wang},
+ year={2025},
+ eprint={2509.05209},
+ archivePrefix={arXiv},
+ primaryClass={cs.CL},
+ url={https://arxiv.org/abs/2509.05209},
  }
+ ```
+
+ ## Contact Us
+
+ If you would like to leave a message for our R&D and product teams, you are welcome to contact our open-source team. You can also reach us by email ([email protected]).