Coobiw committed on
Commit 1fb0e25 · verified · 1 Parent(s): c4c5eb2

Update README.md

Files changed (1)
  1. README.md +40 -3
README.md CHANGED
@@ -1,3 +1,40 @@
- ---
- license: apache-2.0
- ---
+ ---
+ license: apache-2.0
+ pipeline_tag: image-text-to-text
+ library_name: transformers
+ paper: https://arxiv.org/abs/2409.03277
+ ---
+
+ <p align="center">
+ <b><font size="6">ChartMoE</font></b>
+ </p>
+ <p align="center">
+ <b><font size="4">ICLR 2025 Oral</font></b>
+ </p>
+
+ <div align="center">
+ <div style="display: inline-block; margin-right: 30px;">
+
+ [![arXiv](https://img.shields.io/badge/ArXiv-Preprint-red)](https://arxiv.org/abs/2409.03277)
+ </div>
+ <div style="display: inline-block; margin-right: 30px;">
+
+ [![Project Page](https://img.shields.io/badge/Project-Page-brightgreen)](https://chartmoe.github.io/)
+ </div>
+ <div style="display: inline-block; margin-right: 30px;">
+
+ [![Github Repo](https://img.shields.io/badge/Github-Repo-blue)](https://github.com/IDEA-FinAI/ChartMoE)
+ </div>
+ <div style="display: inline-block; margin-right: 30px;">
+
+ [![Hugging Face Model](https://img.shields.io/badge/Hugging%20Face-Model-8A2BE2)](https://huggingface.co/IDEA-FinAI/chartmoe)
+ </div>
+ </div>
+
+
+ **ChartMoE** is a multimodal large language model with a Mixture-of-Experts connector, built on [InternLM-XComposer2](https://github.com/InternLM/InternLM-XComposer/tree/main/InternLM-XComposer-2.0) for advanced chart 1) understanding, 2) replotting, 3) editing, 4) highlighting, and 5) transformation.
+
+ **This is a reproduction of the diversely-aligned MoE connector; please feel free to use it for continued SFT training!** A minimal loading sketch is given below.
+
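+ The snippet below is a sketch only: it assumes ChartMoE exposes the standard `trust_remote_code` loading path of InternLM-XComposer2-based checkpoints via `transformers`, so the exact inference API may differ (see the GitHub repo for reference usage).
+
+ ```python
+ # Minimal loading sketch. Assumption: ChartMoE follows the usual
+ # trust_remote_code path of InternLM-XComposer2-based checkpoints;
+ # the chat/inference API may differ from this sketch.
+ import torch
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ model_path = "IDEA-FinAI/chartmoe"
+ tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
+ model = AutoModelForCausalLM.from_pretrained(
+     model_path,
+     torch_dtype=torch.bfloat16,  # assumption: bf16 weights on a single GPU
+     trust_remote_code=True,
+ ).cuda().eval()
+ ```
+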
+ ## Open Source License
+ The data is licensed under Apache-2.0.