---
license: apache-2.0
pipeline_tag: image-text-to-text
library_name: transformers
paper: https://arxiv.org/abs/2409.03277
---

# ChartMoE

**ICLR 2025 Oral**

[arXiv](https://arxiv.org/abs/2409.03277) | Project Page | Github Repo | Hugging Face Model

ChartMoE is a multimodal large language model with a Mixture-of-Experts connector, built on InternLM-XComposer2, for advanced chart 1) understanding, 2) replotting, 3) editing, 4) highlighting, and 5) transformation.

This is a reproduction of the diversely-aligned MoE connector; feel free to use it for continued SFT training!
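For quick experimentation, the sketch below shows one way to load the checkpoint with 🤗 Transformers. The repo id `IDEA-FinAI/chartmoe` and the InternLM-XComposer2-style `.chat()` call are assumptions for illustration; please refer to the Github repo for the official inference code.

```python
# Minimal loading sketch (assumptions: the Hub repo id and the
# InternLM-XComposer2-style chat interface; not the official API).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "IDEA-FinAI/chartmoe"  # hypothetical repo id, replace with the actual one

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).cuda().eval()

# InternLM-XComposer2-style chat: "<ImageHere>" marks where the chart image is inserted.
query = "<ImageHere> Please describe the trend shown in this chart."
with torch.no_grad():
    response, _ = model.chat(tokenizer, query=query, image="chart.png", history=[])
print(response)
```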

## Open Source License

The data is licensed under Apache-2.0.