🌟 Check out the new Taiwan-LLM demo Chat-UI 🌟

Taiwan LLM is based on Mistral-7B-v0.1, continually pretrained on 20 billion tokens of Traditional Chinese text and instruction fine-tuned on millions of conversations.

This version does NOT include CommonCrawl data.

Collaboration with Ubitus K.K. 💪💪💪

Taiwan LLM v2 is conducted in collaboration with Ubitus K.K., which provides valuable technical support and compute resources for the project.

Citation

```bibtex
@misc{lin2023taiwan,
      title={Taiwan LLM: Bridging the Linguistic Divide with a Culturally Aligned Language Model},
      author={Yen-Ting Lin and Yun-Nung Chen},
      year={2023},
      eprint={2311.17487},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
Model size: 7.24B parameters · Format: Safetensors · Tensor type: BF16