---
license: apache-2.0
datasets:
- yentinglin/v1
language:
- zh
tags:
- traditional mandarin
- traditional chinese
- taiwan
- moe
- mixtral
- zh-tw
- zh-hant
pretty_name: twllm-moe
---

# Taiwan LLM Mixture of Experts - Pilot Run

![image/png](https://cdn-uploads.huggingface.co/production/uploads/5df9c78eda6d0311fd3d541f/AMGN-A-fUsaQg-lF35Pzj.png)

## Model Details

### Model Description

- **Developed by:** [Yen-Ting Lin 林彥廷](https://yentingl.com/)
- **Compute funded by:** [HelperAI](https://helperai.ai/)
- **Model type:** [Mixtral](https://huggingface.co/docs/transformers/model_doc/mixtral)
- **Language(s) (NLP):** Traditional Mandarin (zh-tw)
- **License:** [Apache-2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Finetuned from model:** [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1)
- **TMMLU+ score:** 38.09

### Model Sources

- **Repository:** [Taiwan-LLM](https://github.com/MiuLab/Taiwan-LLM)
- **Paper:** [Taiwan-LLM: Bridging the Linguistic Divide with a Culturally Aligned Language Model](https://arxiv.org/pdf/2311.17487.pdf)
- **Demo:** [Taiwan LLM ChatUI](https://twllm.com/)

## Citation

**BibTeX:**

```bibtex
@misc{lin2023taiwan,
      title={Taiwan LLM: Bridging the Linguistic Divide with a Culturally Aligned Language Model},
      author={Yen-Ting Lin and Yun-Nung Chen},
      year={2023},
      eprint={2311.17487},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```