Typhoon 2 Text Collection
Latest official text Thai LLM release by SCB 10X.
Llama3.1-Typhoon2-8B: Thai Large Language Model (Pretrained)
Llama3.1-Typhoon2-8B is a pretrained-only Thai 🇹🇭 large language model with 8 billion parameters, based on Llama3.1-8B.
For the technical report, please see our arXiv paper.
*To acknowledge Meta's effort in creating the foundation model and to comply with the license, we explicitly include "llama-3.1" in the model name.
| Model | ThaiExam (Avg) | ONET | IC | A-Level | TGAT | TPAT | M3Exam (Avg) | Math | Science | Social | Thai |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Typhoon2 Llama3.1 8B Base | 51.20% | 49.38% | 47.36% | 43.30% | 67.69% | 48.27% | 47.52% | 27.60% | 44.20% | 68.90% | 49.38% |
| Llama3.1 8B | 45.80% | 38.27% | 46.31% | 34.64% | 61.53% | 48.27% | 43.33% | 27.14% | 40.82% | 58.33% | 47.05% |
| Typhoon1.5 Llama3 8B Base | 48.82% | 41.35% | 41.05% | 40.94% | 70.76% | 50.00% | 43.88% | 22.62% | 43.47% | 62.81% | 46.63% |
This model is a pretrained base model. Thus, it may not follow human instructions without one-/few-shot prompting or instruction fine-tuning. The model has no moderation mechanisms and may generate harmful or inappropriate responses.
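As a minimal sketch of what few-shot prompting looks like in practice, the snippet below loads the base model with Hugging Face transformers and completes a short Thai-to-English translation prompt. The repository id ("scb10x/llama3.1-typhoon2-8b"), dtype, and generation settings are illustrative assumptions, not taken from this card.

```python
# Minimal sketch: load the pretrained base model and prompt it with a
# few-shot template instead of an instruction.
# NOTE: the repo id and generation settings below are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "scb10x/llama3.1-typhoon2-8b"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 8B weights in bf16 fit on a single ~24 GB GPU
    device_map="auto",
)

# A base model works best when the task is demonstrated in-context,
# so we show two solved examples before the real query.
few_shot_prompt = (
    "Translate Thai to English.\n"
    "Thai: สวัสดีครับ\nEnglish: Hello.\n"
    "Thai: ขอบคุณมาก\nEnglish: Thank you very much.\n"
    "Thai: วันนี้อากาศดี\nEnglish:"
)

inputs = tokenizer(few_shot_prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)

# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

For instruction following without a few-shot template, the instruct variant of the model is the more appropriate choice.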
https://twitter.com/opentyphoon
@misc{typhoon2,
title={Typhoon 2: A Family of Open Text and Multimodal Thai Large Language Models},
author={Kunat Pipatanakul and Potsawee Manakul and Natapong Nitarach and Warit Sirichotedumrong and Surapon Nonesung and Teetouch Jaknamon and Parinthapat Pengpun and Pittawat Taveekitworachai and Adisai Na-Thalang and Sittipong Sripaisarnmongkol and Krisanapong Jirayoot and Kasima Tharnpipitchai},
year={2024},
eprint={2412.13702},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2412.13702},
}