Add BibTeX entry
README.md
CHANGED
@@ -106,4 +106,13 @@ The model was trained on 32 V100 GPUs for 31,250 steps with the batch size of 8,
 
 **BibTeX entry and citation info**
 
-
+```
+@misc{lowphansirikul2021wangchanberta,
+    title={WangchanBERTa: Pretraining transformer-based Thai Language Models},
+    author={Lalita Lowphansirikul and Charin Polpanumas and Nawat Jantrakulchai and Sarana Nutanong},
+    year={2021},
+    eprint={2101.09635},
+    archivePrefix={arXiv},
+    primaryClass={cs.CL}
+}
+```
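For readers who want to use the entry added above, here is a minimal, hypothetical LaTeX sketch. The file name `references.bib` is an assumption (any BibTeX database file works); the citation key `lowphansirikul2021wangchanberta` comes from the entry itself.

```latex
% Minimal sketch, assuming the @misc entry above has been saved to references.bib.
\documentclass{article}
\begin{document}

% Cite the model by the key defined in the BibTeX entry.
WangchanBERTa~\cite{lowphansirikul2021wangchanberta} is a transformer-based
Thai language model.

% Pick any bibliography style you prefer; plain is used here for illustration.
\bibliographystyle{plain}
\bibliography{references}

\end{document}
```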