---
license: mit
language:
- tr
library_name: transformers
---
Pretrained on roughly **1.6 billion** (mostly Turkish) tokens from Hugging Face datasets and "high quality" scraped data, using a single RTX 3090. Training is ongoing. The model can already be fine-tuned for instruction following, to a limited extent:
![image/png](/static-proxy?url=https%3A%2F%2Fcdn-uploads.huggingface.co%2Fproduction%2Fuploads%2F6324eabf05bd8a54c6eb1650%2FSlUmBi6Mmb5NuQesvucv3.png%3C%2Fspan%3E)%3C%2Fspan%3E%3C!-- HTML_TAG_END --> | |
Generation settings (TR_4k_LoRA): `max_length=256, top_k=20, min_p=0.1, repetition_penalty=1.1, temperature=0.1, seed=22366`
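To illustrate what these settings do, here is a minimal, dependency-free sketch of how the filters above (repetition penalty, temperature, top-k, min-p) shape the next-token distribution. This is a toy re-implementation for illustration, not the `transformers` sampling code; the function name and the toy logits are made up for the example.

```python
import math
import random

def sample_next_token(logits, top_k=20, min_p=0.1, temperature=0.1,
                      repetition_penalty=1.1, previous_tokens=(), seed=None):
    """Toy sketch of the sampling filters listed above.

    `logits` is a list of raw scores, one per vocabulary id.
    """
    rng = random.Random(seed)
    logits = list(logits)

    # Repetition penalty: dampen tokens that already appeared in the output.
    for tok in set(previous_tokens):
        if logits[tok] > 0:
            logits[tok] /= repetition_penalty
        else:
            logits[tok] *= repetition_penalty

    # Temperature scaling, then a numerically stable softmax.
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Top-k: keep only the k most probable tokens.
    top = sorted(range(len(probs)), key=probs.__getitem__, reverse=True)[:top_k]

    # Min-p: drop tokens whose probability is below min_p * max probability.
    p_max = probs[top[0]]
    kept = [i for i in top if probs[i] >= min_p * p_max]

    # Renormalize over the surviving tokens and sample one.
    z = sum(probs[i] for i in kept)
    r = rng.random() * z
    acc = 0.0
    for i in kept:
        acc += probs[i]
        if r <= acc:
            return i
    return kept[-1]
```

With a low temperature such as 0.1, the softmax sharpens strongly, so a clearly dominant logit is selected almost deterministically; min-p then prunes the long tail relative to the best token rather than with a fixed cutoff.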