---
base_model: unsloth/phi-4-unsloth-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- sebaxakerhtc
- llama
- trl
- bggpt
license: apache-2.0
language:
- bg
datasets:
- burgasdotpro/synthetic_dataset
- burgasdotpro/wikipedia
---

Updated 2025-01-20:
- Changed training parameters
- Updated the dataset
- Continued pretraining
- Wikipedia 5%

**bgGPT-Phi-4-CPT-GGUF.Q8_0.gguf shows very good results (continued pretraining, Wiki only 1%). I will keep it up for comparison.**

This Phi-4 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.

# Uploaded model

- **Developed by:** burgasdotpro
- **License:** apache-2.0
- **Finetuned from model:** unsloth/phi-4-unsloth-bnb-4bit
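For reference, a quantized GGUF build like the Q8_0 file mentioned above can be downloaded and run locally with llama.cpp. This is a minimal sketch, not an official quickstart: the repository id `burgasdotpro/bgGPT-Phi-4-CPT-GGUF` is an assumption inferred from the filename, and the prompt is only an example.

```shell
# Assumed repo id -- only the filename bgGPT-Phi-4-CPT-GGUF.Q8_0.gguf
# is confirmed by this card; adjust the repo path as needed.
pip install -U "huggingface_hub[cli]"

# Download the Q8_0 quantization from the Hub
huggingface-cli download burgasdotpro/bgGPT-Phi-4-CPT-GGUF \
  bgGPT-Phi-4-CPT-GGUF.Q8_0.gguf --local-dir .

# Run with llama.cpp's CLI: -m model file, -p prompt, -n max new tokens
llama-cli -m bgGPT-Phi-4-CPT-GGUF.Q8_0.gguf \
  -p "Какво е континентален шелф?" -n 256
```

Any llama.cpp-compatible runtime (Ollama, LM Studio, llama-cpp-python) should load the same GGUF file.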