---
base_model: NyxKrage/Microsoft_Phi-4
library_name: peft
license: cc0-1.0
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** Shinoji Research
- **Funded by [optional]:** Shinoji Research
- **Shared by [optional]:** [More Information Needed]
- **Model type:** Causal language model (PEFT/LoRA adapter for Phi-4)
- **Language(s) (NLP):** [More Information Needed]
- **License:** CC0-1.0. Purely machine-generated works are not eligible for copyright protection. Out of respect for the license of Phi-4, we will only distribute the adapter file (for now).
- **Finetuned from model:** NyxKrage/Microsoft_Phi-4
### Model Sources [optional]
## Uses
This is an early, significantly undertrained preview of Phi-4 fine-tuned on the PowerInfer/QWQ-LONGCOT-500K dataset, heavily inspired by https://huggingface.co/PowerInfer/SmallThinker-3B-Preview. It already produces slightly different responses than the Phi-4 base model, but it needs more training. Assuming nothing goes wrong with the training process, a completed version should be ready in about two weeks.
## Training Details
### Training Data
Trained on https://huggingface.co/datasets/PowerInfer/QWQ-LONGCOT-500K