---
base_model:
- appvoid/palmer-002-32k
- raidhon/coven_tiny_1.1b_32k_orpo_alpha
- appvoid/palmer-003
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
![palmer-004](https://huggingface.co/appvoid/palmer-004/resolve/main/palmer-004.jpeg)

#### june update

This model has improved overall performance at the expense of a small degradation on Winogrande. As with all palmer models, it is biased toward responding without requiring any specific prompt format; feel free to further fine-tune it for your specific use case (a basic loading example follows the benchmark table).

| Model | MMLU | ARC-C | HellaSwag | PIQA | Winogrande | Average |
|--------------------------------|-------|-------|-----------|--------|------------|---------|
| tinyllama-3t | 0.2577| 0.3029| 0.5935 | 0.7329 | 0.5959 | 0.4966 |
| palmer-004 | 0.2601| 0.3456| 0.6138 | 0.7443 | **0.6511** | 0.5229 |
| palmer-004-2406 | 0.2661| 0.3490| **0.6173** | **0.7481** | 0.6417 | **0.5244** |
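Below is a minimal loading and generation sketch using the transformers library. The plain-text prompt and the generation settings are illustrative assumptions, not a required format.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load palmer-004 from the Hugging Face Hub (CPU by default; pass device_map="auto"
# with accelerate installed to place the model on available GPUs).
tokenizer = AutoTokenizer.from_pretrained("appvoid/palmer-004")
model = AutoModelForCausalLM.from_pretrained("appvoid/palmer-004")

# No special prompt template is needed; plain text completion works.
prompt = "The capital of France is"  # example prompt, purely illustrative
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```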