---
base_model:
- Khetterman/CursedMatrix-8B-v9
- aloobun/CosmicBun-8B-DPO
- Arkana08/LexiMaid-L3-8B
- Arkana08/Mythorica-L3-8B
- bluuwhale/L3-SthenoMaidBlackroot-8B-V1
- Casual-Autopsy/L3-Luna-8B
- IlyaGusev/saiga_llama3_8b
- invisietch/L3.1-EtherealRainbow-v1.0-rc1-8B
- jeiku/Average_Normie_v3.69_8B
- SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA
- v000000/L3-8B-BlueSerpentine
- ZeroXClem/L3SAO-Mix-SuperHermes-NovaPurosani-8B
- ZeroXClem/Llama-3-Aetheric-Hermes-Lexi-Smaug-8B
- ZeroXClem/Llama3.1-TheiaFire-DarkFusion-8B
library_name: transformers
tags:
- mergekit
- merge
- bfloat16
- safetensors
- 8b
- chat
- creative
- roleplay
- conversational
- not-for-all-audiences
language:
- en
- ru
---

# Kosmos-8B-v1

*The serenity of infinity is not the end.*

This is an interesting merge of 14 cool models, created using [mergekit](https://github.com/arcee-ai/mergekit). Enjoy exploring :)
## Merge Details

### Method

This model was merged using a multistep process: several intermediate merges were created first, then remerged with some model variations to get the best result.
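Schematically, the pipeline chains intermediate merges, with each step's output serving as a base or input for the next. The sketch below is only an illustration of that flow; `merge()` is a placeholder, not a real mergekit function, and model names are abbreviated:

```python
# Schematic of the multistep merge pipeline.
# merge() is a stand-in for one mergekit run, not a real API.
def merge(base, models, method):
    """Pretend-merge: returns a label describing the resulting model."""
    return f"{method}({base} + {len(models)} models)"

CURSED = "Khetterman/CursedMatrix-8B-v9"

# Step 1: three model_stock merges, all on the same base.
step1 = [
    merge(CURSED, ["Unaligned_BETA", "CosmicBun", "saiga_llama3"], "model_stock"),
    merge(CURSED, ["BlueSerpentine", "EtherealRainbow", "SthenoMaidBlackroot"], "model_stock"),
    merge(CURSED, ["Average_Normie", "Luna", "TheiaFire-DarkFusion"], "model_stock"),
]

# Step 2: remerge the three intermediates into InfectedKosmos.
infected = merge(CURSED, step1, "model_stock")

# Step 3: two della merges on top of InfectedKosmos, then a final model_stock.
zero_a = merge(infected, ["L3SAO-Mix-SuperHermes-NovaPurosani", "LexiMaid"], "della")
zero_b = merge(infected, ["Aetheric-Hermes-Lexi-Smaug", "Mythorica"], "della")
kosmos = merge(infected, [zero_a, zero_b], "model_stock")
print(kosmos)
```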
### Models
The following models were included in the merge:
- Khetterman/CursedMatrix-8B-v9
- aloobun/CosmicBun-8B-DPO
- Arkana08/LexiMaid-L3-8B
- Arkana08/Mythorica-L3-8B
- bluuwhale/L3-SthenoMaidBlackroot-8B-V1
- Casual-Autopsy/L3-Luna-8B
- IlyaGusev/saiga_llama3_8b
- invisietch/L3.1-EtherealRainbow-v1.0-rc1-8B
- jeiku/Average_Normie_v3.69_8B
- SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA
- v000000/L3-8B-BlueSerpentine
- ZeroXClem/L3SAO-Mix-SuperHermes-NovaPurosani-8B
- ZeroXClem/Llama-3-Aetheric-Hermes-Lexi-Smaug-8B
- ZeroXClem/Llama3.1-TheiaFire-DarkFusion-8B
### Configuration

The following YAML configurations were used to produce this model:
```yaml
# Cursed-UnalignedCosmicSaiga-8B-v1
models:
  - model: SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA
  - model: aloobun/CosmicBun-8B-DPO
  - model: IlyaGusev/saiga_llama3_8b
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16
```
```yaml
# Cursed-BlueRainbowMaid-8B-v1
models:
  - model: v000000/L3-8B-BlueSerpentine
  - model: invisietch/L3.1-EtherealRainbow-v1.0-rc1-8B
  - model: bluuwhale/L3-SthenoMaidBlackroot-8B-V1
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16
```
```yaml
# Cursed-AverageLunaFusion-8B-v1
models:
  - model: jeiku/Average_Normie_v3.69_8B
  - model: Casual-Autopsy/L3-Luna-8B
  - model: ZeroXClem/Llama3.1-TheiaFire-DarkFusion-8B
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16
```
```yaml
# InfectedKosmos-8B-v1
models:
  - model: F:/Cursed-UnalignedCosmicSaiga-8B-v1
  - model: F:/Cursed-BlueRainbowMaid-8B-v1
  - model: F:/Cursed-AverageLunaFusion-8B-v1
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16
```
```yaml
# ZeroArkana-A
models:
  - model: ZeroXClem/L3SAO-Mix-SuperHermes-NovaPurosani-8B
    parameters:
      weight: [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
      density: [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
  - model: Arkana08/LexiMaid-L3-8B
    parameters:
      weight: [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
      density: [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
merge_method: della
parameters:
  epsilon: 0.1
  lambda: 1.0
base_model: F:/InfectedKosmos-8B-v1
dtype: bfloat16
```
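The `weight` and `density` lists above are gradients: mergekit interpolates a list of values across the model's layers, so each layer gets its own coefficient, and the two models' oscillating curves mirror each other (where one model's weight peaks, the other's dips). A hedged sketch of that interpolation, assuming simple linear resampling over 32 transformer layers (the exact sampling is a mergekit internal detail):

```python
# 25 control points taken from the ZeroArkana-A weight gradient above.
weight = [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20,
          0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35,
          0.25, 0.20, 0.25, 0.35, 0.50]

def per_layer_values(points, num_layers=32):
    """Linearly resample a gradient list to one value per layer
    (illustrative; mergekit's own sampling may differ in detail)."""
    out = []
    for i in range(num_layers):
        # position of layer i mapped onto the control-point axis
        t = i * (len(points) - 1) / (num_layers - 1)
        lo = int(t)
        hi = min(lo + 1, len(points) - 1)
        frac = t - lo
        out.append(points[lo] * (1 - frac) + points[hi] * frac)
    return out

vals = per_layer_values(weight)
print(len(vals), vals[0], vals[-1])  # endpoints match the first/last control points
```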
```yaml
# ZeroArkana-B
models:
  - model: ZeroXClem/Llama-3-Aetheric-Hermes-Lexi-Smaug-8B
    parameters:
      weight: [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
      density: [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
  - model: Arkana08/Mythorica-L3-8B
    parameters:
      weight: [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
      density: [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
merge_method: della
parameters:
  epsilon: 0.1
  lambda: 1.0
base_model: F:/InfectedKosmos-8B-v1
dtype: bfloat16
```
```yaml
# Kosmos-8B-v1
models:
  - model: F:/ZeroArkana-A
  - model: F:/ZeroArkana-B
merge_method: model_stock
base_model: F:/InfectedKosmos-8B-v1
dtype: bfloat16
```
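The `della` steps work on deltas (differences from the base model): low-magnitude delta entries are dropped, with `density` setting the keep rate, `epsilon` the spread of magnitude-dependent drop probabilities, and `lambda` a scale on the merged delta. The sketch below is a heavily simplified stand-in (deterministic top-k pruning instead of DELLA's magnitude-based stochastic dropping) to show the keep-and-rescale idea only:

```python
def prune_delta(delta, density, lam=1.0):
    """Keep the largest-magnitude fraction `density` of a delta vector,
    rescale survivors by 1/density, and scale by `lam`.
    Simplified illustration, not the actual della algorithm."""
    k = max(1, round(density * len(delta)))
    # indices of the k largest-magnitude entries
    keep = sorted(range(len(delta)), key=lambda i: abs(delta[i]), reverse=True)[:k]
    out = [0.0] * len(delta)
    for i in keep:
        out[i] = delta[i] / density * lam
    return out

delta = [0.9, -0.1, 0.05, -0.8, 0.3, 0.02]
print(prune_delta(delta, density=0.5))  # small entries zeroed, survivors rescaled
```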
My thanks to the authors of the original models; your work is incredible. Have a good time 🖤