---
license: apache-2.0
tags:
- merge
- mergekit
- google/gemma-7b-it
- HuggingFaceH4/zephyr-7b-gemma-v0.1
- mlabonne/Gemmalpaca-7B
---
# gemma-7b-zephyr-alpaca-it-ties

gemma-7b-zephyr-alpaca-it-ties is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [google/gemma-7b-it](https://huggingface.co/google/gemma-7b-it)
* [HuggingFaceH4/zephyr-7b-gemma-v0.1](https://huggingface.co/HuggingFaceH4/zephyr-7b-gemma-v0.1)
* [mlabonne/Gemmalpaca-7B](https://huggingface.co/mlabonne/Gemmalpaca-7B)
## 🧩 Configuration
```yaml
models:
  - model: google/gemma-7b-it
    parameters:
      density: 0.5
      weight: 0.3
  - model: HuggingFaceH4/zephyr-7b-gemma-v0.1
    parameters:
      density: 0.5
      weight: 0.3 # weight gradient
  - model: mlabonne/Gemmalpaca-7B
    parameters:
      density: 0.5
      weight: 0.3 # weight gradient
merge_method: dare_ties
base_model: google/gemma-7b
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
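In the `dare_ties` method, each fine-tuned model's delta from the base model is sparsified by randomly dropping a fraction (1 − `density`) of its weights and rescaling the survivors by 1/`density`, before TIES-style sign resolution combines the deltas. Below is a minimal NumPy sketch of that drop-and-rescale step only; it is illustrative, not mergekit's implementation, and the function name is hypothetical.

```python
import numpy as np

def dare_drop_and_rescale(delta, density, rng):
    """Keep roughly a `density` fraction of delta weights at random and
    rescale the kept entries by 1/density, so the expected value of each
    weight is unchanged. Illustrative sketch, not mergekit's code."""
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

rng = np.random.default_rng(0)
delta = np.ones(1000)  # fine-tuned weights minus base weights (toy example)
sparse = dare_drop_and_rescale(delta, density=0.5, rng=rng)

# About half the entries survive, each rescaled to 2.0, so the mean of
# the sparsified delta stays close to the original mean of 1.0.
print(sparse.mean())
```

With `density: 0.5` as in the config above, each model contributes only about half of its delta parameters, which reduces interference when the three deltas are summed onto `google/gemma-7b`.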