---
base_model:
- Qwen/Qwen2-7B
library_name: transformers
tags:
- mergekit
- merge
---
# Qwen2-11.3B
NOTE: This is a model intended for continued pretraining or finetuning! It is not meant to be run on its own. Finetuned versions will be coming soon.
Perplexity measurements: PPL = 7.7614 +/- 0.10721
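The card does not state how this perplexity was measured (dataset, context length, or quantization). For reference only, below is a minimal sketch of a comparable measurement with transformers, assuming the wikitext-2 test set and a 2048-token window; both are assumptions, not the author's setup.

```python
# Hypothetical perplexity check (dataset and window size are assumptions,
# not the setup behind the figure quoted above).
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "./qwen2-11.3b-merge"  # placeholder path to the merged checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model.eval()

text = "\n\n".join(load_dataset("wikitext", "wikitext-2-raw-v1", split="test")["text"])
input_ids = tokenizer(text, return_tensors="pt").input_ids.to(model.device)

ctx = 2048
nlls, n_tokens = [], 0
for begin in range(0, input_ids.size(1) - 1, ctx):
    chunk = input_ids[:, begin : begin + ctx]
    if chunk.size(1) < 2:
        break
    with torch.no_grad():
        out = model(chunk, labels=chunk)      # mean NLL over the chunk's tokens
    nlls.append(out.loss * (chunk.size(1) - 1))
    n_tokens += chunk.size(1) - 1

ppl = torch.exp(torch.stack(nlls).sum() / n_tokens)
print(f"PPL: {ppl.item():.4f}")
```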
This is a merge of pre-trained language models created using mergekit.
## Merge Details

### Merge Method
This model was merged using the passthrough merge method.
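Passthrough copies layers from the source model verbatim, with no weight interpolation, so overlapping slices simply stack: layers 0-21, a duplicate of layers 6-21, and layers 22-27 of Qwen2-7B, giving 44 layers instead of the original 28, which is roughly where the 11.3B parameter count in the model name comes from. A minimal sketch of the layer bookkeeping (indices only; mergekit performs the actual weight copying):

```python
# Layer bookkeeping for the passthrough slices in the configuration below.
# The tuples mirror the layer_range entries; mergekit copies the weights.
slices = [(0, 22), (6, 22), (22, 28)]  # [start, end) ranges from Qwen/Qwen2-7B

stacked = [layer for start, end in slices for layer in range(start, end)]
print(len(stacked))   # 44 layers in the merged model, vs. 28 in Qwen2-7B
print(stacked[:8])    # [0, 1, 2, 3, 4, 5, 6, 7] -- the front of the stack is unchanged
```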
### Models Merged

The following models were included in the merge:
* [Qwen/Qwen2-7B](https://huggingface.co/Qwen/Qwen2-7B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 22]
    model: Qwen/Qwen2-7B
- sources:
  - layer_range: [6, 22]
    model: Qwen/Qwen2-7B
    parameters:
      scale:
      - filter: o_proj
        value: 0.0
      - filter: down_proj
        value: 0.0
      - value: 1.0
- sources:
  - layer_range: [22, 28]
    model: Qwen/Qwen2-7B
```
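Once the merge has been produced (for example with mergekit's `mergekit-yaml` entry point), the resulting checkpoint loads like any other transformers causal LM for continued pretraining or finetuning. A minimal sketch, assuming the merge was written to a local directory; the path is a placeholder, not part of this card:

```python
# Load the merged checkpoint for further training.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

path = "./qwen2-11.3b-merge"  # placeholder output directory from the merge step
tokenizer = AutoTokenizer.from_pretrained(path)
model = AutoModelForCausalLM.from_pretrained(path, torch_dtype=torch.bfloat16)

# Because the duplicated slice has o_proj and down_proj scaled to 0.0, those
# layers initially add nothing to the residual stream, so the merged model
# should behave close to Qwen2-7B until the new layers are trained.
print(model.config.num_hidden_layers)  # expected: 44
```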