---
license: apache-2.0
tags:
- merge
- mergekit
- passthrough
- "frankenmerge"
- "7B"
- "ZeroXClem/Qwen2.5-7B-HomerCreative-Mix"
- "ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix"
---

# FMixIA-FrankenMerge-9.5B-PT-9

A frankenmerge built with [mergekit](https://github.com/cg123/mergekit) using passthrough layer concatenation of two Qwen2.5-7B-based models.

## Model Details

- **Base Models**:
  * [ZeroXClem/Qwen2.5-7B-HomerCreative-Mix](https://huggingface.co/ZeroXClem/Qwen2.5-7B-HomerCreative-Mix)
  * [ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix](https://huggingface.co/ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix)
- **Merge Method**: passthrough
- **Note**: This is a frankenmerge with a modified architecture: the layer stacks of both parents are concatenated back to back, so the merged network is deeper than either base model.

## Configuration

The merge is defined by the following mergekit configuration, which stacks layers 0–27 of each parent model in sequence:

```yaml
slices:
  - sources:
      - model: ZeroXClem/Qwen2.5-7B-HomerCreative-Mix
        layer_range: [0, 28]
  - sources:
      - model: ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix
        layer_range: [0, 28]
merge_method: passthrough
dtype: bfloat16
```

## Usage

The model can be loaded with the standard 🤗 Transformers library:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("Ro-xe/FMixIA-FrankenMerge-9.5B-PT-9")
tokenizer = AutoTokenizer.from_pretrained("Ro-xe/FMixIA-FrankenMerge-9.5B-PT-9")
```
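
Below is a minimal generation sketch for chat-style prompting. It assumes the merged model inherits the Qwen2.5 chat template from its parents and that hardware with bfloat16 support is available; the prompt, sampling settings, and `model_id` variable are illustrative, not part of the model's published configuration.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Ro-xe/FMixIA-FrankenMerge-9.5B-PT-9"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype in the config above
    device_map="auto",           # requires `accelerate`; places layers on available devices
)

# Assumption: the tokenizer carries over the Qwen2.5 chat template from the parent models.
messages = [{"role": "user", "content": "Explain what a passthrough frankenmerge is in two sentences."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

To reproduce the merge itself, save the YAML above to a file and run it through mergekit's CLI (per the mergekit README, `mergekit-yaml config.yaml ./output-directory`).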