# bigstral-12b-32k-8xMoE
Made using the mergekit MoE branch with the following config:
```yaml
base_model: abacusai/bigstral-12b-32k
gate_mode: random
dtype: bfloat16
experts_per_token: 2
experts:
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
```