bigstral-12b-32k-8xMoE

Made using the mergekit MoE branch with the following config:

```yaml
base_model: abacusai/bigstral-12b-32k
gate_mode: random
dtype: bfloat16
experts_per_token: 2
experts:
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
  - source_model: abacusai/bigstral-12b-32k
    positive_prompts: []
```
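
For reference, a minimal sketch of how a merge like this can be run with mergekit's `mergekit-moe` script, assuming the config above is saved as `config.yaml` and a mergekit build with MoE support is installed; the output directory name is only an example:

```sh
# Install mergekit (use a version/branch that includes the MoE merge script).
pip install mergekit

# Save the YAML config above as config.yaml, then build the clown-car MoE.
# The output directory name is illustrative.
mergekit-moe config.yaml ./bigstral-12b-32k-8xMoE
```

With `gate_mode: random` and empty `positive_prompts`, the router weights are initialized randomly rather than derived from prompt hidden states, which is why no prompts are listed per expert.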