mamba-1.4b-aquila-400b / config.json
{
"d_model": 2048,
"fused_add_norm": true,
"n_layer": 48,
"pad_vocab_size_multiple": 1,
"residual_in_fp32": true,
"rms_norm": false,
"ssm_cfg": {},
"vocab_size": 100008
}
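
For reference, the keys in this config match the fields of the MambaConfig dataclass in the state-spaces/mamba (mamba_ssm) package, so the file can be loaded straight into MambaLMHeadModel. Below is a minimal sketch, assuming mamba_ssm and PyTorch are installed and a CUDA device is available; the local file path and dtype are illustrative, not part of this repo.

import json
import torch

from mamba_ssm.models.config_mamba import MambaConfig
from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel

# Build the config from the JSON above; every key is a MambaConfig field.
with open("config.json") as f:
    cfg = MambaConfig(**json.load(f))

# Instantiate the Mamba backbone plus LM head. The fused_add_norm path
# relies on CUDA/Triton kernels at forward time, so place the model on GPU.
model = MambaLMHeadModel(cfg, device="cuda", dtype=torch.bfloat16)
print(f"{sum(p.numel() for p in model.parameters()) / 1e9:.2f}B parameters")

Alternatively, MambaLMHeadModel.from_pretrained can fetch this config.json together with the checkpoint weights directly from the Hub repo, which avoids downloading the file by hand.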