BlackSheep-MoE-4x3B / generation_config.json
TroyDoesAI
Uses 6.7 GB of VRAM during inference. It is not hard to get it to produce toxic behavior when explicitly instructed, but it will no longer do so by accident.
94d7790 verified
{
"_from_model_config": true,
"bos_token_id": 1,
"eos_token_id": 32000,
"transformers_version": "4.44.2"
}
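As a quick sanity check, the file above is plain JSON and can be parsed directly; the `bos_token_id`/`eos_token_id` fields tell the generation loop which tokens mark sequence start and end. The sketch below (assuming only the Python standard library; the inlined string mirrors the file contents shown above) parses it the same way `transformers` would before building a `GenerationConfig`:

```python
import json

# Contents of generation_config.json as shown in the repo
raw = """{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 32000,
  "transformers_version": "4.44.2"
}"""

cfg = json.loads(raw)

# BOS token 1 and EOS token 32000 match common Llama/Phi-family
# tokenizer conventions; generation stops when 32000 is emitted.
print(cfg["bos_token_id"], cfg["eos_token_id"])
```

When the model is loaded with `AutoModelForCausalLM.from_pretrained`, this file is picked up automatically, so these IDs rarely need to be set by hand at inference time.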