---
language: en
tags:
- b1ade
license: mit
widget:
- text: |-
    context:
    question:
    answer: <
  example_title: Math
- text: |-
    context:
    question:
    answer: <
  example_title: Sentiment
inference:
  parameters:
    max_new_tokens: 512
    top_p: 0.99
datasets:
- Open-Orca/OpenOrca
- WizardLM/WizardLM_evol_instruct_V2_196k
---

# B1ade

Stable revision:

```
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("w601sxs/b1ade-1b")
model = AutoModelForCausalLM.from_pretrained(
    "w601sxs/b1ade-1b",
    torch_dtype=torch.bfloat16,
    device_map="auto",
    revision="b4b0fd71589e6590089e1ec14a840ecab10894ae",
)
```