Inconsistent torch_dtype
#2
opened by dtthanh
In How to Use:

model = AutoModelForCausalLM.from_pretrained("NumbersStation/nsql-llama-2-7B", torch_dtype=torch.bfloat16)

But in config.json:

"torch_dtype": "float32"

Which torch dtype does the model actually use?
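For reference, the value recorded in config.json can be inspected without downloading the weights; a minimal sketch, assuming the Hugging Face transformers library is installed:

from transformers import AutoConfig

# Read the dtype recorded in config.json (no weight download needed)
config = AutoConfig.from_pretrained("NumbersStation/nsql-llama-2-7B")
print(config.torch_dtype)  # config.json reports float32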
Please use torch.bfloat16
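For context, a torch_dtype argument passed explicitly to from_pretrained takes precedence over the value stored in config.json, so loading as recommended yields bfloat16 weights. A minimal sketch; the tokenizer line is an assumption added for completeness:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Explicit torch_dtype overrides the float32 recorded in config.json
tokenizer = AutoTokenizer.from_pretrained("NumbersStation/nsql-llama-2-7B")
model = AutoModelForCausalLM.from_pretrained(
    "NumbersStation/nsql-llama-2-7B", torch_dtype=torch.bfloat16
)
print(model.dtype)  # torch.bfloat16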
senwu changed discussion status to closed