Not able to load model while finetuning

#7 opened by abpani1994

Loading the model produces the error below:

RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:3 and cuda:0!

import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch_dtype,
    bnb_4bit_use_double_quant=True,
    # bnb_4bit_quant_storage=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
    attn_implementation=attn_implementation,
    torch_dtype=torch_dtype,
)
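For context, this error commonly appears when device_map="auto" shards the quantized model across several GPUs while the finetuning setup (for example DDP via torchrun or accelerate) expects each process to hold the full model on a single device. A minimal sketch of the usual workaround, assuming a launcher that sets the LOCAL_RANK environment variable, pins the whole model to the current process's GPU; model_id, bnb_config, torch_dtype, and attn_implementation are carried over from the snippet above:

import os

# Pin every module to this process's GPU instead of letting "auto"
# spread the layers across cuda:0..cuda:3.
local_rank = int(os.environ.get("LOCAL_RANK", "0"))

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map={"": local_rank},  # empty key maps the entire model to one device
    attn_implementation=attn_implementation,
    torch_dtype=torch_dtype,
)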

