
Error when loading this model

#5 by psp-dada - opened

With the latest transformers release (transformers==4.46.2), I try to load this model with:

    from transformers import AutoModelForCausalLM

    model_name = "openbmb/RLHF-V"

    # self.model_dir, self.torch_dtype, and self.device are set elsewhere in the class
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        cache_dir=self.model_dir,
        torch_dtype=self.torch_dtype,
        low_cpu_mem_usage=True,
        device_map=self.device,
        attn_implementation="flash_attention_2" if "cuda" in str(self.device) else None,
    ).eval()

An error occurred:

ValueError: The checkpoint you are trying to load has model type `beit3_llava` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
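
For anyone hitting the same error: `beit3_llava` is a custom architecture that is not registered in the transformers library, so `AutoModelForCausalLM` cannot resolve it on its own. A common workaround for custom checkpoints is to pass `trust_remote_code=True`, which lets transformers run the modeling code shipped in the model repository, assuming the repo actually provides it. A minimal sketch, untested against this particular checkpoint:

    import torch
    from transformers import AutoModelForCausalLM

    # Sketch only: this works if the repo ships modeling code for `beit3_llava`;
    # otherwise the model must be loaded through its own codebase (e.g. the
    # RLHF-V project repository) rather than through AutoModelForCausalLM.
    model = AutoModelForCausalLM.from_pretrained(
        "openbmb/RLHF-V",
        trust_remote_code=True,  # allow repo-provided modeling code to register the architecture
        torch_dtype=torch.float16,
        low_cpu_mem_usage=True,
        device_map="auto",
    ).eval()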
psp-dada changed discussion status to closed
