Issue when running inference with the 4B model
Hi,
Thanks a lot for sharing the model!
I'm trying to run it using the Quick Start code shared on the HF page, but I'm running into an issue. Here is the error I get:
```
/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py in _sample(self, input_ids, logits_processor, stopping_criteria, generation_config, synced_gpus, streamer, **model_kwargs)
3249
3250 if is_prefill:
-> 3251 outputs = self(**model_inputs, return_dict=True)
3252 is_prefill = False
3253 else:
TypeError: Qwen2ForCausalLM(
(model): Qwen2Model(
(embed_tokens): Embedding(151679, 2048)
(layers): ModuleList(
(0-35): 36 x Qwen2DecoderLayer(
(self_attn): Qwen2FlashAttention2(
(q_proj): Linear(in_features=2048, out_features=2048, bias=True)
(k_proj): Linear(in_features=2048, out_features=256, bias=True)
(v_proj): Linear(in_features=2048, out_features=256, bias=True)
(o_proj): Linear(in_features=2048, out_features=2048, bias=False)
(rotary_emb): Qwen2RotaryEmbedding()
)
(mlp): Qwen2MLP(
(gate_proj): Linear(in_features=2048, out_features=11008, bias=False)
(up_proj): Linear(in_features=2048, out_features=11008, bias=False)
(down_proj): Linear(in_features=11008, out_features=2048, bias=False)
(act_fn): SiLU()
)
(input_layernorm): Qwen2RMSNorm((2048,), eps=1e-06)
(post_attention_layernorm): Qwen2RMSNorm((2048,), eps=1e-06)
)
)
(norm): Qwen2RMSNorm((2048,), eps=1e-06)
(rotary_emb): Qwen2RotaryEmbedding()
)
(lm_head): Linear(in_features=2048, out_features=151679, bias=False)
) got multiple values for keyword argument 'return_dict'
```
It seems there is a compatibility problem with the Qwen2 model class: its forward() ends up receiving return_dict twice.
I'm working on Colab Enterprise with 2 A100 GPUs.
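For reference, here is how I checked my environment (a minimal sketch; I've omitted the printed output):

```python
import torch
import transformers

# Environment details for reproducibility (Colab Enterprise).
print("transformers:", transformers.__version__)
print("torch:", torch.__version__)
print("GPUs:", torch.cuda.device_count())  # 2x A100 in my setup
```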
Any idea that could help solve the problem?
Thanks in advance
I get `ImportError: cannot import name 'Qwen2Config' from 'transformers'`. Did you have any issues with this?
Thanks for your interest in our work. Please refer to https://github.com/magic-research/Sa2VA/blob/main/requirements.txt for the dependency versions. The transformers library changes rapidly, so please make the versions exactly the same.
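For example, a quick way to check whether your environment matches (a minimal sketch; the exact pinned version lives in requirements.txt and is not hard-coded here):

```python
import transformers

# Compare the installed version against the pin in Sa2VA's requirements.txt
# (https://github.com/magic-research/Sa2VA/blob/main/requirements.txt).
# Both errors above (the return_dict TypeError and the Qwen2Config
# ImportError) are typical symptoms of a version mismatch.
print("installed transformers:", transformers.__version__)

# If it does not match, reinstall from a checkout of the repo, e.g.:
#   git clone https://github.com/magic-research/Sa2VA.git
#   cd Sa2VA && pip install -r requirements.txt
```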