configuration_minicpm.py needs to be fixed
#9 · opened by JamePeng2023
try:
    import flash_attn
    self._attn_implementation = "flash_attention_2"
except:
    self._attn_implementation = "eager"
https://github.com/huggingface/transformers/blob/main/src/transformers/configuration_utils.py#L346
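Paraphrased, the property at that link behaves roughly as follows (a sketch of the linked transformers source; the exact code may differ between versions):

class PretrainedConfig:
    @property
    def _attn_implementation(self):
        # Fall back to "eager" whenever nothing was set explicitly.
        if hasattr(self, "_attn_implementation_internal"):
            if self._attn_implementation_internal is None:
                return "eager"
            return self._attn_implementation_internal
        return "eager"

    @_attn_implementation.setter
    def _attn_implementation(self, value):
        self._attn_implementation_internal = value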
The default value of "_attn_implementation" is "eager", so you don't have to set it to "eager" manually.
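Given that default, a minimal sketch of the simplification (the class name ExampleConfig is hypothetical, standing in for the MiniCPM config class): assign _attn_implementation only when flash_attn is importable, and catch ImportError specifically rather than using a bare except.

class ExampleConfig:
    def __init__(self):
        try:
            import flash_attn  # noqa: F401 -- presence check only
            self._attn_implementation = "flash_attention_2"
        except ImportError:
            pass  # transformers' default already resolves to "eager"

Environments without flash_attn then keep the library default instead of pinning "eager" explicitly.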
JamePeng2023 changed discussion status to closed