torch==2.3.1
torchvision==0.18.1
transformers==4.42.3
opencv-python-headless<4.10
peft<0.14.0
timm==1.0.9
einops==0.8.0
# flash_attn 2.6.3 prebuilt wheel (CUDA 12.3, torch 2.3, CPython 3.10, Linux x86_64):
https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
sentencepiece==0.2.0
mmengine<1
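# Installation sketch (the filename below is an assumption; the flash_attn wheel above
# targets CUDA 12.3, torch 2.3, and CPython 3.10 on Linux x86_64, so other environments
# may need a different wheel or a source build of flash-attention):
#   pip install -r requirements.txt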