winglian committed on
Commit ffac902 · unverified · 1 Parent(s): 15f6e57

bump flash-attn to 2.0.4 for the base docker image (#382)

Files changed (1)
  1. docker/Dockerfile-base +1 -1
docker/Dockerfile-base CHANGED
@@ -40,7 +40,7 @@ ARG TORCH_CUDA_ARCH_LIST="7.0 7.5 8.0 8.6 9.0+PTX"
 
 RUN git clone https://github.com/Dao-AILab/flash-attention.git && \
     cd flash-attention && \
-    git checkout v2.0.1 && \
+    git checkout v2.0.4 && \
     python3 setup.py bdist_wheel && \
     cd csrc/fused_dense_lib && \
     python3 setup.py bdist_wheel && \
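After building the base image with this pin, one way to sanity-check that the wheel installed into the image really is 2.0.4 is to import the package inside a throwaway container. This is a sketch, not part of the commit: the image tag `axolotl-base` is a hypothetical placeholder for whatever tag your `docker build` used.

```shell
# Hypothetical verification step (assumes the image was built and
# tagged "axolotl-base"; substitute your own tag).
docker run --rm axolotl-base \
    python3 -c "import flash_attn; print(flash_attn.__version__)"
```

If the printed version is not 2.0.4, the build likely picked up a cached layer from before the `git checkout` change; rebuilding with `--no-cache` for that stage would force the new tag to be checked out.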