jbilcke-hf (HF Staff) committed
Commit 8f50f73 · verified · 1 Parent(s): 347292e

Update requirements.txt

Files changed (1):
  requirements.txt (+4 -1)
requirements.txt CHANGED
@@ -20,7 +20,10 @@ numpy==1.24.4
 #flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
 
 # but this one does
-flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.0.post2/flash_attn-2.8.0.post2+cu12torch2.7cxx11abiTRUE-cp311-cp311-linux_x86_64.whl
+#flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.0.post2/flash_attn-2.8.0.post2+cu12torch2.7cxx11abiTRUE-cp311-cp311-linux_x86_64.whl
+
+# actually, let's just try this for now:
+flash_attn==2.8.0.post2
 
 opencv-python>=4.9.0.80
 diffusers==0.31.0
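For context on why the direct-URL pin is brittle: the wheel filename encodes exact compatibility constraints (CPython version, ABI, platform, plus flash-attn's local version tag naming the CUDA/torch/C++ ABI build), so it stops resolving whenever the environment drifts from that combination, while `flash_attn==2.8.0.post2` lets pip pick or build a matching artifact. A minimal sketch of reading those constraints off the filename, following PEP 427 wheel naming and the PEP 440 local version segment (the parsing helper here is illustrative, not part of this repo):

```python
# Decode the compatibility tags baked into the pinned wheel filename from
# the diff above. PEP 427 naming: name-version-pythontag-abitag-platformtag.whl;
# the "+cu12torch2.7cxx11abiTRUE" part is a PEP 440 local version segment.
wheel = "flash_attn-2.8.0.post2+cu12torch2.7cxx11abiTRUE-cp311-cp311-linux_x86_64.whl"

stem = wheel.removesuffix(".whl")
name, version_and_local, py_tag, abi_tag, plat_tag = stem.split("-")
version, _, local = version_and_local.partition("+")

print(name)       # flash_attn
print(version)    # 2.8.0.post2
print(local)      # cu12torch2.7cxx11abiTRUE -> CUDA 12, torch 2.7, cxx11 ABI on
print(py_tag, abi_tag, plat_tag)  # cp311 cp311 linux_x86_64
```

Any one of those fields changing in the runtime image (say, a torch upgrade) invalidates the URL pin, which matches the commented-out URLs left behind in the file.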