vaibhavpandeyvpz committed on
Commit e1fa59a · 1 Parent(s): 3f66ae1

Install flash-attn using wheel

Files changed (1)
  1. requirements.txt +1 -1
requirements.txt CHANGED
@@ -17,4 +17,4 @@ huggingface-hub
 Pillow
 numpy>=1.23.5,<2
 einops
-
+https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.18/flash_attn-2.7.4+cu128torch2.8-cp310-cp310-linux_x86_64.whl
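Pinning a direct wheel URL in requirements.txt lets pip install flash-attn without compiling it from source, but the wheel only matches one environment (here CUDA 12.8, torch 2.8, CPython 3.10, linux_x86_64). A minimal sketch of guarding against a mismatched runtime at import time — the `HAS_FLASH_ATTN` flag name is illustrative, not part of the flash-attn API:

```python
# Hedged sketch: detect whether the prebuilt flash-attn wheel is usable
# in the current environment, and fall back gracefully if it is not.
try:
    import flash_attn  # provided by the wheel URL pinned in requirements.txt
    HAS_FLASH_ATTN = True
except ImportError:
    # Wheel tags (CUDA/torch/Python/platform) did not match, or install failed.
    HAS_FLASH_ATTN = False

print(HAS_FLASH_ATTN)
```

Downstream model code can then branch on this flag to select a standard attention implementation when the wheel is unavailable.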