Releases · tridao/flash-attention-wheels
v2.0.8.post7
Don't clear disk space
v2.0.8.post6
Try manylinux container
v2.0.8.post5
Clear disk space
v2.0.8.post4
Use CUDA network install
v2.0.8.post3
Use ubuntu-20.04 runner
v2.0.8.post12
Compile all headdims
v2.0.8.post11
Convert cxx11_abi to string
v2.0.8.post10
Try to compile for cxx11 ABI as well
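The two notes above ("Convert cxx11_abi to string", "Try to compile for cxx11 ABI as well") refer to building wheels for both C++11 ABI variants of PyTorch and embedding the ABI flag in the wheel name. As a minimal sketch of what the string conversion might look like (the `cxx11_abi_tag` helper and the exact `cxx11abiTRUE`/`cxx11abiFALSE` naming are assumptions, not taken from this repo's build scripts):

```python
def cxx11_abi_tag(flag: bool) -> str:
    # Turn the boolean _GLIBCXX_USE_CXX11_ABI flag into the string
    # embedded in wheel filenames, e.g. "cxx11abiTRUE" or "cxx11abiFALSE".
    # (Hypothetical helper; the naming scheme is assumed.)
    return f"cxx11abi{str(flag).upper()}"

# In a real build, the flag could be read from the installed PyTorch via
# torch.compiled_with_cxx11_abi(); a literal is used here to stay self-contained.
print(cxx11_abi_tag(False))
```

On an actual build machine one would query PyTorch for the flag rather than hard-coding it, so each wheel advertises the ABI it was compiled against.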
v2.0.8.post2
Bump version
v2.0.8.post1
Fix exclusion of new PyTorch versions