Releases · tridao/flash-attention-wheels
v2.0.8
Use torch 2.1.0.dev20230613 instead of the newest nightly
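A hedged sketch of what that pin might look like in the wheel-build workflow; the step name and index URL are assumptions, and only the version string comes from the release note:

```yaml
# Sketch only: pin the torch nightly to a fixed date instead of
# tracking the latest dev build, keeping wheel builds reproducible.
- name: Install pinned torch nightly
  run: |
    pip install --pre torch==2.1.0.dev20230613 \
      --index-url https://download.pytorch.org/whl/nightly/cu118
```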
v2.0.7
Bump to v2.0.7
v0.0.26
Always clone cutlass
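One common way to guarantee the submodule is always present is a recursive checkout; this is a sketch, and the actual workflow may instead clone cutlass with an explicit git command:

```yaml
# Sketch: fetch submodules (including cutlass) on every build path,
# rather than only when a cached copy happens to be missing.
- uses: actions/checkout@v3
  with:
    submodules: recursive
```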
v0.0.25
Fix syntax in the matrix.cuda-version check
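For context, matrix.cuda-version is read through a GitHub Actions expression; a hypothetical example of the kind of conditional being repaired (the step and values are assumptions, not the repo's actual workflow):

```yaml
# Hypothetical: string literals inside ${{ }} need single quotes,
# a frequent source of workflow syntax errors.
- name: Set up CUDA
  if: ${{ matrix.cuda-version != 'cpu' }}
  run: echo "Building against CUDA ${{ matrix.cuda-version }}"
```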
v0.0.22
Exclude CUDA 12.1 for torch <= 2.0
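A minimal sketch, assuming a standard GitHub Actions strategy.matrix, of how such an exclusion is typically expressed; the key names and version lists are illustrative:

```yaml
strategy:
  matrix:
    torch-version: ['1.13.1', '2.0.1', '2.1.0.dev20230613']
    cuda-version: ['11.8.0', '12.1.0']
    exclude:
      # CUDA 12.1 wheels are built only for torch newer than 2.0
      - torch-version: '1.13.1'
        cuda-version: '12.1.0'
      - torch-version: '2.0.1'
        cuda-version: '12.1.0'
```

Each exclude entry removes one exact combination from the job matrix, so no wheel is built for a torch/CUDA pairing that torch itself was never released against.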
v0.0.21
Try torch 2.1dev
v0.0.20
Enable more Python versions
v0.0.19
Change the package name for uploading to PyPI
v0.0.18
Install lit before pytorch (see the install-order sketch after v0.0.17)
v0.0.17
Move the setuptools install to after the torch install
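v0.0.18 and v0.0.17 both adjust install ordering; a combined sketch of the resulting sequence, with assumed step names and index URL:

```yaml
# Sketch: lit goes in before torch (presumably required by torch's
# triton dependency at the time), setuptools after torch.
- name: Install lit
  run: pip install lit
- name: Install torch nightly
  run: pip install --pre torch --index-url https://download.pytorch.org/whl/nightly/cu118
- name: Install setuptools
  run: pip install setuptools
```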