Releases · tridao/flash-attention-wheels
v0.0.16
Add publishing of the package to PyPI
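Below is a minimal sketch of what a PyPI publish step can look like in a GitHub Actions workflow. `pypa/gh-action-pypi-publish` is the standard community action for this; the job layout and secret name here are assumptions for illustration, not this repo's actual workflow.

```yaml
# Hypothetical publish job; job, step, and secret names are assumptions.
publish:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v3
    - name: Build the sdist
      run: |
        python -m pip install --upgrade build
        python -m build --sdist
    - name: Publish to PyPI
      uses: pypa/gh-action-pypi-publish@release/v1
      with:
        password: ${{ secrets.PYPI_API_TOKEN }}
```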
v0.0.15
Try upgrading pip
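Upgrading the runner's bundled pip before installing anything else is the usual fix when an older pip cannot resolve newer wheels or index options. A one-step sketch (step name assumed):

```yaml
- name: Upgrade pip
  run: python -m pip install --upgrade pip
```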
v0.0.14
Switch PyTorch install to use --index-url
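`--index-url` points pip at a complete package index, such as PyTorch's per-CUDA-version indexes, instead of a flat `--find-links` page. A sketch of the new-style install; the torch and CUDA versions are assumptions:

```yaml
- name: Install PyTorch from a per-CUDA index
  run: |
    # --index-url replaces the default index entirely, unlike -f/--find-links,
    # which only adds a flat HTML page of candidate links.
    pip install torch==2.0.1 --index-url https://download.pytorch.org/whl/cu118
```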
v0.0.13
Only use the network method for CUDA 11.8
v0.0.12
Try the network method of installing CUDA
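NVIDIA's "network" method installs CUDA from an apt repository rather than one large local runfile, which keeps the download small and lets apt resolve components. A sketch assuming an Ubuntu 22.04 runner and CUDA 11.8; the keyring URL follows NVIDIA's documented repo layout, and the `if:` condition mirrors the CUDA-11.8-only gating above, though the matrix key name is an assumption:

```yaml
- name: Install CUDA 11.8 via the network repo
  if: matrix.cuda-version == '11.8'  # matrix key name is an assumption
  run: |
    wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb
    sudo dpkg -i cuda-keyring_1.1-1_all.deb
    sudo apt-get update
    sudo apt-get -y install cuda-toolkit-11-8
```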
v0.0.11
Use setuptools==68.0.0
v0.0.10
Update setuptools before installing PyTorch
v0.0.9
Use a newer version of setuptools
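Taken together, v0.0.9 through v0.0.11 converge on upgrading and then pinning setuptools before PyTorch is installed, so the later build sees a known-good version. A sketch of that ordering; the index URL is carried over from the `--index-url` example above and remains an assumption:

```yaml
- name: Pin setuptools, then install PyTorch
  run: |
    # Order matters: pin setuptools first so the torch install and the
    # subsequent wheel build both run against the intended version.
    pip install setuptools==68.0.0
    pip install torch --index-url https://download.pytorch.org/whl/cu118
```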
v0.0.8
Change URL to blah/torch_stable.html
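The note above leaves the host elided ("blah"). For context only: PyTorch's long-standing flat find-links page is `torch_stable.html` under its download host; whether that is the URL meant here is an assumption. Used with `-f`, such an install looks like:

```yaml
- name: Install PyTorch via find-links (legacy style)
  run: |
    # Host and version are assumptions; the release note leaves the host elided.
    pip install torch==1.13.1+cu117 -f https://download.pytorch.org/whl/torch_stable.html
```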
v0.0.7
Fix MATRIX_TORCH_VERSION
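`MATRIX_TORCH_VERSION` reads like an environment variable derived from the build matrix, for example the major.minor part of the torch version. A hypothetical sketch of setting such a variable; the matrix key and the exact derivation are assumptions:

```yaml
strategy:
  matrix:
    torch-version: ["2.0.1", "2.1.0"]  # example values, an assumption
steps:
  - name: Derive MATRIX_TORCH_VERSION
    run: |
      # Keep only major.minor, e.g. 2.0.1 -> 2.0.
      echo "MATRIX_TORCH_VERSION=$(echo ${{ matrix.torch-version }} | cut -d. -f1,2)" >> $GITHUB_ENV
```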