Releases: tridao/flash-attention-wheels

v2.0.8 (11 Aug 21:56)
Use torch 2.1.0.dev20230613 instead of a newer one

v2.0.7 (11 Aug 19:43)
Bump to v2.0.7

v0.0.26 (11 Aug 19:03)
Always clone cutlass

v0.0.25 (11 Aug 04:29)
Fix syntax in matrix.cuda-version checking

v0.0.22 (11 Aug 04:17)
Exclude CUDA 12.1 for torch <= 2.0

v0.0.21 (11 Aug 04:14)
Try torch 2.1dev

v0.0.20 (11 Aug 03:55)
Enable more Python versions

v0.0.19 (11 Aug 00:18)
Change package name to upload to PyPI

v0.0.18 (10 Aug 23:57)
Install lit before PyTorch

v0.0.17 (10 Aug 23:48)
Move installing setuptools to after torch install