Flash Attention on Blackwell
#9 · opened by klkoe
Since flash-attn doesn't support the RTX 5080 (Blackwell architecture) yet, can this be run without flash attention?
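If the model is loaded through Hugging Face `transformers` (an assumption — the thread doesn't say which loader the project uses), one common workaround is to pass `attn_implementation="sdpa"` so the model falls back to PyTorch's built-in scaled-dot-product attention instead of the `flash_attn` package. A minimal sketch; `pick_attn_implementation` is a hypothetical helper, not part of the project:

```python
import importlib.util

def pick_attn_implementation(prefer_flash: bool = True) -> str:
    """Return an attention backend name for transformers' `attn_implementation`.

    Falls back to "sdpa" (PyTorch's built-in scaled_dot_product_attention)
    whenever the flash_attn package is unavailable or not preferred, e.g. on
    GPUs it doesn't support yet.
    """
    if prefer_flash and importlib.util.find_spec("flash_attn") is not None:
        return "flash_attention_2"
    return "sdpa"  # no flash-attn wheel required

# Hypothetical usage (model_id is a placeholder):
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     model_id,
#     attn_implementation=pick_attn_implementation(),
# )
```

`attn_implementation="eager"` is another accepted value if SDPA also causes trouble, at a further speed cost.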