
Would it be possible to maintain the torch 2.6 builds for longer?

#1
by hbenoit - opened

axolotl currently defaults to torch 2.6.0, and getting flash-attn to run correctly on my setup is proving quite tricky (I get obscure symbol errors), so defaulting to the kernels builds has been very useful. However, the code that ran yesterday no longer runs because the builds were removed :)

kernels-community org

Hi @hbenoit ! As a workaround, you can refer to a previous revision by using a kernel id such as "kernels-community/flash-attn@56449c1". You need the latest transformers release (v4.55.1) or an install from main.

cc @drbh in case it makes sense to keep the 2.6 targets.
