Welcome FalconMamba: The first strong attention-free 7B model
pip install transformers>=4.39.0 galore-torch
pip install bitsandbytes>=0.43.0
#ProudlyGpuPoor
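As an illustration, here is a minimal sketch of how GaLore can be enabled through the transformers Trainer once those packages are installed. It assumes the optim="galore_adamw" / optim_target_modules options added around transformers 4.39; the output directory, module patterns, and hyperparameters are placeholders, not from the original post.

```python
# Minimal sketch (placeholders, not from the original post):
# turning on the GaLore optimizer via TrainingArguments.
# Assumes transformers>=4.39.0 and galore-torch are installed.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="galore-ft",                # placeholder output path
    per_device_train_batch_size=1,
    gradient_accumulation_steps=4,
    optim="galore_adamw",                  # GaLore AdamW; the 8-bit variant "galore_adamw_8bit" needs bitsandbytes>=0.43.0
    optim_target_modules=["attn", "mlp"],  # name patterns of modules whose gradients get low-rank projected
)
# `args` is then passed to a standard Trainer(model=..., args=args, train_dataset=...) call.
```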
Exciting release!
Simply add use_dora=True to your LoraConfig, as sketched below. Find out more about this method here: https://arxiv.org/abs/2402.09353
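For context, a short sketch of enabling DoRA with peft's LoraConfig; the base model, rank, and target modules below are illustrative placeholders, not choices from the post.

```python
# Minimal sketch (placeholders, not from the original post): enabling DoRA in peft.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")  # placeholder base model

config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],  # placeholder target modules
    use_dora=True,                        # DoRA: decompose weights into magnitude and direction
)

model = get_peft_model(model, config)
model.print_trainable_parameters()  # adapter stays LoRA-sized; only the decomposition changes
```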