spkshumway committed
Commit dc813be · verified · 1 Parent(s): 532dc45

Adding pip install bitsandbytes for Colab and Kaggle compatibility


We mention this model in our Colab + Hugging Face integration blog ( https://medium.com/google-colab/launch-hugging-face-models-in-colab-for-faster-ai-exploration-bee261978cf9 ), but bitsandbytes is not pre-installed in Colab (we'll add it sometime next week), so the notebook fails out of the box for anyone trying to run it in Colab or Kaggle. In the meantime, adding "!pip install -U bitsandbytes --no-deps" at the top makes it work out of the box in both Kaggle and Colab.

Files changed (1)
  1. notebook.ipynb +1 -0
notebook.ipynb CHANGED
@@ -112,6 +112,7 @@
   }
  ],
  "source": [
+ "!pip install -U bitsandbytes --no-deps\n",
  "from transformers import AutoModelForCausalLM, AutoTokenizer\n",
  "\n",
  "MODEL_NAME = 'NousResearch/Genstruct-7B'\n",