Doesn't work. (#276)
Opened by PacmanGraphics
I'm following the instructions from this article:
https://towardsdatascience.com/run-bloom-the-largest-open-access-ai-model-on-your-desktop-computer-f48e1e2a9a32
It just keeps hanging on this line:
input_ids = tokenizer.encode(input_sentence, return_tensors='pt').to(device)
while consuming CPU and memory.
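For context, here is roughly the setup I'm running, reconstructed from the article. The prompt text, dtype, and generation parameters are placeholders on my side, not the article's exact values; the `tokenizer.encode` call marked below is where it hangs:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load the tokenizer and model. Loading the tokenizer and encoding a short
# prompt should normally take seconds, even though the bloom weights are huge.
tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom")
model = AutoModelForCausalLM.from_pretrained(
    "bigscience/bloom", torch_dtype=torch.bfloat16
).to(device)

# Placeholder prompt for illustration
input_sentence = "The capital of France is"

# This is the line that hangs for me, consuming CPU and memory
input_ids = tokenizer.encode(input_sentence, return_tensors="pt").to(device)

# Never reached
output = model.generate(input_ids, max_new_tokens=50)
print(tokenizer.decode(output[0]))
```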
See images: https://huggingface.co/bigscience/bloom/discussions/194#660ce838188ee489f0bde62a