These are GGUF builds of this model, created to make it more portable. You can run one in Ollama as follows:
Create a file named `Modelfile` (no extension) containing the line `FROM {gguf file}.gguf`, then save it. In a terminal, change into the folder containing the GGUF model and run `ollama create {model name} -f Modelfile`, as shown in the sketch below.
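A minimal sketch of the full sequence; `mymodel.gguf` and `mymodel` are placeholder names, so substitute the actual GGUF filename and whatever model name you prefer:

```sh
# Change into the folder that contains the downloaded .gguf file
cd /path/to/folder-with-gguf

# Write a Modelfile whose FROM directive points at the GGUF weights
echo 'FROM ./mymodel.gguf' > Modelfile

# Register the model with Ollama under the name "mymodel"
ollama create mymodel -f Modelfile

# Start an interactive session with the new model
ollama run mymodel
```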