update readme for card generation

#128
opened by ariG23498 (HF staff)

You can now run GGUF models from the Hugging Face Hub directly with ollama.

This PR updates the card-generation code so that the generated README outlines the ollama usage.
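
For context, here is a minimal sketch of the kind of section the card-generation code could emit. It assumes the card text is assembled as a string in app.py; generate_ollama_section and the example repo id are illustrative names, not the Space's actual identifiers.

```python
# Minimal sketch (illustrative names only; the real card-generation code lives in app.py).

def generate_ollama_section(new_repo_id: str) -> str:
    """Return a model-card section showing how to run the GGUF repo with ollama."""
    lines = [
        "## Use with ollama",
        "",
        # Indented line renders as a code block in the generated README.
        f"    ollama run hf.co/{new_repo_id}",
        "",
    ]
    return "\n".join(lines)

print(generate_ollama_section("someuser/SomeModel-Q4_K_M-GGUF"))
```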

ggml.ai org

The idea LGTM overall; just one small change is needed (if I understand correctly):

It should be ollama run hf.co/{model_id}, not new_repo_url, because new_repo_url already contains the https://huggingface.co part.
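
To make the point concrete, here is a small illustration; the variable names are assumed to mirror the comment above and may not match app.py exactly.

```python
# Assumed names mirroring the discussion, not necessarily those in app.py.
new_repo_id = "someuser/SomeModel-Q4_K_M-GGUF"
new_repo_url = f"https://huggingface.co/{new_repo_id}"

# Interpolating new_repo_url yields "ollama run hf.co/https://huggingface.co/...",
# which is not a valid ollama reference.
broken = f"ollama run hf.co/{new_repo_url}"

# Interpolating the bare repo id yields the intended "ollama run hf.co/someuser/...".
fixed = f"ollama run hf.co/{new_repo_id}"

print(broken)
print(fixed)
```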

Thank you for catching it! I have updated the code as requested.

CC: @pcuenq @reach-vb

ggml.ai org

Cool! Thanks @ariG23498

Let's wait and see if @reach-vb wants to change some wording or add other links ;-)

ggml.ai org

I'm not sure we need to explicitly mention this particular consumer of GGUF files, as there are many other downstream solutions out there. Is there a canonical list of consumer apps anywhere? What do you think, @reach-vb?

Cannot merge
This branch has merge conflicts in the following files:
  • app.py
