Model stopped working
Hello, I was trying to use your model and it's no longer working. What happened?
Failed to perform inference: an error occurred while streaming the response: Model doesn't exist: CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it
I'm sorry you're running into issues with the CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it model. To help identify what's going on and find a solution, I'll need some additional information.
How are you trying to use the model? For instance, are you using a specific platform (like Hugging Face Spaces, Google Colab, etc.) or are you running the code locally?
Which frameworks or libraries are you using? (For example, transformers, PyTorch, TensorFlow, LangChain, etc.)
Could you share the code snippet you're using to interact with the model? This would be extremely helpful in understanding the context of the error.
With this information, I can investigate what happened and provide you with more precise support.
Hey, I believe it's because of your version of transformers. Try using 4.52.4, or the version mentioned in the model's config.
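If it helps, here is a minimal sketch of how you might verify your installed transformers version before loading the model. The `4.52.4` pin comes from the suggestion above; the version-parsing helper and the commented-out pipeline call are just illustrations, not an official check from the library:

```python
def parse_version(v: str) -> tuple:
    # Turn "4.52.4" into (4, 52, 4) for comparison; non-numeric
    # pre-release tags like "dev0" are simply ignored here.
    return tuple(int(part) for part in v.split(".") if part.isdigit())

def version_ok(installed: str, required: str) -> bool:
    # True when the installed release is at least the required one.
    return parse_version(installed) >= parse_version(required)

REQUIRED = "4.52.4"  # version suggested in this thread

try:
    import transformers
    installed = transformers.__version__
except ImportError:
    installed = None

if installed is None:
    print(f"transformers not installed; try: pip install transformers=={REQUIRED}")
elif not version_ok(installed, REQUIRED):
    print(f"transformers {installed} found; try: pip install transformers=={REQUIRED}")
else:
    print(f"transformers {installed} should be recent enough.")
    # With a compatible version, loading should work as usual, e.g.:
    # from transformers import pipeline
    # pipe = pipeline("text-generation", model="CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it")
```

If you're on a hosted platform rather than a local environment, check which transformers version the platform ships, since you may not control it directly.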