---
library_name: ggml
tags:
- gguf
- quantized
- conversational
- brazilian-portuguese
- portuguese
- instruction-tuned
base_model: CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it
license: gemma
language:
- pt
pipeline_tag: text-generation
model_type: gemma3
quantized_by: Althayr
---

# Gemma-3-Gaia-PT-BR-4b-it-GGUF

This model was converted to GGUF format from [CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it](https://huggingface.co/CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it) using llama.cpp.
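
The exact conversion command is not recorded in this card. As a rough sketch (an assumed workflow, not necessarily the one used here), llama.cpp typically produces a BF16 GGUF from the original checkpoint along these lines:

```bash
# Sketch only: download the original weights, then convert to a BF16 GGUF
# with llama.cpp's converter. The exact invocation used for this repository
# may have differed.
huggingface-cli download CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it --local-dir Gemma-3-Gaia-PT-BR-4b-it

git clone https://github.com/ggml-org/llama.cpp
pip install -r llama.cpp/requirements.txt

python llama.cpp/convert_hf_to_gguf.py Gemma-3-Gaia-PT-BR-4b-it \
  --outtype bf16 \
  --outfile gemma-3-gaia-pt-br-4b-it.gguf
```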

## About GAIA

**GAIA** is an open, state-of-the-art language model for Brazilian Portuguese. It was developed by continuously pre-training the google/gemma-3-4b-pt model on an extensive, high-quality corpus of Portuguese data. The goal of GAIA is to democratize access to cutting-edge AI technology in Brazil.

### Original Model Developed by

- Brazilian Association of AI (ABRIA)
- Center of Excellence in Artificial Intelligence (CEIA-UFG)
- Nama
- Amadeus AI
- Google DeepMind

## Model Details

- **Base Model**: [CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it](https://huggingface.co/CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it)
- **Original Model**: [google/gemma-3-4b-pt](https://huggingface.co/google/gemma-3-4b-pt)
- **Quantization**: BF16 (default)
- **Format**: GGUF
- **Size**: ~8.5 GB
- **Parameters**: 4.3B
- **Architecture**: Gemma3
- **Context Window**: 128K tokens
- **Language**: Brazilian Portuguese
- **Converted by**: Althayr Nazaret ([GitHub](https://github.com/althayr) · [LinkedIn](https://www.linkedin.com/in/althayr-santos/) · [Hugging Face](https://huggingface.co/Althayr))
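
If you prefer to manage the GGUF file yourself rather than letting a runtime fetch it, it can be downloaded directly from the Hub (the filename below is the one referenced in the llama.cpp examples):

```bash
# Download the BF16 GGUF (~8.5 GB) into the current directory
huggingface-cli download Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF \
  gemma-3-gaia-pt-br-4b-it.gguf --local-dir .
```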

## Usage

### Ollama

```bash
ollama pull huggingface.co/Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF && \
ollama cp huggingface.co/Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF Gemma-3-Gaia-PT-BR-4b-it-GGUF && \
ollama run Gemma-3-Gaia-PT-BR-4b-it-GGUF
```
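
Once the model is registered locally, it can also be queried through Ollama's REST API (served on port 11434 by default), for example:

```bash
# Non-streaming generation request against the local Ollama instance
curl http://localhost:11434/api/generate -d '{
  "model": "Gemma-3-Gaia-PT-BR-4b-it-GGUF",
  "prompt": "Me explique brevemente o que é inteligência artificial",
  "stream": false
}'
```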

### llama.cpp

#### Installation

Follow the OS-specific instructions in the [llama.cpp quick start guide](https://github.com/ggml-org/llama.cpp?tab=readme-ov-file#quick-start).
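
For example, on macOS or Linux one convenient route is Homebrew (one of several installation options covered in the linked guide):

```bash
# Installs the llama-cli and llama-server binaries; see the quick start
# guide for source builds and other platforms
brew install llama.cpp
```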

#### CLI Execution

```bash
llama-cli --hf-repo Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF --hf-file gemma-3-gaia-pt-br-4b-it.gguf -p "Me explique brevemente o que é inteligência artificial"
```

#### Server Execution

```bash
llama-server --hf-repo Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF --hf-file gemma-3-gaia-pt-br-4b-it.gguf -c 2048
```
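
llama-server exposes an OpenAI-compatible HTTP API on port 8080 by default, so the running model can be queried with a standard chat-completions request:

```bash
# Chat completion request against the local llama-server instance
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "user", "content": "Me explique brevemente o que é inteligência artificial"}
    ]
  }'
```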

## Capabilities

The model is optimized for Brazilian Portuguese tasks, including:

- 💬 **Conversation** - Chatbots and virtual assistants
- ❓ **Question Answering** - Factual question responses
- 📝 **Summarization** - Summarizing long texts
- ✍️ **Text Generation** - Creative content creation
- 🎯 **Sentiment Analysis** - Emotion analysis in text
- 🔍 **Text Understanding** - Document interpretation

## License and Terms

This model is provided under and subject to the **Gemma Terms of Use**. By downloading or using this model, you agree to be bound by these terms.

**Key obligations include:**

* Complying with the [Gemma Prohibited Use Policy](https://ai.google.dev/gemma/prohibited_use_policy).
* Providing a copy of the [Gemma Terms of Use](https://ai.google.dev/gemma/terms) to any third-party recipients.
* Giving prominent notice that this is a modified (quantized) version.

## Citation

If you use this model in research or applications, please cite the original GAIA work:

```bibtex
@misc{gaia-gemma-3-4b-2025,
  title={GAIA: An Open Language Model for Brazilian Portuguese},
  author={Camilo-Junior, C. G. and Oliveira, S. S. T. and Pereira, L. A. and Amadeus, M. and Fazzioni, D. and Novais, A. M. A. and Jordão, S. A. A.},
  year={2025},
  publisher={Hugging Face},
  journal={Hugging Face repository},
  howpublished={\url{https://huggingface.co/CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it}}
}
```

## Acknowledgments

If you use this specific GGUF version, please acknowledge:

- Original model: [CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it](https://huggingface.co/CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it)
- GGUF conversion: [Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF](https://huggingface.co/Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF)