---
library_name: ggml
tags:
- gguf
- quantized
- conversational
- brazilian-portuguese
- portuguese
- instruction-tuned
base_model: CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it
license: gemma
language:
- pt
pipeline_tag: text-generation
model_type: gemma3
quantized_by: Althayr
---

# Gemma-3-Gaia-PT-BR-4b-it-Q8_0-GGUF

This model was converted to GGUF format from [CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it](https://huggingface.co/CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it) using llama.cpp.

## About GAIA

**GAIA** is an open, state-of-the-art language model for Brazilian Portuguese. It was developed by continuously pre-training the google/gemma-3-4b-pt model on an extensive, high-quality corpus of Portuguese data. The goal of GAIA is to democratize access to cutting-edge AI technology in Brazil.

### Original Model Developed by
- Brazilian Association of AI (ABRIA)
- Center of Excellence in Artificial Intelligence (CEIA-UFG)
- Nama
- Amadeus AI  
- Google DeepMind

## Model Details

- **Base Model**: [CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it](https://huggingface.co/CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it)
- **Original Model**: [google/gemma-3-4b-pt](https://huggingface.co/google/gemma-3-4b-pt)
- **Quantization**: Q8_0 (8-bit)
- **Format**: GGUF
- **Size**: ~4.13 GB (original model: ~8.5 GB)
- **Parameters**: 4.3B
- **Architecture**: Gemma3
- **Context Window**: 128K tokens
- **Language**: Brazilian Portuguese
- **Converted by**: Althayr Nazaret [![GitHub](https://img.shields.io/badge/GitHub-100000?style=flat&logo=github&logoColor=white)](https://github.com/althayr) [![LinkedIn](https://img.shields.io/badge/LinkedIn-0077B5?style=flat&logo=linkedin&logoColor=white)](https://www.linkedin.com/in/althayr-santos/) [![HuggingFace](https://img.shields.io/badge/🤗_Hugging_Face-FFD21E?style=flat)](https://huggingface.co/Althayr)

## Usage

### Ollama

```bash
# Pull the GGUF from Hugging Face, give it a shorter local name, then start chatting
ollama pull huggingface.co/Althayr/Gemma-3-Gaia-PT-BR-4b-it-Q8_0-GGUF && \
ollama cp huggingface.co/Althayr/Gemma-3-Gaia-PT-BR-4b-it-Q8_0-GGUF Gemma-3-Gaia-PT-BR-4b-it-Q8_0 && \
ollama run Gemma-3-Gaia-PT-BR-4b-it-Q8_0
```
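
Once the model is registered locally, it can also be run non-interactively with a one-shot prompt (the local model name below assumes the `ollama cp` step above was used):

```bash
# One-shot prompt against the locally registered model
ollama run Gemma-3-Gaia-PT-BR-4b-it-Q8_0 "Me explique brevemente o que é inteligência artificial"
```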

### llama.cpp

#### Installation

Follow the OS-specific instructions in the [llama.cpp quick start](https://github.com/ggml-org/llama.cpp?tab=readme-ov-file#quick-start).

#### CLI Execution
```bash
llama-cli --hf-repo Althayr/Gemma-3-Gaia-PT-BR-4b-it-Q8_0-GGUF --hf-file gemma-3-gaia-pt-br-4b-it-q8_0.gguf -p "Me explique brevemente o que é inteligência artificial"
```
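
For multi-turn chat, recent llama.cpp builds also provide an interactive conversation mode via the `-cnv` flag. A minimal sketch, assuming your build includes it:

```bash
# Interactive chat session using the model's built-in chat template
llama-cli --hf-repo Althayr/Gemma-3-Gaia-PT-BR-4b-it-Q8_0-GGUF --hf-file gemma-3-gaia-pt-br-4b-it-q8_0.gguf -cnv
```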

#### Server Execution
```bash
llama-server --hf-repo Althayr/Gemma-3-Gaia-PT-BR-4b-it-Q8_0-GGUF --hf-file gemma-3-gaia-pt-br-4b-it-q8_0.gguf -c 2048
```
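
Once the server is running, it exposes an OpenAI-compatible chat endpoint. A minimal sketch, assuming the default address `http://localhost:8080` (adjust host and port if you changed them):

```bash
# Send a chat request to llama-server's OpenAI-compatible endpoint
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "user", "content": "Resuma em uma frase o que é aprendizado de máquina."}
    ],
    "temperature": 0.7
  }'
```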


## Capabilities

The model is optimized for Brazilian Portuguese tasks, including:

- 💬 **Conversation** - Chatbots and virtual assistants
- ❓ **Question Answering** - Factual question responses
- 📝 **Summarization** - Summarizing long texts
- ✍️ **Text Generation** - Creative content creation
- 🎯 **Sentiment Analysis** - Emotion analysis in text
- 🔍 **Text Understanding** - Document interpretation
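
For example, a sentiment-analysis style prompt can be sent with the same `llama-cli` invocation shown in the Usage section (the prompt text below is only an illustration):

```bash
# Ask the model to classify sentiment in Brazilian Portuguese
llama-cli --hf-repo Althayr/Gemma-3-Gaia-PT-BR-4b-it-Q8_0-GGUF --hf-file gemma-3-gaia-pt-br-4b-it-q8_0.gguf \
  -p "Classifique o sentimento da seguinte frase como positivo, negativo ou neutro: 'O atendimento foi excelente e rápido.'"
```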

## License and Terms

This model is provided under and subject to the **Gemma Terms of Use**.
By downloading or using this model, you agree to be bound by these terms.

**Key obligations include:**
*   Compliance with the [Gemma Prohibited Use Policy](https://ai.google.dev/gemma/prohibited_use_policy).
*   Providing a copy of the [Gemma Terms of Use](https://ai.google.dev/gemma/terms) to any third-party recipients.
*   Providing prominent notice that this is a modified (quantized) version.

## Citation

If you use this model in research or applications, please cite the original GAIA paper:

```bibtex
@misc{gaia-gemma-3-4b-2025,
  title={GAIA: An Open Language Model for Brazilian Portuguese},
  author={Camilo-Junior, C. G. and Oliveira, S. S. T. and Pereira, L. A. and Amadeus, M. and Fazzioni, D. and Novais, A. M. A. and Jordão, S. A. A.},
  year={2025},
  publisher={Hugging Face},
  journal={Hugging Face repository},
  howpublished={\url{https://huggingface.co/CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it}}
}
```

## Acknowledgments

If you use this specific GGUF version, please acknowledge:

- Original model: CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it
- GGUF conversion: Althayr/Gemma-3-Gaia-PT-BR-4b-it-Q8_0-GGUF