Updating README and TOS
- NOTICE.txt +1 -0
- README.md +114 -1
NOTICE.txt
ADDED
@@ -0,0 +1 @@
Gemma is provided under and subject to the Gemma Terms of Use found at ai.google.dev/gemma/terms
README.md
CHANGED
@@ -1,3 +1,116 @@
---
library_name: ggml
tags:
- gguf
- quantized
- conversational
- brazilian-portuguese
- portuguese
- instruction-tuned
base_model: CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it
license: gemma
language:
- pt
pipeline_tag: text-generation
model_type: gemma3
quantized_by: Althayr
---

# Gemma-3-Gaia-PT-BR-4b-it-GGUF

This model was converted to GGUF format from [CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it](https://huggingface.co/CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it) using llama.cpp.

## About GAIA

**GAIA** is an open, state-of-the-art language model for Brazilian Portuguese. It was developed by continuously pre-training the google/gemma-3-4b-pt model on an extensive, high-quality corpus of Portuguese data. The goal of GAIA is to democratize access to cutting-edge AI technology in Brazil.

### Original Model Developed by

- Brazilian Association of AI (ABRIA)
- Center of Excellence in Artificial Intelligence (CEIA-UFG)
- Nama
- Amadeus AI
- Google DeepMind

## Model Details

- **Base Model**: [CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it](https://huggingface.co/CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it)
- **Original Model**: [google/gemma-3-4b-pt](https://huggingface.co/google/gemma-3-4b-pt)
- **Quantization**: BF16 (default)
- **Format**: GGUF
- **Size**: ~8.5GB
- **Parameters**: 4.3B
- **Architecture**: Gemma3
- **Context Window**: 128K tokens
- **Language**: Brazilian Portuguese
- **Converted by**: Althayr Nazaret ([GitHub](https://github.com/althayr) · [LinkedIn](https://www.linkedin.com/in/althayr-santos/) · [Hugging Face](https://huggingface.co/Althayr))

## Usage

### Ollama

```bash
ollama pull https://huggingface.co/Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF && \
ollama cp huggingface.co/Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF Gemma-3-Gaia-PT-BR-4b-it-GGUF && \
ollama run Gemma-3-Gaia-PT-BR-4b-it-GGUF
```
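
Once pulled and renamed as above, the model can also be queried through Ollama's local REST API. The request below is a minimal sketch, assuming Ollama is running on its default port (11434) and uses the model name from the commands above:

```bash
# Minimal sketch: single prompt against Ollama's local HTTP API (default port 11434).
# The model name assumes the `ollama cp` rename shown above.
curl http://localhost:11434/api/generate -d '{
  "model": "Gemma-3-Gaia-PT-BR-4b-it-GGUF",
  "prompt": "Me explique brevemente o que é inteligência artificial",
  "stream": false
}'
```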

### llama.cpp

#### Installation

Follow the OS-specific instructions at [llama.cpp](https://github.com/ggml-org/llama.cpp?tab=readme-ov-file#quick-start).
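
Optionally, you can download the GGUF file up front instead of letting llama.cpp fetch it on first use via `--hf-repo`. A minimal sketch using the Hugging Face CLI, assuming the file name used in the commands below:

```bash
# Sketch: download the GGUF file locally (requires: pip install -U "huggingface_hub[cli]").
# The file name matches the --hf-file value used in the llama.cpp commands below.
huggingface-cli download Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF \
  gemma-3-gaia-pt-br-4b-it.gguf --local-dir .
```

The local file can then be passed to `llama-cli` or `llama-server` with `-m gemma-3-gaia-pt-br-4b-it.gguf` instead of the `--hf-repo`/`--hf-file` flags.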

#### CLI Execution

```bash
llama-cli --hf-repo Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF --hf-file gemma-3-gaia-pt-br-4b-it.gguf -p "Me explique brevemente o que é inteligência artificial"
```

#### Server Execution

```bash
llama-server --hf-repo Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF --hf-file gemma-3-gaia-pt-br-4b-it.gguf -c 2048
```
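
`llama-server` exposes an OpenAI-compatible HTTP API. The request below is a minimal sketch, assuming the server is running locally on its default port (8080):

```bash
# Sketch: chat completion request against llama-server's OpenAI-compatible endpoint.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "user", "content": "Me explique brevemente o que é inteligência artificial"}
    ]
  }'
```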

## Capabilities

The model is optimized for Brazilian Portuguese tasks, including:

- 💬 **Conversation**: Chatbots and virtual assistants
- ❓ **Question Answering**: Factual question responses
- 📝 **Summarization**: Summarizing long texts
- ✍️ **Text Generation**: Creative content creation
- 🎯 **Sentiment Analysis**: Emotion analysis in text
- 🔍 **Text Understanding**: Document interpretation

## License and Terms

This model is provided under and subject to the **Gemma Terms of Use**. By downloading or using this model, you agree to be bound by these terms.

**Key obligations include:**

- Compliance with the [Gemma Prohibited Use Policy](https://ai.google.dev/gemma/prohibited_use_policy).
- Providing a copy of the [Gemma Terms of Use](https://ai.google.dev/gemma/terms) to any third-party recipients.
- Prominent notice that this is a modified (quantized) version.

## Citation

If you use this model in research or applications, please cite the original GAIA paper:

```bibtex
@misc{gaia-gemma-3-4b-2025,
  title={GAIA: An Open Language Model for Brazilian Portuguese},
  author={CAMILO-JUNIOR, C. G. and OLIVEIRA, S. S. T. and PEREIRA, L. A. and AMADEUS, M. and FAZZIONI, D. and NOVAIS, A. M. A. and JORDÃO, S. A. A.},
  year={2025},
  publisher={Hugging Face},
  journal={Hugging Face repository},
  howpublished={\url{https://huggingface.co/CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it}}
}
```

## Acknowledgments

If you use this specific GGUF version, please acknowledge:

- Original model: CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it
- GGUF conversion: Althayr/Gemma-3-Gaia-PT-BR-4b-it-GGUF