Deprecation warning
README.md
CHANGED
@@ -23,7 +23,7 @@ datasets:
 </br>
 <img align="left" width="40" height="40" src="https://github.githubassets.com/images/icons/emoji/unicode/1f917.png">
 <p style="text-align: center;"> This is the model card for Gervásio 7B PTPT Decoder.
-  You may be interested in some of the other models in the <a href="https://huggingface.co/PORTULAN">Albertina (encoders)
+  You may be interested in some of the other models in the <a href="https://huggingface.co/PORTULAN">Albertina (encoders), Gervásio (decoders) and Serafim (sentence encoder) families</a>.
 </p>
 </br>
 </br>
@@ -32,9 +32,14 @@ datasets:
 
 </br>
 
-**Gervásio PT*** is a **fully open** decoder for the **Portuguese language**.
+This model has been **deprecated**.
+
+We recommend you use the improved [**gervasio-8b-portuguese-ptpt-decoder**](https://huggingface.co/PORTULAN/gervasio-8b-portuguese-ptpt-decoder).
 
 
+<!--
+**Gervásio PT*** is a **fully open** decoder for the **Portuguese language**.
+
 It is a **decoder** of the LLaMA family, based on the neural architecture Transformer and developed over the LLaMA-2 7B model.
 Its further improvement through additional training was done over language resources that include new instruction data sets of Portuguese prepared for this purpose ([extraGLUE-Instruct
 ](https://huggingface.co/datasets/PORTULAN/extraglue-instruct)).
@@ -156,4 +161,5 @@ grant PINFRA/22117/2016; research project GPT-PT - Transformer-based Decoder for
 grant CPCA-IAC/AV/478395/2022; innovation project
 ACCELERAT.AI - Multilingual Intelligent Contact Centers, funded by IAPMEI, I.P. - Agência para a Competitividade e Inovação
 under the grant C625734525-00462629, of Plano de Recuperação e Resiliência,
-call RE-C05-i01.01 – Agendas/Alianças Mobilizadoras para a Reindustrialização.
+call RE-C05-i01.01 – Agendas/Alianças Mobilizadoras para a Reindustrialização.
+-->