pablo-rf committed (verified)
Commit 29e4bd8
Parent: 98c9f7f

Update README.md

Files changed (1):
  1. README.md +4 -4
README.md CHANGED
@@ -15,7 +15,7 @@ pipeline_tag: text-generation
 library_name: transformers
 ---
 
-# Llama-Carvalho-HQ
+# Llama-Carvalho-PT-GL
 
 ## Table of Contents
 <details>
@@ -41,13 +41,13 @@ library_name: transformers
 
 ## Model description
 
-**Llama-Carvalho-HQ** is an 8B-parameter transformer-based causal language model for Galician, Portuguese, Spanish and English.
+**Llama-Carvalho-PT-GL** is an 8B-parameter transformer-based causal language model for Galician, Portuguese, Spanish and English.
 It is the result of continual pretraining of [meta-llama/Llama-3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) on a multilingual corpus consisting of 540M tokens of plain text and 72M tokens of instructions (formatted as plain text).
 
 This model is part of the **Carvalho family**, a family of LLMs specialized in Portuguese and Galician. Smaller models can be found [here](https://huggingface.co/Nos-PT/Carvalho_pt-gl-1.3B).
 ## Intended uses and limitations
 
-The **Llama-Carvalho-HQ** model is ready to use only for causal language modeling.
+The **Llama-Carvalho-PT-GL** model is ready to use only for causal language modeling.
 It can perform text-generation tasks and be fine-tuned for specific scenarios.
 
 ## How to use
@@ -57,7 +57,7 @@ from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM
 
 input_text = "Hoxe fai un bo día. O sol "
 
-model_id = "Nos-PT/Llama-Carvalho-HQ"
+model_id = "Nos-PT/Llama-Carvalho-PT-GL"
 tokenizer = AutoTokenizer.from_pretrained(model_id)
 model = AutoModelForCausalLM.from_pretrained(model_id)
 generator = pipeline(
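
For reference, the usage snippet in the last hunk is cut off at the hunk boundary, leaving the `pipeline(` call open. Below is a minimal sketch of what the complete example could look like, assuming the standard Transformers text-generation pipeline; the generation parameters (`max_new_tokens`, `do_sample`) are illustrative and are not taken from the original README.

```python
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

input_text = "Hoxe fai un bo día. O sol "

model_id = "Nos-PT/Llama-Carvalho-PT-GL"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Wrap the loaded model and tokenizer in a text-generation pipeline.
generator = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)

# Generate a continuation of the Galician prompt.
# max_new_tokens and do_sample are illustrative values, not from the README.
outputs = generator(input_text, max_new_tokens=50, do_sample=True)
print(outputs[0]["generated_text"])
```

The pipeline returns a list of dictionaries; the `generated_text` field contains the prompt followed by the sampled continuation.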