Update README.md
README.md CHANGED
@@ -38,7 +38,7 @@ They have been trained using our pre-trained **BPE tokenizer** with a vocabulary
 - **[Oute-Dev-1B-Checkpoint-40B](https://huggingface.co/OuteAI/Oute-Dev-1B-Checkpoint-40B)**: Built on the **LLaMa architecture**, trained on approximately **40 billion tokens**.
 
 > [!IMPORTANT]
-> These models were initially developed for internal testing and did not undergo extensive training.
+> These models were initially developed for internal testing and did not undergo extensive training. The output quality will not be suitable for production use or serious applications. You should expect inconsistent, potentially low-quality outputs.
 
 ### Benchmark Performance:
 