---
language:
- en
base_model:
- Qwen/Qwen2.5-Math-7B-Instruct
- rombodawg/Rombos-Coder-V2.5-Qwen-7b
- rombodawg/Rombos-LLM-V2.5-Qwen-7b
---
# Utility 19B MoE (3x7B)

This Mixture-of-Experts model combines the following three 7B experts:

1. [rombodawg/Rombos-LLM-V2.5-Qwen-7b](https://huggingface.co/rombodawg/Rombos-LLM-V2.5-Qwen-7b)
2. [rombodawg/Rombos-Coder-V2.5-Qwen-7b](https://huggingface.co/rombodawg/Rombos-Coder-V2.5-Qwen-7b)
3. [Qwen/Qwen2.5-Math-7B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Math-7B-Instruct)
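
The card does not say how the merge was produced. A common way to assemble a 3x7B MoE from dense Qwen checkpoints is [mergekit](https://github.com/arcee-ai/mergekit)'s `mergekit-moe` tool; the sketch below shows roughly what such a configuration looks like. The gate prompts and the choice of attention donor are illustrative assumptions, not the published recipe.

```yaml
# Hypothetical mergekit-moe config; the actual merge recipe is not published here.
base_model: rombodawg/Rombos-LLM-V2.5-Qwen-7b  # assumed donor for the shared (non-expert) weights
gate_mode: hidden   # route tokens using hidden-state representations of the prompts below
dtype: bfloat16
experts:
  - source_model: rombodawg/Rombos-LLM-V2.5-Qwen-7b
    positive_prompts:
      - "general conversation and reasoning"
  - source_model: rombodawg/Rombos-Coder-V2.5-Qwen-7b
    positive_prompts:
      - "write a function in Python"
  - source_model: Qwen/Qwen2.5-Math-7B-Instruct
    positive_prompts:
      - "solve this math problem step by step"
```

Loading and prompting should follow the standard `transformers` chat-template flow for Qwen2.5-based models, as in the sketch below. The repository ID is a placeholder; substitute the actual Hub path of this model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo ID -- replace with the actual Hub path of this model.
model_id = "your-namespace/Utility-19B-MoE-3x7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~38 GB of weights at bf16 for 19B parameters
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Write a Python function that checks whether a number is prime."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```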