Update README.md
README.md CHANGED

@@ -27,7 +27,7 @@ MiMo-7B-RL is a powerful 7B parameter language model developed by Xiaomi, specif
 - **Context Length**: 32,768 tokens
 - **Architecture**: Modified transformer with 36 layers, 32 attention heads
 - **Original Format**: SafeTensors
-- **Converted Format**: GGUF
+- **Converted Format**: GGUF
 - **License**: MIT

 Key features of the original model: