Update README.md
Less repetitive (though it depends on your own prompt and settings).
I have tested with 49,444/65,536 tokens with no degradation, although I notice it is actually learning the context better, and that has a strong impact on the output.

(What I don't like is that it learns the previous context (of turns) too quickly and sets it as the new standard.)

This model was merged using the TIES merge method with ZeroAgency/Mistral-Small-3.1-24B-Instruct-2503-hf as the base.

Models Merged:

- PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
- Gryphe/Pantheon-RP-1.8-24b-Small-3.1
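For reference, a TIES merge of these models can be expressed as a mergekit-style config. This is only a sketch: the `density`, `weight`, `normalize`, and `dtype` values below are illustrative assumptions, not the actual settings used for this merge.

```yaml
# Hypothetical mergekit config for a TIES merge of the models above.
# density/weight/normalize/dtype are assumed values, not the ones used here.
merge_method: ties
base_model: ZeroAgency/Mistral-Small-3.1-24B-Instruct-2503-hf
models:
  - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
    parameters:
      density: 0.5   # fraction of delta weights kept per model (assumed)
      weight: 0.5    # relative contribution to the merge (assumed)
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
    parameters:
      density: 0.5
      weight: 0.5
parameters:
  normalize: true
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yml ./output-dir`; the actual recipe for this model may differ.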