Tags: Text Generation, GGUF, English, Mixture of Experts, 8x3B, Llama 3.2 MOE, 128k context, creative, creative writing, fiction writing, plot generation, sub-plot generation, story generation, scene continue, storytelling, fiction story, science fiction, romance, all genres, story, writing, vivid prosing, vivid writing, fiction, roleplaying, bfloat16, swearing, rp, horror, mergekit, conversational
Update README.md

README.md CHANGED
@@ -34,7 +34,7 @@ tags:
 pipeline_tag: text-generation
 ---
 
-<B><font color="red">WARNING:</font> NSFW. Vivid prose. INTENSE. Visceral Details.
+<B><font color="red">WARNING:</font> NSFW. Vivid prose. INTENSE. Visceral Details. Light HORROR. Swearing. UNCENSORED... humor, romance, fun. </B>
 
 <h2>Llama-3.2-8X3B-MOE-Dark-Champion-Instruct-uncensored-abliterated-18.4B-GGUF</h2>
 
@@ -70,12 +70,11 @@ Several outputs below, including 2, 4 and 8 experts used.
 - Role-players: Careful raising temp too high as it may affect instruction following.
 - This model works with rep pen of 1 or higher, 1.02+ recommended.
 - If you want a specific type of prose (IE horror) add in "(vivid horror)" or "(graphic vivid horror)" (no quotes) in your prompt(s).
-- A lot of GPTisms have been removed. There are still a few however - errrrr.
-- This is not a "happy ever after" model. It has a negative bias.
-- Output length will vary however this model prefers
+- A lot of GPTisms have been removed. There are still a few however - errrrr. Higher "temps" will help with this issue.
+- This is not a "happy ever after" model but it is also not "horror". It has a light negative bias.
+- Output length will vary however this model prefers slightly longer outputs unless you state the size.
 - For creative uses, different quants will produce slightly different output.
 - Due to the high stability and compressed nature of this model, all quants will operate at above average levels.
-- If you use rope to extend context, increase temp AND instructions detail levels to compensate for "rope issues".
 - Source code for this model and Imatrix GGUFs versions will be uploaded shortly at separate repos.
 
 <B>Meet the Team: Mixture of Experts Models</b>
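The sampling guidance in this commit (rep pen of 1.02+, cautious temperature, and trying 2, 4, or 8 active experts) could be sketched as a llama.cpp invocation. This is a hedged example only: the quant filename and prompt are placeholders, and the expert-count override is an assumption about recent llama.cpp builds, not something the card itself documents.

```shell
# Sketch of running this GGUF with llama.cpp's CLI.
# The .gguf filename below is a PLACEHOLDER (pick a real quant from the repo).
llama-cli \
  -m ./Llama-3.2-8X3B-MOE-Dark-Champion-Instruct-uncensored-abliterated-18.4B-Q4_K_M.gguf \
  --temp 1.0 \
  --repeat-penalty 1.02 \
  --override-kv llama.expert_used_count=int:4 \
  -p "(vivid horror) Continue the scene: "
# --repeat-penalty 1.02 follows the card's "rep pen of 1 or higher, 1.02+" advice.
# Keep --temp modest for role-play; the card warns high temp can hurt
# instruction following. The --override-kv line (an assumption about llama.cpp's
# KV-override feature) is one way to vary the number of active experts.
```

Not runnable without a downloaded quant file, so treat it as a configuration fragment rather than a tested command.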