MLX Quants of Mawdistical/Mawnia-12B
MLX quants of Mawdistical/Mawnia-12B using mlx-lm for quantization on Apple Silicon.
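For reference, quants like these can be produced with mlx-lm's `convert` API. A minimal sketch, assuming an 8-bit quantization; the exact `q_bits`/`q_group_size` used for each published repo may differ:

```python
from mlx_lm import convert

# Download the original weights, quantize them, and write an MLX model folder.
# q_bits=8 and the output path are assumptions for illustration.
convert(
    "Mawdistical/Mawnia-12B",
    mlx_path="Mawnia-12B_mlx-hi_8bpw",
    quantize=True,
    q_bits=8,
)
```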
Quants
Configure mlx-lm
uv venv                   # first-time setup with uv (optional)
uv pip install -U mlx-lm  # or without uv: pip install -U mlx-lm
The uv wrapper is optional but recommended; get it with Homebrew:
brew install uv
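Optionally, confirm the environment before serving. This quick check only assumes a working MLX install on Apple Silicon:

```python
# Sanity check: MLX and mlx-lm import cleanly and the GPU device is visible.
from importlib.metadata import version

import mlx.core as mx
import mlx_lm  # noqa: F401

print(mx.default_device())  # typically Device(gpu, 0) on Apple Silicon
print(version("mlx-lm"))    # installed mlx-lm version
```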
Serve an OpenAI-compatible endpoint
uv run mlx_lm.server --model soundTeam/Mawnia-12B_mlx-hi_8bpw --temp 1.35 --min-p 0.07  # drop "uv run" if not using uv
The default URL is http://127.0.0.1:8080/v1
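Any OpenAI-compatible client can then hit the endpoint. A minimal sketch using `requests`; the sampling values shown simply mirror the flags above and can be omitted, since the server already applies its own defaults:

```python
import requests

resp = requests.post(
    "http://127.0.0.1:8080/v1/chat/completions",
    json={
        "messages": [{"role": "user", "content": "hello"}],
        "temperature": 1.35,  # matches the recommended temp from the model card
        "max_tokens": 256,
    },
    timeout=600,
)
print(resp.json()["choices"][0]["message"]["content"])
```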
Programmatic usage
from mlx_lm import load, generate

model, tokenizer = load("soundTeam/Mawnia-12B-MLX_hi")

prompt = "hello"

# Wrap the prompt in the model's chat template when one is available.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
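The call above uses mlx-lm's default sampling. To apply the recommended settings from the original card (temp 1.35, min_p 0.05) programmatically, recent mlx-lm releases accept a sampler; a sketch assuming `mlx_lm.sample_utils.make_sampler` is available in your installed version:

```python
from mlx_lm import load, generate
from mlx_lm.sample_utils import make_sampler

model, tokenizer = load("soundTeam/Mawnia-12B-MLX_hi")

messages = [{"role": "user", "content": "hello"}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

# Sampler mirroring the recommended settings from the original model card.
sampler = make_sampler(temp=1.35, min_p=0.05)
response = generate(model, tokenizer, prompt=prompt, sampler=sampler, verbose=True)
```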
Original model card
D1 Mawnipulatine Mansplainer Maxxing
✧ Browse the Mawnia-12B Collection
✧ Recommended Settings
Temp: 1.35, min_p: 0.05
Experiment to find your taste.
✧ Finetuned Set Info
↳ 2.2K Unique Set Scenarios
Entry Composition
↳ Heavy Violence
↳ Dark Themes
↳ Extensive NSFL
↳ Non-lethal but enough to scare people.
↳ Heavy NSFW
↳ Triggered on user action or passively wherever RP is seen
↳ Modern City Scenes
↳ Dystopia Scenes
↳ Grey Line Consent Scenes
↳ Note: Grey line consent means consent may be mutual but unspoken at times, or taken to be accepted through the actions of either the character or the user. May lead to over-possessiveness in some instances.
Creation Composition
↳ Hybrid Organic-Synthetic mix
Token Count
↳ 13.5 M
↳ Note: token count calculated solely on outputs.
Type
↳ Private In-House
Focus
↳ Male Leaning
↳ Anthro
↳ Xeno-Likeness
↳ Passive Negative Bias with mild narcissistic tendencies, matches energy with subtle languidness.
✧ Technical Details
↳ allura-org/Gemma-3-Glitter-12B
↳ A creative writing model based on Gemma 3 12B IT
↳ 50/50 merge of two separate trains by ToastyPigeon (listed below)
↳ Vision capabilities restored
↳ g3-12b-rp-system-v0.1
↳ ~13.5M tokens of instruct-based RP training (2:1 human to synthetic) with system prompt examples
↳ g3-12b-storyteller-v0.2-textonly
↳ ~20M tokens of completion training on long-form creative writing (1.6M synthetic from R1, remainder human-created)
✧ Credits
↳ @Mawnipulator
Government Bodies
↳ @ArtusDev
↳ @SaisExperiments
↳ ALLURA-ORG
License: CC BY-NC 4.0
Services such as Arli AI and Featherless AI are granted a CC BY-ND 4.0 license for use of this model.