# Code World Model (CWM)

This repository contains an MLX conversion of CWM (Code World Model) with 8-bit quantization.
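For reference, a conversion like this one can typically be reproduced with mlx_lm's `convert` API. The sketch below is an assumption about how the weights were produced, not a record of the actual conversion; in particular, the upstream Hugging Face repo id `facebook/cwm` is assumed and may differ.

```python
# Hypothetical reproduction of an 8-bit MLX conversion of CWM.
# The upstream repo id "facebook/cwm" is an assumption, not confirmed here.
from mlx_lm import convert

convert(
    hf_path="facebook/cwm",   # assumed source model on the Hugging Face Hub
    mlx_path="cwm-mlx-8bit",  # output directory for the converted weights
    quantize=True,            # quantize during conversion
    q_bits=8,                 # 8-bit weights, matching this repository
)
```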

Example generation with the mlx_lm command-line interface:

```
mlx_lm.generate --model . --prompt "hello"
==========
Okay, the user just said "hello". I need to respond in a friendly way. Maybe I can ask how they're doing or if there's something I can help them with. Let me think of a few options. I could say "Hello! How are you today?" or "Hi! Is there something I can help you with?" or maybe just "Hi! What's up?" I should choose something that's welcoming and shows I'm ready to chat.
</think>
Hello! How
==========
Prompt: 14 tokens, 17.840 tokens-per-sec
Generation: 100 tokens, 10.276 tokens-per-sec
Peak memory: 34.695 GB
```
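The same generation can be run from Python. Below is a minimal sketch using the standard mlx_lm API, assuming this repository's directory is the current working directory; the prompt and token limit mirror the CLI run above.

```python
# Minimal sketch of the CLI run above using the mlx_lm Python API.
from mlx_lm import load, generate

model, tokenizer = load(".")  # load the converted weights from this repository

# CWM is a chat/reasoning model, so format the prompt with the chat template.
messages = [{"role": "user", "content": "hello"}]
prompt = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, tokenize=False
)

text = generate(model, tokenizer, prompt=prompt, max_tokens=100, verbose=True)
print(text)
```

Note that the transcript above stops mid-sentence because generation hit the 100-token limit; raise it with `--max-tokens` on the CLI or `max_tokens` in the Python call to get a complete response.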
Model size: 32.6B parameters (Safetensors; tensor types: BF16, U32)