4.0 bpw EXL3 quant of Qwen/Qwen3-235B-A22B

Runs on 5x 24 GB GPUs with 16k context (Q6 cache).
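To fetch the quantized weights locally before loading them with an EXL3-capable backend, a minimal sketch using `huggingface_hub` is shown below. The `repo_id` here is a placeholder for this quant's actual repository name, which is not stated above; substitute the real repo id.

```python
from huggingface_hub import snapshot_download

# Hypothetical repo id for illustration only; replace with this quant's actual repo.
snapshot_download(
    repo_id="your-username/Qwen3-235B-A22B-exl3-4.0bpw",
    local_dir="Qwen3-235B-A22B-exl3-4.0bpw",
)
```

Once downloaded, point your EXL3 loader at the local directory and split the model across the five GPUs; with 16k context, a quantized (Q6) cache keeps the footprint within 5x 24 GB.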