The Omega Directive Qwen3 14B v1.1
Quant by FrenzyBiscuit

AWQ Details
- Model weights were quantized to INT4 using GEMM kernels.
- Zero-point quantization
- Group size of 64
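
A minimal sketch of how these settings map onto an AutoAWQ quantization config. The source checkpoint path is a placeholder (the exact repo used for quantization is not stated here), and calibration falls back to AutoAWQ's default dataset:

```python
# Sketch of the AWQ settings listed above, using AutoAWQ.
# The source checkpoint path is a placeholder, not the actual repo used.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "path/to/The-Omega-Directive-Qwen3-14B-v1.1"   # placeholder source model
quant_path = "The-Omega-Directive-Qwen3-14B-v1.1-AWQ"       # output directory

quant_config = {
    "w_bit": 4,          # INT4 weights
    "q_group_size": 64,  # group size of 64
    "zero_point": True,  # zero-point quantization
    "version": "GEMM",   # GEMM kernels
}

model = AutoAWQForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# Calibrates on AutoAWQ's default dataset, then writes the quantized weights.
model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```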
Model tree for FrenzyBiscuit/The-Omega-Directive-Qwen3-14B-v1.1-AWQ
- Base model: Qwen/Qwen3-14B-Base
- Finetuned: Qwen/Qwen3-14B
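
A minimal loading sketch, assuming a recent transformers build with AWQ support (autoawq installed) and enough GPU memory for the INT4 weights; the prompt and generation settings are illustrative only:

```python
# Load the quantized model and run a short generation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FrenzyBiscuit/The-Omega-Directive-Qwen3-14B-v1.1-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

messages = [{"role": "user", "content": "Write a short scene set on a derelict station."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```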