---
tags:
  - uqff
  - mistral.rs
base_model: Qwen/Qwen3-0.6B
base_model_relation: quantized
---

# Qwen/Qwen3-0.6B, UQFF quantization

Run with [mistral.rs](https://github.com/EricLBuehler/mistral.rs). See the UQFF documentation in the mistral.rs repository for details.

1. Flexible 🌀: Multiple quantization formats in one file format with one framework to run them all.
2. Reliable 🔒: Compatibility ensured with embedded and checked semantic versioning information from day 1.
3. Easy 🤗: Download UQFF models easily and quickly from Hugging Face, or use a local file (see the sketch after this list).
4. Customizable 🛠️: Make and publish your own UQFF files in minutes.
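As a minimal sketch of the "local file" workflow, the snippet below fetches one of the UQFF files from this repository with `hf_hub_download` from the `huggingface_hub` package. The chosen filename (`qwen30.6b-q4k-0.uqff`) is just one of the variants listed in the table below; swap in whichever quantization you want.

```python
# Minimal sketch: download a single UQFF file from this repository ahead of time.
# Requires the huggingface_hub package (pip install huggingface_hub).
from huggingface_hub import hf_hub_download

# Downloads the Q4K variant into the local Hugging Face cache and returns its path.
local_path = hf_hub_download(
    repo_id="EricB/Qwen3-0.6B-UQFF",
    filename="qwen30.6b-q4k-0.uqff",
)
print(local_path)  # point mistral.rs at this path if you prefer a local file
```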

## Examples

| Quantization type(s) | Example |
| -------------------- | ------- |
| AFQ2 | `./mistralrs-server -i plain -m EricB/Qwen3-0.6B-UQFF -f qwen30.6b-afq2-0.uqff` |
| AFQ3 | `./mistralrs-server -i plain -m EricB/Qwen3-0.6B-UQFF -f qwen30.6b-afq3-0.uqff` |
| AFQ4 | `./mistralrs-server -i plain -m EricB/Qwen3-0.6B-UQFF -f qwen30.6b-afq4-0.uqff` |
| AFQ6 | `./mistralrs-server -i plain -m EricB/Qwen3-0.6B-UQFF -f qwen30.6b-afq6-0.uqff` |
| AFQ8 | `./mistralrs-server -i plain -m EricB/Qwen3-0.6B-UQFF -f qwen30.6b-afq8-0.uqff` |
| F8E4M3 | `./mistralrs-server -i plain -m EricB/Qwen3-0.6B-UQFF -f qwen30.6b-f8e4m3-0.uqff` |
| Q2K | `./mistralrs-server -i plain -m EricB/Qwen3-0.6B-UQFF -f qwen30.6b-q2k-0.uqff` |
| Q3K | `./mistralrs-server -i plain -m EricB/Qwen3-0.6B-UQFF -f qwen30.6b-q3k-0.uqff` |
| Q4K | `./mistralrs-server -i plain -m EricB/Qwen3-0.6B-UQFF -f qwen30.6b-q4k-0.uqff` |
| Q5K | `./mistralrs-server -i plain -m EricB/Qwen3-0.6B-UQFF -f qwen30.6b-q5k-0.uqff` |
| Q8_0 | `./mistralrs-server -i plain -m EricB/Qwen3-0.6B-UQFF -f qwen30.6b-q8_0-0.uqff` |
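
Each command above launches `mistralrs-server`, which serves an OpenAI-compatible HTTP API. The sketch below shows one way to query it from any OpenAI client; it assumes the server was started with `--port 1234`, and the model name passed here is only an assumption, so check the id reported at `/v1/models` on your running server.

```python
# Minimal sketch: query a running mistralrs-server through its OpenAI-compatible API.
# Assumes the server was launched with `--port 1234`; adjust base_url and model to your setup.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="EMPTY")  # no real key needed locally

resp = client.chat.completions.create(
    model="EricB/Qwen3-0.6B-UQFF",  # hypothetical name; use the id listed at /v1/models
    messages=[{"role": "user", "content": "In one sentence, what is UQFF?"}],
    max_tokens=128,
)
print(resp.choices[0].message.content)
```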