This is an EXL2 quant (5.0 bits per weight, 8-bit head) of Sao10K/Llama-3.3-70B-Vulpecula-r1.

EXL2 Quants by ArtusDev.

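For reference, the weights can be fetched with `huggingface_hub` and loaded with the ExLlamaV2 library. The sketch below is a minimal, hedged example: the repo ID matches this page, but the local directory and prompt are placeholders, and the loading/generation calls follow ExLlamaV2's dynamic generator example code, so the exact API may differ between library versions.

```python
# Sketch: download this EXL2 quant and run a short generation with ExLlamaV2.
# Assumes exllamav2 and huggingface_hub are installed and enough GPU VRAM is
# available for a 70B model at 5.0 bpw (roughly 45+ GB across one or more GPUs).
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

# Download the quantized weights; local_dir is a placeholder path.
model_dir = snapshot_download(
    repo_id="ArtusDev/Llama-3.3-70B-Vulpecula-r1_EXL2_5.0bpw_H8",
    local_dir="Llama-3.3-70B-Vulpecula-r1_EXL2_5.0bpw_H8",
)

# Load the model, letting ExLlamaV2 split layers across available GPUs.
config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

# Generate a short completion from a placeholder prompt.
generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
print(generator.generate(prompt="Hello, my name is", max_new_tokens=64))
```

EXL2 quants can also be served through ExLlamaV2-based frontends such as TabbyAPI or text-generation-webui instead of the raw library API.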