EXL2 quants of GLM-4-9B-0414

Available quantizations:

4.5 bits per weight: 6.32 GB
6.0 bits per weight: 7.85 GB
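
As a rough usage sketch, the snippet below downloads one of these quants from adriabama06/GLM-4-9B-0414-exl2 and loads it with the exllamav2 runtime (the library the EXL2 format targets). The local layout, the `revision` argument, and the generation settings are illustrative assumptions; check the repository for how the two bitrates are actually organized (separate branches or folders), and make sure your exllamav2 build supports the GLM-4 architecture.

```python
# Minimal sketch: fetch and run an EXL2 quant with exllamav2.
# Assumptions: the 4.5 bpw files may live on their own branch (pass its name
# as `revision`, or drop the argument if everything is on `main`), and the
# installed exllamav2 version supports the GLM-4 architecture.
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

# Download the quantized weights (revision name below is a placeholder).
model_dir = snapshot_download(
    "adriabama06/GLM-4-9B-0414-exl2",
    # revision="4.5bpw",  # uncomment and adjust if the quants sit on branches
)

# Load the model, splitting layers across available GPUs as needed.
config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

# Generate a short completion.
generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
print(generator.generate(prompt="What is GLM-4-9B?", max_new_tokens=128))
```

The 6.0 bpw quant costs roughly 1.5 GB more weight memory than the 4.5 bpw one in exchange for higher fidelity; pick the bitrate that still fits your GPU after accounting for cache and activation memory.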