# EXL3 Quants of alpindale/goliath-120b

EXL3 quants of alpindale/goliath-120b using exllamav3 for quantization.

## Quants

| Quant (Revision) | Bits per Weight | Head Bits |
| ---------------- | --------------- | --------- |
| 1.8_H5           | 1.8             | 5         |
| 2.0_H4           | 2.0             | 4         |
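As a rough guide to which quant fits your hardware, the on-disk (and approximate VRAM) footprint is simply parameters × bits per weight ÷ 8. The sketch below assumes a round figure of 120e9 parameters for goliath-120b and ignores the small extra cost of the higher-precision head layers and of KV-cache/activation memory:

```python
def approx_model_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Rough footprint estimate in decimal GB: params * bpw / 8 bytes.

    Ignores the (small) extra cost of the higher-precision head layers
    and any runtime KV-cache / activation memory.
    """
    return n_params * bits_per_weight / 8 / 1e9

# Assuming ~120e9 parameters (round figure, not the exact count):
print(approx_model_size_gb(120e9, 1.8))  # ~27 GB for the 1.8_H5 quant
print(approx_model_size_gb(120e9, 2.0))  # ~30 GB for the 2.0_H4 quant
```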

## Downloading quants with huggingface-cli


Install huggingface-cli:

```shell
pip install -U "huggingface_hub[cli]"
```

Download a quant by targeting the specific quant revision (branch), e.g. `2.0_H4` from the table above:

```shell
huggingface-cli download ArtusDev/alpindale_goliath-120b-EXL3 --revision "2.0_H4" --local-dir ./
```
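If you script downloads for several revisions, a tiny helper keeps the command consistent. This is purely illustrative (the function name is mine, not part of any API); in Python you could equally call `huggingface_hub.snapshot_download` with the same `repo_id`/`revision`/`local_dir` arguments:

```python
def hf_download_cmd(repo_id: str, revision: str, local_dir: str = "./") -> str:
    """Build the huggingface-cli command for one quant revision (branch).

    Python-side alternative: huggingface_hub.snapshot_download(
        repo_id=repo_id, revision=revision, local_dir=local_dir)
    """
    return (
        f"huggingface-cli download {repo_id} "
        f'--revision "{revision}" --local-dir {local_dir}'
    )

cmd = hf_download_cmd("ArtusDev/alpindale_goliath-120b-EXL3", "2.0_H4")
print(cmd)
```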
