GPTQ 4-bit, 128 group size quantization of https://huggingface.co/digitous/13B-HyperMantis
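
Below is a minimal loading sketch using AutoGPTQ. It assumes the repo's weights are in an AutoGPTQ-compatible layout and stored as safetensors; the exact file names and options for this checkpoint may differ, so treat this as illustrative rather than definitive.

```python
# Illustrative sketch: load a 4-bit, 128-group-size GPTQ checkpoint with AutoGPTQ.
# Assumptions (not confirmed by the repo): safetensors weight file, CUDA device available.
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

repo = "digitous/13B-HyperMantis_GPTQ_4bit-128g"

tokenizer = AutoTokenizer.from_pretrained(repo)

model = AutoGPTQForCausalLM.from_quantized(
    repo,
    device="cuda:0",
    use_safetensors=True,  # assumption: weights saved as .safetensors
    use_triton=False,
)

prompt = "Explain what 4-bit GPTQ quantization does in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```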
