This is the 8-bit GPTQ-quantized version of NousResearch/Hermes-3-Llama-3.1-8B, produced by following the quantization example from the AutoGPTQ repository.
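
Below is a minimal sketch of how such an 8-bit GPTQ checkpoint can be produced with AutoGPTQ, following the library's basic-usage pattern (`BaseQuantizeConfig`, `quantize`, `save_quantized`). The group size, calibration text, and output directory are illustrative assumptions, not the exact settings used for this model.

```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig

base_model_id = "NousResearch/Hermes-3-Llama-3.1-8B"
quantized_dir = "Hermes-3-Llama-3.1-8B-Q8-GPTQ"  # assumed output directory

tokenizer = AutoTokenizer.from_pretrained(base_model_id)

# 8-bit GPTQ configuration; group_size and desc_act are assumed defaults
quantize_config = BaseQuantizeConfig(
    bits=8,
    group_size=128,
    desc_act=False,
)

# Load the full-precision model with the quantization config attached
model = AutoGPTQForCausalLM.from_pretrained(base_model_id, quantize_config)

# Calibration examples (tokenized text); a real run would use a larger set
examples = [
    tokenizer("auto-gptq is an easy-to-use model quantization library based on the GPTQ algorithm.")
]

# Run GPTQ calibration and write the quantized weights
model.quantize(examples)
model.save_quantized(quantized_dir, use_safetensors=True)
tokenizer.save_pretrained(quantized_dir)
```

The saved checkpoint can then be loaded for inference with `AutoGPTQForCausalLM.from_quantized(quantized_dir, device="cuda:0")`.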
