ELYZA-japanese-Llama-2-7b-instruct-GPTQ-4bit-64g
GPTQ-quantized version of "elyza/ELYZA-japanese-Llama-2-7b-instruct": 4-bit, group size 64, desc_act=True
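
Below is a minimal loading and inference sketch using the AutoGPTQ library. The repository id, safetensors flag, and prompt format are assumptions not confirmed by this card; replace them with the actual values for this checkpoint.

```python
# Sketch only: assumes auto-gptq and transformers are installed and a CUDA GPU is available.
import torch
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

# Placeholder repository id for this quantized checkpoint (assumption).
repo_id = "your-namespace/ELYZA-japanese-Llama-2-7b-instruct-GPTQ-4bit-64g"

# The tokenizer of the base model should be compatible with the quantized weights.
tokenizer = AutoTokenizer.from_pretrained("elyza/ELYZA-japanese-Llama-2-7b-instruct")

model = AutoGPTQForCausalLM.from_quantized(
    repo_id,
    use_safetensors=True,  # assumption: weights are stored as safetensors
    device="cuda:0",
)

# Llama-2 style instruction prompt (assumed format for the instruct model).
prompt = "[INST] 日本の首都はどこですか？ [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```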