fastllm model for Qwen-7B-Chat-int4
GitHub repository: https://github.com/ztxz16/fastllm
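
A minimal sketch of loading a converted fastllm model with the library's Python bindings (fastllm_pytools), as documented in the repository above. The filename qwen-7b-chat-int4.flm is a placeholder; substitute the actual .flm file provided here.

```python
# Minimal sketch, assuming the fastllm_pytools bindings from the fastllm repo are installed
# and that this repository ships an int4 .flm file (filename below is a placeholder).
from fastllm_pytools import llm

# Load the int4-quantized fastllm model from disk
model = llm.model("qwen-7b-chat-int4.flm")

# Single-turn generation
print(model.response("Hello, please introduce yourself."))
```

The .flm format already contains the quantized weights and tokenizer, so no separate Hugging Face config is needed at inference time.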