FP8 activation quantization performed with llm-compressor
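Below is a minimal sketch of how an FP8 dynamic-activation quantization run with llm-compressor typically looks. The base model ID, output directory, and exact recipe options are placeholders and assumptions, not the settings used for this checkpoint; import paths can also vary slightly between llm-compressor versions.

```python
# Hedged sketch: FP8 weight + dynamic per-token FP8 activation quantization
# using llm-compressor's oneshot flow. MODEL_ID and SAVE_DIR are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

from llmcompressor import oneshot
from llmcompressor.modifiers.quantization import QuantizationModifier

MODEL_ID = "<base-model-id>"          # placeholder: original unquantized model
SAVE_DIR = "<output-dir>-FP8-Dynamic"  # placeholder: where to write the quantized model

model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# FP8 weights plus dynamic per-token FP8 activations on all Linear layers;
# the lm_head is commonly left unquantized.
recipe = QuantizationModifier(
    targets="Linear",
    scheme="FP8_DYNAMIC",
    ignore=["lm_head"],
)

# Dynamic activation scales are computed at runtime, so no calibration
# dataset is required for this scheme.
oneshot(model=model, recipe=recipe)

model.save_pretrained(SAVE_DIR, save_compressed=True)
tokenizer.save_pretrained(SAVE_DIR)
```

The saved checkpoint stores compressed FP8 tensors in the compressed-tensors format, which inference engines such as vLLM can load directly.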