CTranslate2 8-bit (int8) quantization of the original model.
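The exact conversion settings used for this repository are not stated; as a minimal sketch, an int8 CTranslate2 model can be produced from the original Hugging Face checkpoint with the TransformersConverter API (the output directory name below is an assumption, and Qwen2-based models require a recent CTranslate2 release):

```python
from ctranslate2.converters import TransformersConverter

# Convert the original Hugging Face checkpoint to the CTranslate2 format
# with 8-bit integer weight quantization.
converter = TransformersConverter("deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B")
converter.convert(
    "DeepSeek-R1-Distill-Qwen-1.5B-ct2-int8",  # assumed output directory name
    quantization="int8",
)
```

The same conversion can also be run from the command line with the `ct2-transformers-converter` script that ships with CTranslate2.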
Quantized model: ctranslate2-4you/DeepSeek-R1-Distill-Qwen-1.5B-ct2-int8
Base model: deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B
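A minimal inference sketch, assuming the quantized files are downloaded locally and the tokenizer is taken from the base model (the prompt, sampling values, and device choice are illustrative only):

```python
import ctranslate2
from huggingface_hub import snapshot_download
from transformers import AutoTokenizer

# ctranslate2.Generator expects a local directory, so fetch the repo first.
model_dir = snapshot_download("ctranslate2-4you/DeepSeek-R1-Distill-Qwen-1.5B-ct2-int8")

# The tokenizer comes from the original base model.
tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B")

# Load the int8 weights; use device="cuda" if a GPU is available.
generator = ctranslate2.Generator(model_dir, device="cpu")

prompt = "Explain 8-bit quantization in one sentence."
# Apply the chat template, then convert token ids to the string tokens CTranslate2 expects.
input_ids = tokenizer.apply_chat_template(
    [{"role": "user", "content": prompt}], add_generation_prompt=True
)
tokens = tokenizer.convert_ids_to_tokens(input_ids)

results = generator.generate_batch(
    [tokens],
    max_length=256,
    sampling_topk=20,
    sampling_temperature=0.6,
    include_prompt_in_result=False,
)
print(tokenizer.decode(results[0].sequences_ids[0]))
```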