EXL2 quants for c4ai-command-r7b-12-2024
Automatically quantized using the auto-quant script from hf-scripts.
BPW (bits per weight):
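EXL2 quants like these are usually published on separate branches of the repo, named after their bits per weight. As a minimal sketch, one such branch could be fetched with `huggingface_hub`; the branch name `4.0bpw` below is a placeholder and should be replaced with one of the BPW branches actually listed above:

```python
# Minimal sketch: download one EXL2 quant branch of this repo.
# The revision "4.0bpw" is a hypothetical branch name — substitute a
# BPW branch that is actually published for this model.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="Anthonyg5005/c4ai-command-r7b-12-2024-exl2",
    revision="4.0bpw",  # placeholder branch name
    local_dir="c4ai-command-r7b-12-2024-exl2-4.0bpw",
)
print(f"Quant downloaded to: {local_path}")
```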
Base model: CohereLabs/c4ai-command-r7b-12-2024