Can you provide a Command A 2025 GPTQ? (#9, opened 26 days ago by MRU4913)
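For the request above, here is a minimal sketch of how a GPTQ quant of Command A could be produced with the transformers GPTQ integration (it requires optimum plus a GPTQ backend such as gptqmodel or auto-gptq). The base model id, calibration dataset, and group size below are assumptions for illustration, not the settings used for this repository:

```python
# Sketch: quantizing a base checkpoint to 4-bit GPTQ with transformers.
# The model id, dataset, and group_size are assumptions for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer, GPTQConfig

base_id = "CohereLabs/c4ai-command-a-03-2025"  # assumed base model id
tokenizer = AutoTokenizer.from_pretrained(base_id)

gptq_config = GPTQConfig(
    bits=4,            # 4-bit weights
    group_size=128,    # a common GPTQ group size
    dataset="c4",      # calibration data
    tokenizer=tokenizer,
)

# Quantization happens during loading and needs enough GPU memory
# to run calibration on a very large model.
model = AutoModelForCausalLM.from_pretrained(
    base_id,
    device_map="auto",
    quantization_config=gptq_config,
)

model.save_pretrained("command-a-2025-gptq-4bit")
tokenizer.save_pretrained("command-a-2025-gptq-4bit")
```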
Please provide an ONNX model along with an inference script. (#7, opened 5 months ago by SantoshHF)
VRAM requirement (#5, 3 comments, opened about 1 year ago by jithinmukundan)
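A back-of-the-envelope answer to the VRAM question: weight memory is roughly parameters times bits per weight divided by 8, plus runtime overhead for activations and KV cache. The sketch below assumes the 104B-parameter Command R+ discussed in thread #1 and a loose 1.2x overhead factor; it is an estimate, not a measurement:

```python
# Rough VRAM estimate: weights = n_params * bits / 8 bytes, then add overhead
# for activations, KV cache, and framework buffers (1.2x is a loose assumption).
def estimate_vram_gb(n_params: float, bits: int, overhead: float = 1.2) -> float:
    weight_bytes = n_params * bits / 8
    return weight_bytes * overhead / 1e9

N_PARAMS = 104e9  # assumed: Command R+ parameter count

print(f"4-bit GPTQ: ~{estimate_vram_gb(N_PARAMS, 4):.0f} GB")   # ~62 GB
print(f"fp16:       ~{estimate_vram_gb(N_PARAMS, 16):.0f} GB")  # ~250 GB
```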
Can vLLM be used for loading? (#4, 6 comments, opened about 1 year ago by wawoshashi)
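vLLM does support GPTQ checkpoints, so a minimal loading sketch for the question above looks like the following. The repo id and tensor_parallel_size are placeholders, and whether it works out of the box depends on your vLLM version supporting the model architecture:

```python
# Sketch: loading a GPTQ-quantized checkpoint with vLLM.
# The repo id below is a placeholder for this repository's actual id.
from vllm import LLM, SamplingParams

llm = LLM(
    model="your-org/c4ai-command-r-plus-GPTQ",  # placeholder repo id
    quantization="gptq",      # weights are GPTQ-quantized
    tensor_parallel_size=2,   # split across GPUs if one card is too small
)

params = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["Explain GPTQ in one sentence."], params)
print(outputs[0].outputs[0].text)
```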
How many bits and what is the groupsize? (#3, 1 comment, opened about 1 year ago by vitvit)
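The bits and groupsize (and, for the next question, a hint at the quantization library) can usually be read from the quantization config shipped with the weights. A sketch, assuming this repo's config.json carries a quantization_config block as transformers/AutoGPTQ exports typically do; the repo id is a placeholder:

```python
# Sketch: read the GPTQ settings published alongside the checkpoint.
# Assumes config.json contains a "quantization_config" block; repo id is a placeholder.
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download("your-org/c4ai-command-r-plus-GPTQ", "config.json")
with open(path) as f:
    cfg = json.load(f)

qcfg = cfg.get("quantization_config", {})
print("bits:        ", qcfg.get("bits"))
print("group size:  ", qcfg.get("group_size"))
print("quant method:", qcfg.get("quant_method"))  # hints at the library used
```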
What library was used to quantize the model? (#2, 1 comment, opened about 1 year ago by KirillR)
How to load Command R+ in text-generation-webui? (#1, 5 comments, opened about 1 year ago by MLDataScientist)