This model was converted to the OpenVINO IR format using the following command:

```shell
optimum-cli export openvino -m "{path}Dolphin3.0-R1-Mistral-24B" --task text-generation-with-past --weight-format int4 --ratio 1 --group-size 128 --dataset wikitext2 --disable-stateful --all-layers --awq --scale-estimation "{path}Dolphin3.0-R1-Mistral-24B-int4_asym-awq-se-ns-ov"
```
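The `--weight-format int4 --group-size 128` flags (together with the `asym` tag in the output name) correspond to asymmetric INT4 group quantization: each group of 128 weights shares one scale and one zero point, and values are mapped into the unsigned 4-bit range [0, 15]. The sketch below is purely illustrative of that idea; it is not NNCF's or OpenVINO's actual implementation, and the function names are hypothetical.

```python
# Illustrative sketch of asymmetric INT4 group quantization, as implied by
# --weight-format int4 --group-size 128. NOT the real NNCF/OpenVINO code.

def quantize_group_int4_asym(weights):
    """Quantize one group of float weights to unsigned 4-bit [0, 15]."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 15 or 1.0          # guard constant groups
    zero_point = round(-lo / scale)        # shared per-group zero point
    q = [max(0, min(15, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point

def dequantize_group(q, scale, zero_point):
    """Recover approximate float weights from the 4-bit codes."""
    return [(v - zero_point) * scale for v in q]
```

With `--ratio 1` and `--all-layers`, this 4-bit scheme is applied to all weight layers rather than keeping a fraction in a higher-precision format; AWQ and scale estimation then tune the scales against the `wikitext2` calibration set to reduce the quantization error.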