This is an FP8 conversion of the transformer found in alibaba-pai/EasyAnimateV5.1-12b-zh.
As this is a quantized model rather than a fine-tune, all of the original license terms and restrictions still apply.
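
Below is a minimal sketch of how such an FP8 weight conversion can be performed, assuming the original transformer checkpoint is a single safetensors file and that your PyTorch build supports `torch.float8_e4m3fn` (PyTorch 2.1+). The file paths are illustrative, not the actual layout of this repository.

```python
import torch
from safetensors.torch import load_file, save_file

# Load the original (e.g. bf16/fp32) transformer weights.
state_dict = load_file("transformer/diffusion_pytorch_model.safetensors")

fp8_state_dict = {}
for name, tensor in state_dict.items():
    # Quantize only floating-point tensors; leave integer buffers untouched.
    if tensor.is_floating_point():
        fp8_state_dict[name] = tensor.to(torch.float8_e4m3fn)
    else:
        fp8_state_dict[name] = tensor

# Write the FP8 checkpoint alongside the original.
save_file(fp8_state_dict, "transformer/diffusion_pytorch_model-fp8.safetensors")
```

At inference time the FP8 tensors are typically upcast back to bf16/fp16 as they are used, so the saving is in storage and memory footprint rather than compute precision.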
Base model: alibaba-pai/EasyAnimateV5.1-12b-zh