safe_serialization

#105
by cuongdk253 - opened

When I finished fine-tuning the model, I wanted to save the full model. I tried using the safe_serialization attribute, but it raised an error saying the model has been quantized and is therefore not serializable, and advising me to check the logger for more details on why serialization fails.

Is there a way to save the full model despite this issue?
(Screenshot of the error attached: IMG_2127.jpeg)
