Quantization support?

#2
by Corianas - opened

Hi, do you know whether this would also support the `model_q8f16.onnx` file in the same directory the model is already loading from? I'd like to check whether the quantized variant enhances things even further.
