onnx export

#37
by 25pwn - opened

It seems there is no easy way to export this model to ONNX. Even with some effort I could not get optimum or sentence-transformers to do it.

Qwen/Qwen2.5-VL-3B-Instruct is supported by transformers, so I suspect the problem is the custom model implementation here. Could you share instructions for exporting to ONNX?

Jina AI org

Hi @25pwn, you're right, ONNX is not yet supported because of the custom implementation. We don't have an exact timeline, but we will most likely implement an ONNX-compatible version soon, and I'll let you know when it's done.
