Model Export to ONNX format
#32
by
Desjajja
- opened
Is there any pipeline to export this model to ONNX (e.g. via torch.onnx or optimum)? I intend to do further inference acceleration, and an ONNX file is a must. However, none of the frameworks above support baichuan yet.
I have no idea. Is there any framework that supports llama?