ONNX format of the voxreality/nllb-asr-synthetic-robust model

Model inference example:

```python
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForSeq2SeqLM

# Load the ONNX export and its tokenizer from the Hugging Face Hub
model_path = "voxreality/nllb-asr-synthetic-robust-onnx"
model = ORTModelForSeq2SeqLM.from_pretrained(model_path, use_cache=False)
tokenizer = AutoTokenizer.from_pretrained(model_path)

# NLLB uses FLORES-200 language codes
src_lang = 'eng_Latn'
tgt_lang = 'deu_Latn'

input_text = "This is a good day"

# Set the source language before tokenizing the input
tokenizer.src_lang = src_lang
inputs = tokenizer(input_text, return_tensors='pt')

# Force the first generated token to be the target-language code.
# convert_tokens_to_ids works across transformers versions
# (tokenizer.lang_code_to_id was removed in recent releases).
model_output = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids(tgt_lang),
)
output_text = tokenizer.batch_decode(model_output, skip_special_tokens=True)[0]

print(output_text)
```
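
The same loaded model and tokenizer can also translate several sentences in one call. The sketch below is a minimal illustration rather than part of the original card; the `translate_batch` helper name, the padding setting, and the `max_length` value are assumptions.

```python
def translate_batch(texts, src_lang, tgt_lang, max_length=128):
    """Translate a list of sentences with the model/tokenizer loaded above."""
    # Assumes `model` and `tokenizer` from the example above are in scope.
    tokenizer.src_lang = src_lang
    # Pad to the longest sentence so the batch forms a uniform tensor
    inputs = tokenizer(texts, return_tensors='pt', padding=True)
    outputs = model.generate(
        **inputs,
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(tgt_lang),
        max_length=max_length,
    )
    return tokenizer.batch_decode(outputs, skip_special_tokens=True)

print(translate_batch(["This is a good day", "How are you?"], 'eng_Latn', 'deu_Latn'))
```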