Yoruba Roman Numerals Expert System πŸ‡³πŸ‡¬πŸ”’

This model is fine-tuned from google/byt5-small to translate Roman numerals (e.g. 'i', 'v', 'x', 'XIV', 'VΜ…M'; values up to 6000) into Yoruba text.

Example Usage

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Emeritus-21/yorubanumerals-expertsystem")
model = AutoModelForSeq2SeqLM.from_pretrained("Emeritus-21/yorubanumerals-expertsystem")

inputs = tokenizer("VΜ…M", return_tensors="pt")
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
Model size: 582M params Β· Tensor type: F32 Β· Format: Safetensors

Base model: google/byt5-small