
# English-Yoruba STEM Translation Model

This model is trained to translate English STEM content to Yoruba.

## Model Details

- **Architecture:** Transformer-based sequence-to-sequence model
- **Base Model:** Davlan/mt5-base-en-yor-mt
- **Parameters:** 582M (F32 weights)
- **Training Data:** YorubaSTEM1.0
- **Performance:** BLEU 36.08
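BLEU scores a translation by the overlap of its n-grams with a reference translation. The sketch below is a minimal, illustrative sentence-level BLEU in pure Python; it is not the exact evaluation setup behind the 36.08 figure (corpus-level scoring with a tool such as sacrebleu is standard, and tokenization choices affect the number).

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(hypothesis, reference, max_n=4):
    """Sentence-level BLEU: geometric mean of clipped n-gram
    precisions (n = 1..max_n) times a brevity penalty, scaled to 0-100."""
    hyp, ref = hypothesis.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        hyp_ngrams = ngrams(hyp, n)
        ref_ngrams = ngrams(ref, n)
        # Counter intersection clips each n-gram count at the reference count
        overlap = sum((hyp_ngrams & ref_ngrams).values())
        total = max(sum(hyp_ngrams.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty: penalize hypotheses shorter than the reference
    bp = 1.0 if len(hyp) > len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return 100 * bp * geo_mean

print(bleu("the cat sat on the mat", "the cat sat on the mat"))  # → 100.0
```

A perfect match scores 100; a score in the mid-30s, as here, typically corresponds to understandable translations that still require human review.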

## Usage

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("gbelewade/YorubaSTEMt5")
model = AutoModelForSeq2SeqLM.from_pretrained("gbelewade/YorubaSTEMt5")

# Translate English text to Yoruba
english_text = "The chemical formula for water is H2O."
inputs = tokenizer(english_text, return_tensors="pt")
outputs = model.generate(**inputs)
yoruba_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(yoruba_text)
```
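To translate many sentences, it is usually more efficient to pass lists of strings to the tokenizer (with `padding=True`) than to loop one sentence at a time. Below is a minimal pure-Python batching helper; the batch size of 8 and the variable `english_sentences` are illustrative assumptions, not part of the model card.

```python
def batched(items, batch_size=8):
    """Yield successive fixed-size chunks of a list."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

# Hypothetical usage with the tokenizer/model loaded as above:
# for chunk in batched(english_sentences):
#     inputs = tokenizer(chunk, return_tensors="pt", padding=True)
#     outputs = model.generate(**inputs)
#     print(tokenizer.batch_decode(outputs, skip_special_tokens=True))

print(list(batched(["s1", "s2", "s3"], batch_size=2)))  # → [['s1', 's2'], ['s3']]
```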

## Limitations

- The model is trained on STEM-domain data (YorubaSTEM1.0); performance on general-domain or conversational text has not been evaluated here.
- At BLEU 36.08, translations can contain errors and should be reviewed by a Yoruba speaker before use in instructional materials.

## Citation
