# M2M100 418M Custom
This is a custom-hosted version of Facebook's m2m100_418M
multilingual translation model, deployed for reliable API-based inference.
## Model Summary

- Base model: `facebook/m2m100_418M`
- Supports 100 languages
- Many-to-many multilingual translation
- Hosted with a custom `handler.py` for reliable inference endpoint support (see the sketch below)
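The repository's actual `handler.py` is not reproduced here. As a rough illustration only, a custom handler for Hugging Face Inference Endpoints typically follows the `EndpointHandler` convention sketched below; the payload keys `inputs`, `src_lang`, and `tgt_lang` are assumptions, not necessarily what this model's handler expects.

```python
from typing import Any, Dict, List

from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer


class EndpointHandler:
    def __init__(self, path: str = ""):
        # Load the model and tokenizer once when the endpoint starts.
        self.model = M2M100ForConditionalGeneration.from_pretrained(path)
        self.tokenizer = M2M100Tokenizer.from_pretrained(path)

    def __call__(self, data: Dict[str, Any]) -> List[Dict[str, str]]:
        # Assumed payload shape: {"inputs": "...", "src_lang": "en", "tgt_lang": "fr"}
        text = data["inputs"]
        src_lang = data.get("src_lang", "en")
        tgt_lang = data.get("tgt_lang", "fr")

        # Standard M2M100 pattern: set the source language on the tokenizer,
        # then force the target language token at generation time.
        self.tokenizer.src_lang = src_lang
        encoded = self.tokenizer(text, return_tensors="pt")
        generated = self.model.generate(
            **encoded,
            forced_bos_token_id=self.tokenizer.get_lang_id(tgt_lang),
        )
        translation = self.tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
        return [{"translation_text": translation}]
```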
## Usage

You can use this model through a Hugging Face Inference Endpoint or directly with the `transformers` library:
```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("Raahulthakur/m2m100_418M_custom")
tokenizer = M2M100Tokenizer.from_pretrained("Raahulthakur/m2m100_418M_custom")
```
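For translation, a minimal sketch follows the standard M2M100 workflow: set the tokenizer's source language, then force the target language token during generation. The example sentence and the English-to-French language pair are illustrative, and the snippet reuses `model` and `tokenizer` from the block above.

```python
# Translate English to French with the standard M2M100 generation pattern.
tokenizer.src_lang = "en"
encoded = tokenizer("Life is like a box of chocolates.", return_tensors="pt")
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.get_lang_id("fr"),
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```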