Lightweight Spanish-to-English translation model
TL;DR: We’re releasing a small, high-quality Spanish-to-English translation model that runs locally with low latency and strong fluency scores. The model is available on HuggingFace and Minibase.ai for fine-tuning and API calls, and you can try it immediately, entirely for free, on the minibase.ai website.
Most translation systems are large and cloud-based: accurate, but slow, and unable to run locally. Today we're releasing a lightweight (~386 MB) Spanish-to-English translation model that runs offline, on your own devices, and performs nearly as well as many larger commercial systems. Average latency is just 111 ms, so translations arrive nearly instantaneously.
We trained this model in less than an hour, with zero code, using Minibase.
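If the checkpoint is published in a standard Hugging Face seq2seq format, running it locally can be as simple as the sketch below. This is a minimal illustration, not official usage: the repo id `minibase/spanish-to-english` is a placeholder we made up, so substitute the actual id from the HuggingFace model page.

```python
from transformers import pipeline

# Placeholder repo id -- replace with the real checkpoint name from HuggingFace.
MODEL_ID = "minibase/spanish-to-english"

# Load the model for CPU inference (device=-1); assumes a standard
# Hugging Face seq2seq translation checkpoint.
translator = pipeline("translation", model=MODEL_ID, device=-1)

spanish = "La inteligencia artificial está revolucionando el mundo de la tecnología."
result = translator(spanish, max_length=128)
print(result[0]["translation_text"])
# Expected, per the examples below:
# "Artificial intelligence is revolutionizing the world of technology."
```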
We measured translation quality using METEOR, chrF, and Semantic Similarity; a sketch of how scores on these scales can be computed follows the list.
- METEOR: 79.7 — measures word-level similarity, considering synonyms and order.
- chrF: 72.7 — measures accuracy at the character level, good for Spanish morphology.
- Semantic Similarity: 70.9 — checks that meaning is preserved.
- Latency and size: 111 ms average per translation, with a 386 MB footprint.
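For reference, here is a minimal sketch of how metrics on these scales can be reproduced from model outputs and reference translations. The library choices (`evaluate`, `sentence-transformers`) and the embedding model are our assumptions for illustration, not necessarily the exact tooling used to produce the numbers above.

```python
import evaluate
from sentence_transformers import SentenceTransformer, util

# Model outputs and human reference translations (toy one-sentence example).
predictions = ["Artificial intelligence is revolutionizing the world of technology."]
references = ["Artificial intelligence is revolutionizing the world of technology."]

# METEOR: word-level matching with synonyms and stems; the library returns
# a 0-1 score, scaled here to 0-100 to match the numbers above.
meteor = evaluate.load("meteor")
meteor_score = meteor.compute(predictions=predictions, references=references)["meteor"] * 100

# chrF: character n-gram F-score, already reported on a 0-100 scale.
chrf = evaluate.load("chrf")
chrf_score = chrf.compute(predictions=predictions, references=references)["score"]

# Semantic similarity: cosine similarity of sentence embeddings
# (the embedding model here is an arbitrary choice).
embedder = SentenceTransformer("all-MiniLM-L6-v2")
pred_emb = embedder.encode(predictions, convert_to_tensor=True)
ref_emb = embedder.encode(references, convert_to_tensor=True)
semantic = util.cos_sim(pred_emb, ref_emb).diagonal().mean().item() * 100

print(f"METEOR {meteor_score:.1f} | chrF {chrf_score:.1f} | semantic {semantic:.1f}")
```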
Examples
Input (Spanish):
La inteligencia artificial está revolucionando el mundo de la tecnología.
Cada día vemos avances increíbles en el procesamiento del lenguaje natural.
Output (English):
Artificial intelligence is revolutionizing the world of technology.
Every day we see incredible advances in natural language processing.
Input (Spanish):
El gobierno anunció nuevas medidas económicas para enfrentar la inflación.
Se espera que los precios comiencen a estabilizarse en los próximos meses.
Output (English):
The government announced new economic measures to address inflation.
Prices are expected to begin stabilizing in the coming months.
Input (Spanish):
Messi fue nombrado mejor jugador del torneo después de marcar tres goles en la final.
Output (English):
Messi was named the best player of the tournament after scoring three goals in the final.
Like all Minibase models, we're releasing this one under Apache 2.0. You can download it, fine-tune it, or deploy it directly from Minibase Cloud. To share results or feedback, join the Minibase Discord.