
Model Card for NeuralTranslate

This model uses the Gemma 3 chat template.

This is the first official release of NeuralTranslate 27b, a Náhuatl-to-Spanish machine translation model (nah-es). The base model is Gemma 3 27b Instruct, fine-tuned on the Axolotl Spanish-Nahuatl dataset for 4 epochs.

You can donate towards this project at my ko-fi! https://ko-fi.com/irvingernesto

Model Details

Model Description

  • Developed by: Irving Ernesto
  • Funded by: Irving Ernesto
  • Model type: Large Language Model
  • Language(s) (NLP): Spanish & Náhuatl
  • License: MIT
  • Finetuned from model: Gemma 3 27b Instruct

Uses

Direct Use

Machine translation from Náhuatl to Spanish.

Out-of-Scope Use

Translating between any other pair of languages. For example, trying to use the model to translate Náhuatl to English won't work. Even using the model to translate from Spanish to Náhuatl is not reliable.

Recommendations

Use the recommended Gemma 3 sampling settings for inference: temperature = 1.0, top_p = 0.95, top_k = 64.
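As a minimal sketch, these settings map directly onto Hugging Face `generate` keyword arguments; the `gen_kwargs` name is illustrative, not part of this card.

```python
# Recommended Gemma 3 sampling settings from this card.
# In practice these would be passed to model.generate(**gen_kwargs)
# after loading the checkpoint with the transformers library.
gen_kwargs = {
    "do_sample": True,   # sampling must be enabled for temperature/top_p/top_k to apply
    "temperature": 1.0,
    "top_p": 0.95,
    "top_k": 64,
}
```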

How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]
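Since the card does not yet include an official snippet, here is a hedged sketch. It only builds a single-turn prompt in Gemma 3's standard chat format (the `<start_of_turn>`/`<end_of_turn>` markers); loading the 27B checkpoint itself (e.g. `Thermostatic/neuraltranslate-27b-mt-nah-es-v1` via transformers) is left as comments, and the instruction wording is an assumption, not the exact prompt the model was trained on.

```python
# Minimal sketch: build a Gemma 3 chat prompt by hand.
# In practice you would load "Thermostatic/neuraltranslate-27b-mt-nah-es-v1"
# with transformers and call tokenizer.apply_chat_template(...), which
# produces this same format (and the tokenizer prepends <bos> automatically).

def build_gemma3_prompt(user_message: str) -> str:
    """Wrap a single user turn in Gemma 3's chat template."""
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

# The instruction wording below is illustrative only.
source_text = "..."  # the sentence to translate goes here
prompt = build_gemma3_prompt(f"Translate the following text:\n{source_text}")
print(prompt)
```

The model's reply would then be generated from `prompt` with the sampling settings listed in the Recommendations section.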

  • Model size: 27.4B params (Safetensors)
  • Tensor type: BF16

Model: Thermostatic/neuraltranslate-27b-mt-nah-es-v1
Quantizations: 2 models

Dataset used to train Thermostatic/neuraltranslate-27b-mt-nah-es-v1: Axolotl Spanish-Nahuatl