
fine-tuned-t5-small-turkish-mmlu

The fine-tuned T5-Small model is a question-answering model trained on the Turkish MMLU dataset, which consists of questions from academic and professional exams in Turkey, including KPSS and TUS. The model takes a Turkish question as input and generates an answer as free text. It is designed for Turkish-language question answering, using the text-to-text formulation of the T5 architecture.
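A minimal usage sketch with the Hugging Face transformers library is shown below; the plain-question prompt format is an assumption and may need to match the exact input formatting used during fine-tuning.

```python
# Minimal inference sketch. Assumes the standard transformers seq2seq API;
# the plain-question prompt format is an assumption and may need to match
# the formatting used during fine-tuning.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "cuneytkaya/fine-tuned-t5-small-turkish-mmlu"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

question = "Türkiye Cumhuriyeti'nin ilk cumhurbaşkanı kimdir?"  # example Turkish question
inputs = tokenizer(question, return_tensors="pt", truncation=True, max_length=512)
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```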

Training Data

@dataset{bayram_2024_13378019,
  author    = {Bayram, M. Ali},
  title     = {{Turkish MMLU: Yapay Zeka ve Akademik Uygulamalar İçin En Kapsamlı ve Özgün Türkçe Veri Seti}},
  month     = aug,
  year      = 2024,
  publisher = {Zenodo},
  version   = {v1.2},
  doi       = {10.5281/zenodo.13378019},
  url       = {https://doi.org/10.5281/zenodo.13378019}
}
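As a rough illustration of the text-to-text setup used for fine-tuning, the sketch below tokenizes one question–answer pair. The field names "question" and "answer" are assumptions and may differ from the actual column names in the Zenodo release.

```python
# Illustrative preprocessing sketch only. The field names "question" and
# "answer" are assumptions; check the actual schema of the Zenodo release.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-small")

def to_t5_features(record, max_input_len=512, max_target_len=32):
    # Source text: the Turkish exam question; target text: the answer.
    model_inputs = tokenizer(
        record["question"], max_length=max_input_len, truncation=True
    )
    labels = tokenizer(
        text_target=record["answer"], max_length=max_target_len, truncation=True
    )
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

example = {"question": "Türkiye'nin başkenti neresidir?", "answer": "Ankara"}
features = to_t5_features(example)
```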

Training Hyperparameters

learning_rate=5e-5
per_device_train_batch_size=8
per_device_eval_batch_size=8
num_train_epochs=3
weight_decay=0.01 
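For reference, the hyperparameters listed above can be expressed as Hugging Face Seq2SeqTrainingArguments; the output_dir below is an illustrative placeholder, not a value from the original run.

```python
# The listed hyperparameters expressed as Seq2SeqTrainingArguments.
# output_dir is an illustrative placeholder, not from the original run.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="fine-tuned-t5-small-turkish-mmlu",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=3,
    weight_decay=0.01,
)
```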

Training Results

Training loss curve (plot not reproduced here).

Metrics

Training loss was monitored during fine-tuning to track how well the model fits the training data and to watch for signs of overfitting. After 3 epochs, the model reached a training loss of 0.0749. Note that training loss measures fit to the training set; generalization should be checked on a held-out evaluation set.
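Since training loss alone says little about held-out performance, a simple exact-match check on a validation split is one way to probe generalization; the helper below is a hypothetical illustration, not a metric reported for this model.

```python
# Hypothetical exact-match helper for a held-out split; not a metric
# reported for this model.
def exact_match(predictions, references):
    pairs = list(zip(predictions, references))
    hits = sum(p.strip().lower() == r.strip().lower() for p, r in pairs)
    return hits / len(pairs) if pairs else 0.0

# Example: exact_match(["Ankara"], ["ankara"]) -> 1.0
```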

Model Details

Base model: google-t5/t5-small
Model size: 60.5M parameters
Tensor type: F32 (Safetensors)