Model Description

This model was originally created for a Kaggle competition, and the model is also available on Kaggle.

Trained on a Turkish dataset of ~80k examples for 3 epochs. Training took around 19 hours on 2x RTX 4090 GPUs.

You can use the model with PEFT, with Transformers, or on Kaggle.

Important Notes

  • Use the model with a CUDA-capable GPU, since it was fine-tuned with bitsandbytes quantization.

Fine-tuned by emre570.

Model tree for emre570/gemma-2-2b-tr-3epoch

Base model: google/gemma-2-2b (this model is a PEFT adapter on top of it)