---
base_model: google/gemma-2-2b
library_name: peft
license: apache-2.0
datasets:
- cenfis/alpaca-turkish-combined
language:
- en
- tr
---
## Model Description
This model was originally created for Kaggle's competition here.
You can also view the model on Kaggle here.
Trained on a Turkish dataset of roughly 80k samples for 3 epochs; training took around 19 hours on 2x RTX 4090 GPUs.
You can use the model with PEFT and Transformers, or directly from Kaggle.
## Important Notes
- Use the model on a CUDA-capable GPU, since it was fine-tuned with bitsandbytes quantization.
Fine-tuned by emre570.