---
base_model: google/gemma-2-2b
library_name: peft
license: apache-2.0
datasets:
- cenfis/alpaca-turkish-combined
language:
- en
- tr
---

### Model Description

This model was originally created for Kaggle's competition [here](https://www.kaggle.com/competitions/gemma-language-tuning). You can also see the model on Kaggle [here](https://www.kaggle.com/models/cemalemrealbayrak/gemma-2-2b-tr/Transformers/3epoch/1).

It was trained on a ~80k-sample Turkish dataset for 3 epochs, which took around 19 hours on 2x RTX 4090 GPUs. You can use the model with PEFT, with Transformers, or from Kaggle.

### **Important Notes**

- Use the model with a CUDA-supported GPU, since it was fine-tuned with bitsandbytes.

Fine-tuned by [emre570](https://linktr.ee/emre570).
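Since this is a PEFT adapter on top of `google/gemma-2-2b`, loading it with Transformers follows the usual base-model-plus-adapter pattern. The sketch below is a minimal, hedged example: the adapter repo id is a placeholder (the actual Hugging Face repo id is not stated here), and it assumes a CUDA GPU is available, per the note above.

```python
# Minimal sketch: load the Gemma 2 2B base model and attach this PEFT adapter.
# NOTE: "your-username/gemma-2-2b-tr" is a placeholder, not a confirmed repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "google/gemma-2-2b"
adapter_id = "your-username/gemma-2-2b-tr"  # assumption: replace with the real adapter repo

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires a CUDA-supported GPU, as noted above
)
model = PeftModel.from_pretrained(base_model, adapter_id)

# Simple Turkish prompt to try the fine-tuned model.
prompt = "Türkiye'nin başkenti neresidir?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Alternatively, `model = model.merge_and_unload()` can fold the adapter weights into the base model for adapter-free inference.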