Compute sponsored by Arrow Denmark and Nvidia

  • Developed by: ThatsGroes
  • License: apache-2.0
  • Finetuned from model: unsloth/gemma-2-27b-it

This Gemma 2 model was trained 2x faster with Unsloth and Hugging Face's TRL library.
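
The training script itself is not included here. Below is a minimal sketch of how a QLoRA fine-tune of unsloth/gemma-2-27b-it with Unsloth and TRL's SFTTrainer is commonly set up; the dataset path, LoRA configuration, and all hyperparameters are illustrative assumptions, not the values used for SkoleGPT.

```python
# Sketch of a typical Unsloth + TRL fine-tune of Gemma 2 27B.
# Dataset, LoRA settings, and hyperparameters below are placeholders,
# not the actual SkoleGPT training configuration.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

max_seq_length = 2048

# Load the base model listed on this card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gemma-2-27b-it",
    max_seq_length=max_seq_length,
    load_in_4bit=True,  # 4-bit base weights keep the 27B model trainable on a single GPU
)

# Attach LoRA adapters (ranks and target modules are illustrative choices).
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Placeholder dataset with a "text" column; the card does not name the training data.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        bf16=True,
        logging_steps=10,
        output_dir="outputs",
    ),
)
trainer.train()
```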

Energy use logged by codecarbon during training: 2.748063 kWh of electricity.

Model weights are published in Safetensors format:

  • Model size: 27.2B params
  • Tensor type: BF16
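
The model can be loaded locally with the standard transformers API. The snippet below is a minimal sketch assuming a GPU setup with enough memory for the 27.2B parameters in BF16; the example prompt is a placeholder.

```python
# Sketch: load and query the model with transformers (assumes sufficient GPU memory).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ThatsGroes/gemma-2-27b-it-SkoleGPT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 tensor type listed above
    device_map="auto",
)

# Placeholder prompt; Gemma 2 uses only user/assistant roles (no system role).
messages = [{"role": "user", "content": "Explain photosynthesis to a 7th grader."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```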
