---
license: mit
language:
- sr
base_model:
- datatab/YugoGPT-Florida
pipeline_tag: text2text-generation
---

- **Developed by:** datatab
- **License:** mit

## 🏆 Results
> Results obtained through the [**Serbian LLM Evaluation Benchmark**](https://huggingface.co/datasets/datatab/serbian-llm-benchmark)

| MODEL | ARC-E | ARC-C | Hellaswag | PiQA | Winogrande | BoolQ | OpenbookQA | OZ_EVAL | SCORE |
|---|---|---|---|---|---|---|---|---|---|
| YugoGPT-Florida | 0.6918 | 0.5766 | 0.4037 | 0.7374 | 0.5782 | 0.8685 | 0.5918 | 0.7407 | 64.85875 |
| Yugo55A-GPT | 0.5846 | 0.5185 | 0.3686 | 0.7076 | 0.5277 | 0.8584 | 0.5485 | 0.6883 | 60.0275 |
| Yugo60-GPT | 0.4948 | 0.4542 | 0.3342 | 0.6897 | 0.5138 | 0.8212 | 0.5155 | 0.6379 | 55.76625 |
| Yugo45-GPT | 0.4049 | 0.3900 | 0.2812 | 0.6055 | 0.4992 | 0.5793 | 0.4433 | 0.6111 | 47.68125 |



# 🏋️ Training Stats




## 💻 Usage
**Released with permission by datatab.** GGUF quantization by @MarkoRadojcic.
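The card does not include a loading example; below is a minimal sketch using the Hugging Face `transformers` library, assuming the model exposes the standard causal-LM interface. The prompt and generation parameters are illustrative, not recommendations from the model authors.

```python
# Minimal sketch: load YugoGPT-Florida and generate a Serbian completion.
# Generation settings (max_new_tokens, temperature) are illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "datatab/YugoGPT-Florida"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Koji je glavni grad Srbije?"  # "What is the capital of Serbia?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For the GGUF quantizations, the same model can instead be run with a llama.cpp-compatible runtime.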
## 💡 Contributions Welcome!
Have ideas, bug fixes, or a custom model to add? We'd love for you to be part of the journey! Contributions help grow and enhance the capabilities of **YugoGPT-Florida**.
## 📜 Citation
Thanks for using **YugoGPT-Florida** — where language learning models meet Serbian precision and creativity! Let's build smarter models together. 🚀
If you find this model useful in your research, please cite it as follows:
```bibtex
@misc{YugoGPT-Florida,
  title={YugoGPT-Florida},
  author={datatab},
  year={2024},
  url={https://huggingface.co/datatab/YugoGPT-Florida}
}
```