Emotion

Model description
Emotion Recognition Model (BERT-based)
Overview
This is a BERT-based emotion recognition model that I created purely for educational and learning purposes. The model was trained as part of my journey to understand transformers, distillation, GPU management, fine-tuning, and Hugging Face workflows.
How I built it
- I started with a pretrained BERT model.
- I experimented with layer distillation (copying a few of its layers into a smaller student model); a sketch of this step follows the list.
- I trained the student on an emotion classification dataset to predict emotional states from text.
- I focused on hands-on practice: tokenization, GPU memory issues, checkpointing, and model saving/loading.
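The layer-copying step can be pictured roughly as below. This is a minimal sketch using the standard `transformers` API; the teacher checkpoint name, number of student layers, layer indices, and label count are illustrative assumptions, not the exact values used for this model.

```python
# Minimal sketch: build a smaller "student" BERT by copying a subset of a
# pretrained teacher's encoder layers. All names and numbers are illustrative.
from transformers import BertConfig, BertForSequenceClassification

teacher = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased",  # assumed teacher checkpoint
    num_labels=6,         # assumed number of emotion classes
)

# Student: same hidden size and vocabulary, but fewer transformer layers.
student_config = BertConfig.from_pretrained("bert-base-uncased", num_labels=6)
student_config.num_hidden_layers = 4
student = BertForSequenceClassification(student_config)

# Copy the embeddings and a few evenly spaced teacher layers into the student.
student.bert.embeddings.load_state_dict(teacher.bert.embeddings.state_dict())
for student_idx, teacher_idx in enumerate([0, 4, 8, 11]):
    student.bert.encoder.layer[student_idx].load_state_dict(
        teacher.bert.encoder.layer[teacher_idx].state_dict()
    )
```

From there the student is fine-tuned on the emotion dataset like any other sequence-classification model (for example with the `Trainer` API).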
Disclaimer
- This model is not production-ready.
- It is not optimized for real-world use.
- It should not be used for commercial applications, further fine-tuning, or deployment.
- It was built only as a learning exercise to explore Hugging Face and model training.
Purpose
- To help me (and maybe others) understand how Hugging Face works.
- To practice model distillation and fine-tuning techniques.
- To learn the workflow of pushing models to the Hugging Face Hub (sketched below).
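For reference, the Hub upload step looks roughly like this. It is a sketch, not the exact commands used for this repository; it assumes the `student` model from the earlier sketch, an illustrative tokenizer checkpoint, and that you are already authenticated (e.g. via `huggingface-cli login` or the `HF_TOKEN` environment variable).

```python
# Sketch: pushing a trained model and tokenizer to the Hugging Face Hub.
# Assumes `student` is the fine-tuned model from the earlier sketch and that
# you are logged in (e.g. `huggingface-cli login` or the HF_TOKEN variable).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed tokenizer

student.push_to_hub("Abdullah6395/Text_Emotion_Recognition")
tokenizer.push_to_hub("Abdullah6395/Text_Emotion_Recognition")
```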
Limitations
- Accuracy and reliability are not guaranteed.
- Not suitable for critical applications (mental health, customer service, etc.).
- The student has a limited number of layers and was trained on a small dataset.
Download model
Download the model files from the Files & versions tab of this repository.
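If you just want to try the checkpoint, something like the following should work. This assumes the uploaded repository contains compatible config, weight, and tokenizer files; the example sentence is arbitrary.

```python
# Sketch: quick test of the checkpoint via the text-classification pipeline.
# Assumes the Hub repo contains compatible config, weights, and tokenizer files.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="Abdullah6395/Text_Emotion_Recognition",
)
print(classifier("I finally got the training loop to work!"))
```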
Model tree for Abdullah6395/Text_Emotion_Recognition
Base model: ProsusAI/finbert