# Emotion Classifier (DeBERTa-v3-base)

F1: 0.8008092786408941
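
For reference, a score like this can be reproduced with `sklearn.metrics.f1_score` over a labelled validation split. The sketch below is a hedged example: the card does not say how the F1 was computed, so the dataloader format (batches with `input_ids`, `attention_mask`, `labels`) and the macro averaging mode are assumptions.

```python
# Sketch of an F1 evaluation loop; the batch format and the "macro"
# averaging mode are assumptions, not taken from this model card.
import torch
from sklearn.metrics import f1_score

def evaluate_f1(model, dataloader, device='cpu'):
    model.eval()
    preds, golds = [], []
    with torch.no_grad():
        for batch in dataloader:
            logits = model(batch['input_ids'].to(device),
                           batch['attention_mask'].to(device))
            preds.extend(logits.argmax(dim=-1).cpu().tolist())
            golds.extend(batch['labels'].tolist())
    return f1_score(golds, preds, average='macro')
```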

## Usage

```python
import torch
import torch.nn as nn
from transformers import DebertaV2Tokenizer, DebertaV2Model

class TransformerClassifier(nn.Module):
    def __init__(self, model_name, n_classes=5):
        super().__init__()
        # DeBERTa-v3-base encoder with a linear classification head
        self.transformer = DebertaV2Model.from_pretrained(model_name)
        self.classifier = nn.Linear(768, n_classes)

    def forward(self, input_ids, attention_mask):
        outputs = self.transformer(input_ids=input_ids, attention_mask=attention_mask)
        # Classify from the hidden state of the first ([CLS]) token
        return self.classifier(outputs.last_hidden_state[:, 0])

tokenizer = DebertaV2Tokenizer.from_pretrained('YOUR_USERNAME/emotion-classifier-deberta')
model = TransformerClassifier('YOUR_USERNAME/emotion-classifier-deberta')

# Load the fine-tuned weights from the checkpoint
checkpoint = torch.load('pytorch_model.bin', map_location='cpu')
model.load_state_dict(checkpoint['model_state_dict'])
model.eval()
```
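
Once the checkpoint is loaded, single-sentence inference looks roughly like the sketch below. The emotion label names are placeholders, since the five classes are not listed in this card; replace them with the labels used during training.

```python
# Minimal inference sketch. The label names are placeholders -- substitute
# the five emotion classes the model was actually trained on.
labels = ['class_0', 'class_1', 'class_2', 'class_3', 'class_4']

text = "I can't believe how wonderful today has been!"
inputs = tokenizer(text, return_tensors='pt', truncation=True, max_length=128)

with torch.no_grad():
    logits = model(inputs['input_ids'], inputs['attention_mask'])
    probs = torch.softmax(logits, dim=-1)

pred = probs.argmax(dim=-1).item()
print(f"Predicted emotion: {labels[pred]} (p={probs[0, pred]:.3f})")
```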

Anish Sharma (23f1001021) - IIT Madras 2025
