# DistilBERT Emotion Classifier
This is a fine-tuned DistilBERT model for emotion classification trained on the dair-ai/emotion dataset.
## Model Details
- Base model: distilbert-base-uncased
- Task: Text Classification (Emotion Detection)
- Languages: English
- Labels: sadness, joy, love, anger, fear, surprise
- Training epochs: 2
- Batch size: 64
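
The training script is not published with this card; the sketch below shows one way the model could be fine-tuned with the Hugging Face `Trainer`, using the epoch count and batch size listed above. The learning rate and other settings are assumptions, not necessarily the values actually used.

```python
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

# dair-ai/emotion has 6 labels: sadness, joy, love, anger, fear, surprise
dataset = load_dataset("dair-ai/emotion")

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

tokenized = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=6
)

# Epochs and batch size match the card; the learning rate is an assumed default
args = TrainingArguments(
    output_dir="distilbert-emotion",
    num_train_epochs=2,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    learning_rate=2e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
```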
## Evaluation Metrics on Test Data
| Metric | Value |
|---|---|
| Accuracy | 0.919 |
| F1 Score | 0.918 |
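
A rough sketch of how these scores could be reproduced on the dair-ai/emotion test split is shown below; the F1 averaging mode (weighted) is an assumption, as the card does not state it.

```python
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from sklearn.metrics import accuracy_score, f1_score

model_name = "tsid7710/distillbert-emotion-model"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

test = load_dataset("dair-ai/emotion", split="test")

preds = []
with torch.inference_mode():
    for i in range(0, len(test), 64):
        batch = test[i : i + 64]
        inputs = tokenizer(batch["text"], padding=True, truncation=True,
                           return_tensors="pt")
        logits = model(**inputs).logits
        preds.extend(logits.argmax(dim=-1).tolist())

print("Accuracy:", accuracy_score(test["label"], preds))
# Averaging mode is assumed; the card does not specify macro vs. weighted F1
print("F1:", f1_score(test["label"], preds, average="weighted"))
```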
## Usage
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_name = "tsid7710/distillbert-emotion-model"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

text = "I am feeling great today!"
inputs = tokenizer(text, return_tensors="pt")

# Run inference without tracking gradients
with torch.inference_mode():
    logits = model(**inputs).logits

# Predicted class index, following the label order listed under Model Details
pred = torch.argmax(logits, dim=-1).item()
print(pred)
```
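
For quick experiments, the model can also be used through the `pipeline` API, which handles tokenization and label mapping internally:

```python
from transformers import pipeline

# Returns the predicted label and its score for each input text
classifier = pipeline("text-classification", model="tsid7710/distillbert-emotion-model")
print(classifier("I am feeling great today!"))
```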