πŸ›« Big Bird Flight

Big Bird Flight is a fine-tuned version of Google’s BigBird model, optimised for long-text sentiment analysis. It was trained on 2,598 flight review texts, each annotated with a 10-point ordinal sentiment rating ranging from 1 (extremely negative) to 10 (extremely positive).

Big Bird Flight captures nuanced emotional gradients in text, offering richer sentiment analysis than conventional binary classification (e.g., positive vs. negative). This makes it particularly useful for applications requiring fine-grained sentiment understanding from lengthy or detailed customer feedback.

  • Use case: text classification
  • Sentiment class: 1 (extremely negative) to 10 (extremely positive)

πŸ“˜ Model details

  • Base model: google/bigbird-roberta-base
  • Architecture: BigBirdForSequenceClassification
  • Hidden size: 768
  • Layers: 12 transformer blocks
  • Attention type: block-sparse
  • Max sequence length: 4096 tokens
  • Number of classes: 10 (ratings from 1 = extremely negative to 10 = extremely positive)
  • Parameters: ~128M (float32 weights)
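A minimal inference sketch is shown below. It is illustrative rather than official usage: it assumes `transformers` and `torch` are installed and that the checkpoint can be downloaded from the Hub; the helper names (`index_to_rating`, `predict_rating`) are our own. Note that class indices are 0-based while the ratings are 1-based.

```python
# Illustrative inference sketch for pvaluedotone/bigbird-flight.
# Assumes: transformers + torch installed, network access to the Hub.

MODEL_ID = "pvaluedotone/bigbird-flight"

def index_to_rating(class_index: int) -> int:
    """Map a 0-based class index to the 1-10 sentiment rating."""
    return class_index + 1

def predict_rating(text: str) -> int:
    """Tokenize a review, run the classifier, and return a 1-10 rating."""
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    # Truncate to 128 tokens, matching the preprocessing described below.
    inputs = tokenizer(text, truncation=True, max_length=128, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return index_to_rating(int(logits.argmax(dim=-1)))
```

Calling `predict_rating("The crew was friendly and the flight was smooth.")` would return an integer rating between 1 and 10.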

🧠 Training Summary

  • Dataset: 2,598 airline passenger reviews.
  • Labels: ordinal scale from 1 (extremely negative) to 10 (extremely positive).
  • Loss function: cross-entropy (classification setup).
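To make the training objective concrete, here is a self-contained (pure-Python, not the actual training code) computation of cross-entropy for one example with 10 classes. It also shows the chance-level baseline: uniform logits over 10 classes give a loss of -log(1/10) ≈ 2.3026, which puts the reported validation loss of 1.7985 in context.

```python
import math

def cross_entropy(logits, true_class):
    """Cross-entropy loss for one example: -log softmax(logits)[true_class].
    true_class is a 0-based index (rating 1 -> index 0)."""
    m = max(logits)  # subtract the max for numerical stability
    log_sum = m + math.log(sum(math.exp(x - m) for x in logits))
    return log_sum - logits[true_class]

# Uniform logits over 10 classes: loss = -log(1/10) ≈ 2.3026 (chance level).
chance_loss = cross_entropy([0.0] * 10, 4)

# A confident, correct prediction drives the loss toward 0.
confident_loss = cross_entropy([10.0] + [0.0] * 9, 0)
```

One limitation worth noting: plain cross-entropy treats the 10 ratings as unordered categories, so predicting 2 when the truth is 1 is penalized exactly as much as predicting 10. Ordinal-aware losses avoid this.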

πŸ›  Tokenizer

  • Based on a SentencePiece Unigram model.
  • Uses Metaspace pre-tokenization for subword splitting.
  • Inputs are truncated to 128 tokens during preprocessing, well below the architecture's 4,096-token maximum.

πŸ“Œ Use cases

  • Analyse detailed customer reviews from air travel.
  • Replace coarse binary sentiment models with ordinal sentiment scales.
  • Experiment with ordinal regression techniques in NLP.
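As one example of the ordinal-regression experiments mentioned above (not something the released model does itself), the 10-way softmax output can be decoded into a probability-weighted expected rating instead of an argmax. This yields a continuous score on the 1-10 scale that respects the ordering of the classes; the function names here are our own.

```python
import math

def softmax(logits):
    """Convert raw logits to a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def expected_rating(logits):
    """Probability-weighted mean rating on the 1-10 scale,
    an ordinal-aware alternative to plain argmax decoding."""
    probs = softmax(logits)
    return sum(p * (i + 1) for i, p in enumerate(probs))
```

With uniform logits the expected rating is the scale midpoint, 5.5; as probability mass concentrates on one class, the score converges to that class's rating.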

πŸ“š Citation

If you use this model in your research or applications, please cite it as follows:

Mat Roni, S. (2025). Big Bird Flight for ordinal sentiment analysis. Hugging Face. https://huggingface.co/pvaluedotone/bigbird-flight

πŸ“Š Validation metrics

The validation metrics reflect the inherent difficulty of the fine-grained 10-point scale: accuracy of 0.2665 is well above the 0.10 chance level for a 10-class problem, but absolute scores remain modest.

  • loss: 1.7985
  • f1_macro: 0.2275
  • f1_micro: 0.2665
  • f1_weighted: 0.2347
  • precision_macro: 0.2676
  • precision_micro: 0.2665
  • precision_weighted: 0.2777
  • recall_macro: 0.2595
  • recall_micro: 0.2665
  • recall_weighted: 0.2665
  • accuracy: 0.2665