ModernBERT-FakeNewsClassifier

Model Description

ModernBERT-FakeNewsClassifier is a fine-tuned version of ModernBERT, optimized for the binary classification task of detecting fake news. The model takes a news article's title, body text, subject, and publication date as input and classifies the article as real (1) or fake (0). It was fine-tuned on a dataset of over 30,000 labeled examples and achieves high accuracy and robust performance.
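
Below is a minimal inference sketch using the standard transformers sequence-classification API. The exact input formatting used during training is defined in the accompanying notebook; the example article and the way the fields are combined here are illustrative assumptions.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_id = "dakshrathi/ModernBERT-base-FakeNewsClassifier"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Hypothetical article; the fields are concatenated into one input string.
article = (
    "Title: Senate passes new infrastructure bill\n"
    "Subject: Politics\n"
    "Date: 2024-01-15\n"
    "Text: Lawmakers voted on Thursday to approve the measure..."
)

inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=8192)
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
print("real" if pred == 1 else "fake")  # label 1 = real, label 0 = fake
```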

Key Features:

  • Base Model: ModernBERT, designed for long-context processing (up to 8,192 tokens).
  • Task: Binary classification for fake news detection.
  • Architecture Highlights:
    • Rotary Positional Embeddings (RoPE) for long-context support.
    • Local-global alternating attention for memory efficiency.
    • Flash Attention for optimized inference speed.

Dataset

The dataset used for fine-tuning comprises over 30,000 examples, with the following features:

  • Title: The headline of the news article.
  • Text: The main body of the article.
  • Subject: The category or topic of the article (e.g., Politics, Health).
  • Date: The publication date of the article.
  • Label: Binary labels indicating whether the article is fake (0) or real (1).
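
One plausible way to combine these fields into a single model input is sketched below; the column names and the exact preprocessing actually used are defined in code.ipynb, so treat this layout as an assumption.

```python
import pandas as pd

def build_input(row: pd.Series) -> str:
    """Concatenate the article fields into one text sequence (illustrative format)."""
    return (
        f"Title: {row['title']}\n"
        f"Subject: {row['subject']}\n"
        f"Date: {row['date']}\n"
        f"Text: {row['text']}"
    )

df = pd.read_csv("fake_news.csv")  # hypothetical file name with the columns listed above
df["input_text"] = df.apply(build_input, axis=1)
```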

Notebook: Training and Fine-Tuning

The repository includes code.ipynb, which provides:

  • Step-by-step preprocessing of the dataset.
  • Fine-tuning of the ModernBERT model for binary classification.
  • Evaluation of the model using accuracy, F1-score, and AUC-ROC.

You can open and run the notebook directly to replicate or customize the training process; a sketch of the metric computation follows below.
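
A minimal sketch of how such metrics can be computed for a Hugging Face Trainer; the function below is illustrative and not copied from the notebook.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def compute_metrics(eval_pred):
    """Compute accuracy, F1, and AUC-ROC from Trainer predictions."""
    logits, labels = eval_pred
    probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)  # softmax
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds),
        "auc_roc": roc_auc_score(labels, probs[:, 1]),
    }
```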

Citation

If you use this model in your research or applications, please cite:

@misc{ModernBERT-FakeNewsClassifier,
  author = {Daksh Rathi},
  title = {ModernBERT-FakeNewsClassifier: A Transformer-Based Model for Fake News Detection},
  year = {2024},
  url = {https://huggingface.co/dakshrathi/ModernBERT-base-FakeNewsClassifier},
}