# BERT Fine-tuned Model for AI Content Detection

This repository contains a BERT model fine-tuned to detect AI-generated content.
## Model Overview
- Base Model: BERT-base-uncased (110M parameters)
- Task: Binary classification (AI-generated vs Human-written text)
- Max Sequence Length: 256 tokens (see the tokenization sketch after this list)
- Fine-tuned on: COLING 2025 MGT Dataset
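
As a minimal sketch of how the 256-token limit applies, the checkpoint's tokenizer can be asked to truncate long inputs; the snippet below only illustrates tokenization and does not run the classifier.

```python
from transformers import AutoTokenizer

# Tokenizer bundled with the fine-tuned checkpoint
tokenizer = AutoTokenizer.from_pretrained("SaherMuhamed/bert-ai-detector-coling-finetuned")

# Long inputs are truncated to the model's 256-token maximum sequence length
encoded = tokenizer("Your text to classify here " * 200, truncation=True, max_length=256)
print(len(encoded["input_ids"]))  # at most 256
```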
## HuggingFace Hub Usage

You can use our model directly from the Hugging Face Hub:
```python
from transformers import pipeline

# Load the model
classifier = pipeline("text-classification", model="SaherMuhamed/bert-ai-detector-coling-finetuned")

# Example text
text = "Your text to classify here"

# Get prediction
result = classifier(text)

# Print result
print(f"Label: {result[0]['label']}")
print(f"Confidence: {result[0]['score']:.2%}")
```
Example output:
```
Label: AI_GENERATED
Confidence: 92.45%
```
## Model Performance

The model applies a confidence threshold of 0.6 for more reliable predictions and offers the following features (a minimal inference sketch follows this list):
- Handles texts of any length (automatically truncates to 256 tokens)
- Returns probability scores for both classes
- GPU-compatible with fallback to CPU
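
The pipeline call above hides these details; the sketch below shows one way they could fit together when loading the Hub checkpoint directly. The `classify` helper, the `UNCERTAIN` fallback label, and the exact way the 0.6 threshold is applied are illustrative assumptions, not the model card's prescribed API.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "SaherMuhamed/bert-ai-detector-coling-finetuned"
THRESHOLD = 0.6  # confidence threshold mentioned above

# Use a GPU when available, otherwise fall back to CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID).to(device)
model.eval()

def classify(text: str) -> dict:
    # Truncate long inputs to the 256-token maximum sequence length
    inputs = tokenizer(text, truncation=True, max_length=256, return_tensors="pt").to(device)
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = torch.softmax(logits, dim=-1)[0]
    # Probability for each class, using the label names stored in the checkpoint config
    scores = {model.config.id2label[i]: p.item() for i, p in enumerate(probs)}
    top_label, top_score = max(scores.items(), key=lambda kv: kv[1])
    # Only commit to a label when the top probability clears the threshold (illustrative policy)
    label = top_label if top_score >= THRESHOLD else "UNCERTAIN"
    return {"label": label, "scores": scores}

print(classify("Your text to classify here"))
```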
## Dependencies
- tensorflow
- transformers
- numpy
- torch