# 🤖 Intent Detection using Fine-Tuned BERT
This project uses a fine-tuned BERT model (bert-base-uncased) for intent classification. The model is an encoder-only transformer that detects user intents in text inputs (e.g., chatbot queries) and classifies them into predefined categories such as banking, travel, and finance.
The model is trained on the CLINC150 (clinc_oos) dataset and evaluated using accuracy as the primary metric.
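The exact training configuration is not documented here, so the following is only a minimal sketch of how such a fine-tuning run could be reproduced with the Hugging Face transformers, datasets, and evaluate libraries. The dataset configuration ("plus"), sequence length, and hyperparameters are illustrative assumptions, not the values used for this checkpoint.

```python
# Minimal fine-tuning sketch (assumed setup, not the exact recipe used for this model).
import numpy as np
import evaluate
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TrainingArguments,
    Trainer,
)

# CLINC150 as listed in this card; "plus" is an assumed config that includes OOS examples.
dataset = load_dataset("clinc_oos", "plus")
num_labels = dataset["train"].features["intent"].num_classes  # 151 labels (150 intents + out-of-scope)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    # max_length=64 is an illustrative choice for short chatbot-style utterances.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64)

encoded = dataset.map(tokenize, batched=True).rename_column("intent", "labels")

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=num_labels
)

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=preds, references=labels)

args = TrainingArguments(
    output_dir="bert-intention-classifier",
    num_train_epochs=3,                 # illustrative hyperparameters
    per_device_train_batch_size=32,
    learning_rate=2e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    compute_metrics=compute_metrics,
)

trainer.train()
print(trainer.evaluate())  # reports accuracy on the validation split
```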
## 📊 Dataset: CLINC150
The project uses the CLINC150 dataset, a benchmark dataset for intent classification in task-oriented dialogue systems.
### 🧾 Dataset Overview
- Total intents: 150 unique user intents
- Domains: 10 real-world domains (e.g., banking, travel, weather, small talk)
- Examples: ~22,500 utterances
- Language: English
- Out-of-scope (OOS): Includes OOS examples to test robustness
### 📦 Source
- Official repo: clinc/oos-eval
- Hugging Face: clinc_oos
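The dataset can be pulled directly from the Hub. The snippet below is a small sketch assuming the "plus" configuration, which adds the out-of-scope label to the 150 in-scope intents.

```python
# Sketch of loading CLINC150 from the Hugging Face Hub ("plus" config is an assumption).
from datasets import load_dataset

dataset = load_dataset("clinc_oos", "plus")
print(dataset)                                    # train / validation / test splits

intents = dataset["train"].features["intent"]
print(intents.num_classes)                        # 151 labels (150 intents + "oos")

example = dataset["train"][0]
print(example["text"], "->", intents.int2str(example["intent"]))
```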
## 📝 Example
Request: "I want to book a flight"
Response: "book_flight"
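A minimal inference sketch is shown below. It assumes the fine-tuned checkpoint is published as SaherMuhamed/bert-intention-classifier (as listed in the model tree) and that its config maps label ids to intent names such as book_flight; otherwise the ids would need to be mapped to CLINC150 intents manually.

```python
# Inference sketch using the text-classification pipeline (assumed checkpoint name).
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="SaherMuhamed/bert-intention-classifier",
)

print(classifier("I want to book a flight"))
# expected shape of output: [{'label': 'book_flight', 'score': ...}]
```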
## Model tree
- Fine-tuned checkpoint: SaherMuhamed/bert-intention-classifier
- Base model: google-bert/bert-base-uncased