KMUTT-CPE35-thai-mt5base-summarizer V2
This repository contains a fine-tuned version of google/mt5-base for the task of Thai text summarization. The model was trained on 50,000 samples from the ThaiSum dataset and is part of a senior project in the Computer Engineering Department at King Mongkut’s University of Technology Thonburi (KMUTT).
Model Description
- Base model: google/mt5-base
- Task: Text Summarization (Thai)
- Fine-tuning dataset: ThaiSum (50k samples)
- Quantization: 4-bit
- Max sequence length: 1024 tokens
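The configuration above can be approximated with the loading sketch below. It assumes the Hugging Face repository ID EXt1/KMUTT_CPE35_thai-mt5base_summarizer-V2 and 4-bit loading through bitsandbytes; the exact quantization settings used for this project are not documented here, so treat the values as illustrative.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, BitsAndBytesConfig

MODEL_ID = "EXt1/KMUTT_CPE35_thai-mt5base_summarizer-V2"

# Illustrative 4-bit quantization config (requires bitsandbytes and a CUDA GPU);
# the project's actual settings may differ.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",
)
```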
Evaluation
The model was evaluated with the ROUGE metric, a standard measure of summary quality. Results on the test set are as follows:
- ROUGE-1: 0.5088
- ROUGE-2: 0.2834
- ROUGE-L: 0.5073
- ROUGE-Lsum: 0.5046
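Scores of this kind can be computed with the Hugging Face evaluate library, as in the sketch below. The inputs are placeholders, and ROUGE values for Thai depend on how the text is tokenized; the tokenization setup behind the numbers above is not documented here.

```python
import evaluate  # also requires the rouge_score package

# Placeholder examples; replace with the model's generated summaries
# and the corresponding ThaiSum reference summaries.
predictions = ["สรุปที่โมเดลสร้างขึ้น"]
references = ["สรุปอ้างอิงจากชุดข้อมูล"]

rouge = evaluate.load("rouge")
scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # keys: 'rouge1', 'rouge2', 'rougeL', 'rougeLsum'
```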
Uses
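A minimal inference sketch is shown below. It assumes the repository ID EXt1/KMUTT_CPE35_thai-mt5base_summarizer-V2; the generation settings (beam search, summary length) are illustrative defaults rather than the project's tuned values, and the input text should follow the same preprocessing used during fine-tuning.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_ID = "EXt1/KMUTT_CPE35_thai-mt5base_summarizer-V2"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

# Thai article text to summarize (placeholder).
text = "ใส่ข้อความภาษาไทยที่ต้องการสรุปที่นี่"

# Inputs are truncated to the model's 1024-token limit.
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)

summary_ids = model.generate(
    **inputs,
    max_new_tokens=128,   # illustrative summary length
    num_beams=4,          # illustrative beam search setting
    no_repeat_ngram_size=3,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```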