 
# Indonesian T5 Language Models
Indonesian T5 models pre-trained with nanoT5 and fine-tuned on IndoNLG tasks. GitHub: https://github.com/LazarusNLP/IndoT5/
- `LazarusNLP/IndoNanoT5-base` - Baseline T5 model trained on `uonlp/CulturaX` for 65k steps. Achieves an evaluation loss of 2.082, i.e. a perplexity of exp(2.082) ≈ 8.02.
- `LazarusNLP/IndoNanoT5-base-IndoSum` - `LazarusNLP/IndoNanoT5-base` fine-tuned on IndoSum. State-of-the-art model on IndoSum; R1: 75.29, R2: 71.23, RL: 73.30.
- `LazarusNLP/IndoNanoT5-base-Liputan6-Canonical` - `LazarusNLP/IndoNanoT5-base` fine-tuned on Liputan6 Canonical. Competitive with IndoBART and mT5 Small. Canonical: R1: 39.76, R2: 22.29, RL: 33.46; Extreme: R1: 33.26, R2: 14.17, RL: 26.21.
- `LazarusNLP/IndoNanoT5-base-TyDiQA` - `LazarusNLP/IndoNanoT5-base` fine-tuned on TyDiQA. Outperforms IndoBART; EM: 58.94, F1: 72.19.
- `LazarusNLP/IndoNanoT5-base-XPersona` - `LazarusNLP/IndoNanoT5-base` fine-tuned on XPersona `id`. State-of-the-art model on XPersona `id`; BLEU: 4.0669, SacreBLEU: 4.0669.
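
For quick experimentation, the fine-tuned checkpoints can be loaded like any other T5 model through 🤗 Transformers. Below is a minimal inference sketch using the IndoSum summarization checkpoint; the sample article and the generation settings (beam search, 128 new tokens) are illustrative assumptions, not the configuration behind the reported scores.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the IndoSum summarization checkpoint from the Hugging Face Hub.
model_name = "LazarusNLP/IndoNanoT5-base-IndoSum"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Illustrative Indonesian input; any news article is handled the same way.
article = (
    "Pemerintah meresmikan jalan tol baru yang menghubungkan "
    "Jakarta dan Bandung pada hari Senin."
)
inputs = tokenizer(article, return_tensors="pt", truncation=True)

# Beam-search settings here are assumptions, not the evaluation setup.
summary_ids = model.generate(**inputs, max_new_tokens=128, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

The other checkpoints follow the same seq2seq interface; only the input formatting differs by task (e.g. question answering for TyDiQA, dialogue for XPersona).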