
MahaBERT-SQuAD

MahaBERT-SQuAD is a MahaBERT model fine-tuned on L3Cube-MahaSQuAD, a translated Marathi question-answering dataset.
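A minimal usage sketch with the Hugging Face transformers question-answering pipeline, using the model ID from this card; the Marathi context/question pair below is an illustrative example, not drawn from MahaSQuAD.

from transformers import pipeline

# Load the fine-tuned model via the question-answering pipeline
qa = pipeline(
    "question-answering",
    model="l3cube-pune/marathi-question-answering-squad-bert",
)

# Illustrative Marathi example:
# context: "Pune is a major city in Maharashtra."
# question: "In which state is Pune?"
context = "पुणे हे महाराष्ट्रातील एक प्रमुख शहर आहे."
question = "पुणे कोणत्या राज्यात आहे?"

result = qa(question=question, context=context)
print(result["answer"], result["score"])

The pipeline returns the answer span extracted from the context along with a confidence score.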

More details on the dataset, models, and baseline results can be found in our paper.

Citing:

@article{ghatage2024mahasquad,
  title={MahaSQuAD: Bridging Linguistic Divides in Marathi Question-Answering},
  author={Ghatage, Ruturaj and Kulkarni, Aditya and Patil, Rajlaxmi and Endait, Sharvi and Joshi, Raviraj},
  journal={arXiv preprint arXiv:2404.13364},
  year={2024}
}
@article{endait2025indicsquad,
  title={IndicSQuAD: A Comprehensive Multilingual Question Answering Dataset for Indic Languages},
  author={Endait, Sharvi and Ghatage, Ruturaj and Kulkarni, Aditya and Patil, Rajlaxmi and Joshi, Raviraj},
  journal={arXiv preprint arXiv:2505.03688},
  year={2025}
}

Other IndicSQuAD models:

This project is part of the L3Cube-IndicNLP project.
