
## Description

The best-performing "mBERT-qa-en, skd" model from the paper [Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation](https://arxiv.org/abs/2309.17134).

See the official GitHub repository for the code implementing the methods in the paper.

More info coming soon!
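
The card does not declare a pipeline type, but given the paper's focus on extractive cross-lingual question answering, a minimal usage sketch might look like the following. It assumes the checkpoint is compatible with the standard `transformers` question-answering pipeline; the question and context are illustrative.

```python
from transformers import pipeline

# Load the checkpoint as an extractive QA pipeline
# (assumption: the model head matches the standard QA pipeline).
qa = pipeline(
    "question-answering",
    model="ccasimiro/mbert-qa-en-skd-self-distill",
)

# Illustrative example: the pipeline extracts the answer span from the context.
result = qa(
    question="Where is the Eiffel Tower located?",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
)
print(result)  # e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': 'Paris, France'}
```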

## How to Cite

To cite our work, use the following BibTeX:

```bibtex
@misc{carrino2023promoting,
      title={Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation},
      author={Casimiro Pio Carrino and Carlos Escolano and José A. R. Fonollosa},
      year={2023},
      eprint={2309.17134},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```