mBERT-hi-be-MLM-SQuAD-TyDi-MLQA Model Card

Use a pipeline as a high-level helper

from transformers import pipeline

pipe = pipeline("question-answering", model="hapandya/mBERT-hi-be-MLM-SQuAD-TyDi-MLQA")
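A minimal sketch of querying the pipeline; the question and context strings below are illustrative placeholders, not examples from the model's training data.

# Illustrative usage: question and context are placeholder strings
result = pipe(
    question="What is the capital of India?",
    context="New Delhi is the capital of India.",
)
print(result["answer"], result["score"])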

Load model directly

from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("hapandya/mBERT-hi-be-MLM-SQuAD-TyDi-MLQA")
model = AutoModelForQuestionAnswering.from_pretrained("hapandya/mBERT-hi-be-MLM-SQuAD-TyDi-MLQA")
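When loading the model directly, the answer span can be recovered from the start/end logits. A minimal sketch, again with placeholder question and context strings:

import torch

# Illustrative usage: question and context are placeholder strings
question = "What is the capital of India?"
context = "New Delhi is the capital of India."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Take the most likely start/end token positions and decode the answer span
start = outputs.start_logits.argmax(dim=-1).item()
end = outputs.end_logits.argmax(dim=-1).item()
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)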

Model size: 177M parameters (Safetensors format; tensor types: I64, F32)
