How to use Chetna19/albert-base-v2_qa_model with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="Chetna19/albert-base-v2_qa_model")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("Chetna19/albert-base-v2_qa_model")
model = AutoModelForQuestionAnswering.from_pretrained("Chetna19/albert-base-v2_qa_model")
```

This model is a fine-tuned version of albert-base-v2 on the subjqa dataset. It achieves the following results on the evaluation set:
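Once loaded, the pipeline accepts a question and a context passage and returns the extracted answer span. A minimal sketch of an inference call follows; the question and context strings are illustrative placeholders, not taken from the SubjQA dataset:

```python
from transformers import pipeline

# Load the fine-tuned QA model (downloads weights from the Hugging Face Hub).
pipe = pipeline("question-answering", model="Chetna19/albert-base-v2_qa_model")

# Illustrative inputs; any question/context pair works.
question = "What does the reviewer think of the battery life?"
context = (
    "I have used this laptop for a month. The battery life is excellent, "
    "easily lasting a full workday."
)

result = pipe(question=question, context=context)
# result is a dict with keys "score", "start", "end", and "answer".
print(result["answer"], result["score"])
```

The `"start"` and `"end"` values are character offsets into `context`, so `context[result["start"]:result["end"]]` recovers the raw answer span.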
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training results

The following results were recorded during training:

| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 5.9633 | 1.0 | 32 | 5.9115 |
| 5.8978 | 2.0 | 64 | 5.8574 |
| 5.8558 | 3.0 | 96 | 5.8260 |
| 5.8354 | 4.0 | 128 | 5.8155 |