# T5-Small Shakespeare Q&A Model
A fine-tuned T5-small model specialized for answering questions about William Shakespeare's plays and works. This model has been trained on a comprehensive dataset of Shakespeare-related questions and answers.
## Model Description
- Base Model: google/t5-small (60M parameters)
- Task: Question Answering about Shakespeare's literary works
- Training: Fine-tuned on custom Shakespeare Q&A dataset
- Language: English
- License: Apache 2.0
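For a quick smoke test without writing generation code by hand, the checkpoint should also load through the high-level `text2text-generation` pipeline; the question and generation settings below are only illustrative:

```python
from transformers import pipeline

# High-level pipeline wrapper around the same seq2seq checkpoint
qa = pipeline("text2text-generation", model="Hananguyen12/T5-Small-QA-Shakespeare")

result = qa("factual: Who is the protagonist in Hamlet?", max_length=256)
print(result[0]["generated_text"])
```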
## Quick Start
```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Load model and tokenizer
model_name = "Hananguyen12/T5-Small-QA-Shakespeare"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Ask a question (use the "factual: " prefix)
question = "factual: Who is the protagonist in Hamlet?"
inputs = tokenizer(question, return_tensors="pt", max_length=512, truncation=True)

# Generate an answer with beam search
# (temperature only applies when sampling, so it is omitted here)
outputs = model.generate(
    **inputs,
    max_length=256,
    num_beams=4,
    early_stopping=True,
)

answer = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(f"Q: {question.replace('factual: ', '')}")
print(f"A: {answer}")
```
## Supported Topics
This model can answer questions about:
- Characters: Main and supporting characters across all plays
- Plots: Story summaries, key events, and plot points
- Themes: Major themes and literary analysis
- Settings: Time periods and locations of plays
- Quotes: Famous lines and speeches (context)
- Historical Context: Shakespeare's life and times
## Usage Tips
- Prefix Format: Always use `"factual: "` before your question for best results (see the helper sketch after this list)
- Question Types: Works best with factual, direct questions
- Specificity: More specific questions yield better answers
- Play Names: You can ask about any of Shakespeare's major works
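A small wrapper can apply the prefix automatically. This is an illustrative helper (the function name is made up here), reusing the `model` and `tokenizer` loaded in the Quick Start:

```python
def ask_shakespeare(question: str) -> str:
    """Add the required 'factual: ' prefix if it is missing, then generate an answer."""
    if not question.startswith("factual: "):
        question = "factual: " + question
    inputs = tokenizer(question, return_tensors="pt", max_length=512, truncation=True)
    outputs = model.generate(**inputs, max_length=256, num_beams=4, early_stopping=True)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

print(ask_shakespeare("Who betrays Julius Caesar?"))  # the prefix is added for you
```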
## Example Interactions
Input: "factual: What happens to Romeo and Juliet at the end?"
Output: "Romeo and Juliet both die by suicide - Romeo drinks poison and Juliet stabs herself with Romeo's dagger."
Input: "factual: Who are the three witches in Macbeth?"
Output: "The three witches are supernatural beings who prophesy Macbeth's rise to power and influence the tragic events of the play."
Input: "factual: What is the setting of A Midsummer Night's Dream?"
Output: "A Midsummer Night's Dream is set in Athens and the nearby enchanted forest, during ancient Greek times."
## Technical Details
- Architecture: T5 (Text-to-Text Transfer Transformer)
- Model Size: ~60M parameters
- Max Input Length: 512 tokens
- Max Output Length: 256 tokens
- Training Epochs: 20
- Learning Rate: 3e-4
- Batch Size: 8
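The actual training script is not part of this card. As a rough illustration only, the hyperparameters above would map onto a standard Hugging Face `Seq2SeqTrainer` setup roughly as follows; the toy dataset and output directory are placeholders, not the real training data:

```python
from datasets import Dataset
from transformers import (
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
    T5ForConditionalGeneration,
    T5Tokenizer,
)

tokenizer = T5Tokenizer.from_pretrained("google/t5-small")
model = T5ForConditionalGeneration.from_pretrained("google/t5-small")

# Tiny toy dataset standing in for the (unpublished) Shakespeare Q&A data
raw = Dataset.from_dict({
    "question": ["factual: Who is the protagonist in Hamlet?"],
    "answer": ["Hamlet, Prince of Denmark."],
})

def preprocess(batch):
    # Tokenize questions as inputs and answers as labels
    model_inputs = tokenizer(batch["question"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["answer"], max_length=256, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_ds = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

# Hyperparameters taken from the list above; output_dir is a placeholder
args = Seq2SeqTrainingArguments(
    output_dir="t5-small-shakespeare-qa",
    num_train_epochs=20,
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```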
## Educational Applications
Perfect for:
- Literature education chatbots
- Shakespeare study assistants
- Interactive learning platforms
- Educational Q&A systems
- Student research tools
## Limitations
- Specialized only for Shakespeare's works
- Best with factual questions (not creative interpretation)
- Requires "factual: " prefix for optimal performance
- May not handle very complex literary analysis questions
## Citation
```bibtex
@misc{shakespeare-qa-t5,
  title={T5-Small Shakespeare Q&A Model},
  author={Hananguyen12},
  year={2025},
  url={https://huggingface.co/Hananguyen12/T5-Small-QA-Shakespeare}
}
```
## Links
- Model Repository: https://huggingface.co/Hananguyen12/T5-Small-QA-Shakespeare
- Base Model: https://huggingface.co/google/t5-small
- Framework: Hugging Face Transformers
Fine-tuned with ❤️ for Shakespeare education and literature learning