# emailgen-pythia-410m-deduped
This model is a fine-tuned version of EleutherAI/pythia-410m-deduped on email data. It achieves the following results on the evaluation set:

- Loss: 2.1018
- Accuracy: 0.6157
- Perplexity: 8.181
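The reported perplexity follows directly from the evaluation loss: for a causal language model, perplexity is the exponential of the mean cross-entropy loss. A quick sanity check using only the figures above:

```python
import math

eval_loss = 2.1018  # mean cross-entropy loss on the evaluation set (from above)

# Perplexity of a causal LM = exp(mean cross-entropy loss)
perplexity = math.exp(eval_loss)
print(round(perplexity, 3))  # prints 8.181, matching the reported value
```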
## Model description

- Fine-tuned on a dataset of emails for 4 epochs
- Intended use: text completion of partially written emails
## Usage example

```python
from transformers import pipeline

model_tag = "postbot/emailgen-pythia-410m-deduped"
generator = pipeline(
    "text-generation",
    model=model_tag,
)

prompt = """
Hello,
Following up on the bubblegum shipment."""

# generate a completion for the partially written email
result = generator(
    prompt,
)
print(result[0]["generated_text"])
```
## Open LLM Leaderboard Evaluation Results

Detailed results can be found here.

| Metric | Value |
|---|---|
| Avg. | 26.65 |
| ARC (25-shot) | 27.9 |
| HellaSwag (10-shot) | 40.04 |
| MMLU (5-shot) | 27.35 |
| TruthfulQA (0-shot) | 38.2 |
| Winogrande (5-shot) | 52.09 |
| GSM8K (5-shot) | 0.0 |
| DROP (3-shot) | 0.99 |
## Model tree for postbot/emailgen-pythia-410m-deduped

Base model: EleutherAI/pythia-410m-deduped