DPO Hybrid Models
A collection by pladee42 (updated 16 days ago)
Models fine-tuned with DPO on a mix of human-preferred and AI-generated emails.
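The models in this collection are trained with Direct Preference Optimization (DPO), which fits a policy directly to preference pairs (a chosen and a rejected email for the same prompt) without a separate reward model. As a minimal sketch of the DPO objective on a single pair, assuming per-sequence log-probabilities are already computed; the `beta` value of 0.1 is a common default, not a hyperparameter stated by this collection:

```python
import math

def dpo_loss(policy_chosen_logp: float, policy_rejected_logp: float,
             ref_chosen_logp: float, ref_rejected_logp: float,
             beta: float = 0.1) -> float:
    """DPO loss for one preference pair: -log sigmoid(beta * margin)."""
    # Implicit rewards: log-prob ratios of the policy vs. the frozen reference
    chosen_reward = beta * (policy_chosen_logp - ref_chosen_logp)
    rejected_reward = beta * (policy_rejected_logp - ref_rejected_logp)
    margin = chosen_reward - rejected_reward
    # Negative log-sigmoid of the margin; minimized when the policy
    # assigns relatively more probability to the chosen email
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

When the policy matches the reference exactly, the margin is zero and the loss is log 2; it shrinks as the policy favors the chosen email over the rejected one.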
- pladee42/TinyLlama-1.1B-Email-DPO-Hybrid • Text Generation • 1B • Updated 15 days ago
- pladee42/StableLM2-1.6B-Email-DPO-Hybrid • Text Generation • 2B • Updated 15 days ago
- pladee42/Phi3-Mini-Email-DPO-Hybrid • Text Generation • 4B • Updated 15 days ago
- pladee42/Vicuna-7B-Email-DPO-Hybrid • Text Generation • 7B • Updated 15 days ago
- pladee42/Llama3-8B-Email-DPO-Hybrid • Text Generation • 8B • Updated 15 days ago