joshhu1123/DPO-mistral-no1
Tags: PEFT · Safetensors · arxiv:1910.09700
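The page does not include a usage snippet, so a minimal sketch of loading the adapter with PEFT is given below. It assumes this repository holds a LoRA-style adapter for a Mistral-7B base checkpoint; the base model name used here is an assumption, not stated on this page.

```python
# Minimal sketch, assuming this repo is a PEFT adapter for a Mistral-7B base
# model; the base checkpoint name below is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_name = "mistralai/Mistral-7B-v0.1"   # assumed base checkpoint
adapter_id = "joshhu1123/DPO-mistral-no1"       # this repository

tokenizer = AutoTokenizer.from_pretrained(base_model_name)
base_model = AutoModelForCausalLM.from_pretrained(base_model_name)

# Attach the DPO-tuned adapter weights (stored as safetensors) on top of the base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
```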
Commit history (branch: main)
Upload model (b58fa49), committed by joshhu1123 on Nov 22, 2023
initial commit (e8784da), committed by Josh Woo on Nov 22, 2023