mcding-org/CorrectDPO-Model-DDP_Pm3B_U0_beta0.10r0.30rho0.20

  • 1 contributor
History: 1 commit
mcding: initial commit (a2a19fc, verified, about 1 year ago)
  • .gitattributes (1.52 kB): initial commit, about 1 year ago