We-Want-GPU's Collections
  • SFT LLM
  • LLM Dataset
  • DPO LLM

DPO LLM

updated Dec 31, 2023

  • We-Want-GPU/Yi-Ko-6B-DPO-v2
    Text Generation • 6B • Updated Dec 27, 2023 • 4
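
Since the entry above points to a text-generation checkpoint on the Hub, a minimal loading sketch with the transformers library follows. The dtype, device placement, prompt, and sampling settings are illustrative assumptions, not details taken from this page; consult the model card for the expected prompt template.

    # Minimal sketch: loading the We-Want-GPU/Yi-Ko-6B-DPO-v2 checkpoint
    # with Hugging Face transformers. Dtype, device placement, prompt, and
    # generation settings below are assumptions, not from this page.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "We-Want-GPU/Yi-Ko-6B-DPO-v2"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # assumption: fp16 so a 6B model fits on a single GPU
        device_map="auto",          # requires the accelerate package
    )

    prompt = "Introduce yourself."  # placeholder prompt
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
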