samitizerxu/Deepseek-R1-Distil-7B-Qwen-DPO-keep-v2
Safetensors · qwen2
Deepseek-R1-Distil-7B-Qwen-DPO-keep-v2 / model.safetensors.index.json (branch: main)
Commit History
Upload folder using huggingface_hub · c9d144a (verified) · samitizerxu committed on Mar 30
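The `model.safetensors.index.json` file above is the shard index for a checkpoint split across multiple `.safetensors` files: it records the total checkpoint size and a `weight_map` from each tensor name to the shard file that stores it. A minimal sketch of reading such an index (the JSON below is a hypothetical two-shard example, not the actual contents of this repo's index):

```python
import json

# Hypothetical index content; a real index lists every tensor in the checkpoint.
index_json = """
{
  "metadata": {"total_size": 15231233024},
  "weight_map": {
    "model.embed_tokens.weight": "model-00001-of-00002.safetensors",
    "lm_head.weight": "model-00002-of-00002.safetensors"
  }
}
"""

index = json.loads(index_json)

# Invert the weight_map: group tensor names by the shard file holding them,
# which tells a loader exactly which files it must open for which weights.
shards = {}
for tensor_name, shard_file in index["weight_map"].items():
    shards.setdefault(shard_file, []).append(tensor_name)

print(index["metadata"]["total_size"])
for shard_file in sorted(shards):
    print(shard_file, shards[shard_file])
```

Libraries such as `transformers` consume this index automatically when loading a sharded checkpoint, so the file rarely needs to be parsed by hand.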