albertfares/DPO_MCQA_model
Pipeline: Text Generation
Format: Safetensors
Dataset: albertfares/MNLP_M3_dpo_dataset
Language: English
Tags: qwen3, Merge, sft, dpo, math, code, mcqa, mnlp-m3, conversational
License: apache-2.0
Branch: main
File: DPO_MCQA_model/merges.txt
Latest commit: albertfares — "Upload merged DPO + MCQA model" — 8efeb06 (verified), 17 days ago
Size: 1.67 MB (scanned: Safe)
File too large to display; see the raw version instead.