albertfares/DPO_MCQA_model_3_06_04_08
Pipeline: Text Generation
Format: Safetensors
Dataset: albertfares/MNLP_M3_dpo_dataset
Language: English
Tags: qwen3, Merge, sft, dpo, math, code, mcqa, mnlp-m3, conversational
License: apache-2.0
Commit History
Upload merged DPO + MCQA model · 43c63c5 (verified) · albertfares committed on Jun 3
initial commit · 4c44b57 (verified) · albertfares committed on Jun 3