albertfares/DPO_MCQA_model
Pipeline: Text Generation
Format: Safetensors
Dataset: albertfares/MNLP_M3_dpo_dataset
Language: English
Tags: qwen3, merge, sft, dpo, math, code, mcqa, mnlp-m3, conversational
License: apache-2.0
Commit history:
8efeb06 (verified): Upload merged DPO + MCQA model
Committed by albertfares on Jun 2