Qwen2.5-3B-Instruct-Turkish-DPO / dpo_metadata.json

Commit History

Upload optimized DPO model (step-400) to prevent overfitting 🚀
b8932b1
verified

yusufbaykaloglu committed on