mika5883's Collections
interesting

updated May 3
  • DPO Meets PPO: Reinforced Token Optimization for RLHF

    Paper • 2404.18922 • Published Apr 29, 2024 • 1