Generalized Mixability via Entropic Duality (Conference Proceeding)



  • Mixability is a property of a loss which characterizes when fast convergence is possible in the game of prediction with expert advice. We show that a key property of mixability generalizes, and that the exp and log operations present in the usual theory are not as special as one might have thought. In doing this we introduce a more general notion of $\Phi$-mixability, where $\Phi$ is a general entropy (i.e., any convex function on probabilities). We show how a property shared by the convex dual of any such entropy yields a natural algorithm (the minimizer of a regret bound) which, analogous to the classical aggregating algorithm, is guaranteed a constant regret when used with $\Phi$-mixable losses. We characterize precisely which $\Phi$ have $\Phi$-mixable losses and put forward a number of conjectures about the optimality of, and relationships between, different choices of entropy.
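    The abstract refers to the classical aggregating algorithm, which the paper's $\Phi$-mixability framework generalizes (the classical case corresponds to the Shannon entropy). As a minimal illustrative sketch only — not the paper's generalized algorithm — here is the standard exponential-weights aggregating algorithm for log loss, which is 1-mixable, so the algorithm's cumulative loss exceeds the best expert's by at most $\ln N$ (constant regret). The function name and data layout are illustrative choices, not from the paper.

    ```python
    import math

    def aggregating_algorithm(expert_preds, outcomes, eta=1.0):
        """Classical aggregating algorithm for binary log loss (1-mixable).

        expert_preds: per-round lists of each expert's probability of outcome 1.
        outcomes: per-round outcomes in {0, 1}.
        Returns the algorithm's cumulative log loss.
        """
        n = len(expert_preds[0])
        w = [1.0 / n] * n  # uniform prior over experts
        total_loss = 0.0
        for preds, y in zip(expert_preds, outcomes):
            # For log loss, the weighted mean attains the mixability bound.
            p = sum(wi * pi for wi, pi in zip(w, preds))
            total_loss += -math.log(p if y == 1 else 1.0 - p)
            # Exponential-weights update: w_i proportional to w_i * exp(-eta * loss_i).
            losses = [-math.log(pi if y == 1 else 1.0 - pi) for pi in preds]
            w = [wi * math.exp(-eta * li) for wi, li in zip(w, losses)]
            z = sum(w)
            w = [wi / z for wi in w]
        return total_loss
    ```

    The paper replaces the exp/log pair in this update with the gradient maps of a general convex entropy $\Phi$ and its dual, recovering this scheme when $\Phi$ is the (negative) Shannon entropy.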

publication date

  • January 1, 2015

Date in CU Experts

  • January 31, 2016 4:48 AM

Full Author List

  • Reid MD; Frongillo RM; Williamson RC; Mehta N

author count

  • 4