Review for NeurIPS paper: Regret Bounds without Lipschitz Continuity: Online Learning with Relative-Lipschitz Losses


First, the main class of losses that the paper introduces, that of relative Lipschitz continuity, appears closely related to the notion of Riemann-Lipschitz continuity (RLC) of Antonakopoulos et al. In particular, given that the losses are (RLC), one can recover relative Lipschitz continuity via a direct combination of convexity and the Cauchy-Schwarz inequality. Moreover, conversely, every relative Lipschitz continuous loss can be seen as (RLC) if one chooses the respective Riemannian metric accordingly; this becomes even more evident for the example that the paper presents: if $f(x) = x^2$ for $x \in \mathbb{R}$, then one can straightforwardly choose the Riemannian metric in such a manner that the respective dual norm is $\|v\|_{x,\ast} = |v|/|x|$, and (RLC) follows (a worked sketch is given at the end of this review). That said, this significantly weakens the contributions concerning FTRL and the like, since in Antonakopoulos et al. similar regret guarantees were already established under (RLC).

On the other hand, concerning the most intriguing part, that of establishing logarithmic regret in the case where the loss functions are in addition relatively strongly convex, there is no obvious way to establish any relevant examples that simultaneously satisfy relative Lipschitz continuity and relative strong convexity, besides of course the Euclidean ones.
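To make the claimed correspondence concrete, the following is a minimal sketch of both directions. It assumes (RLC) is taken to mean that the Riemannian dual norm of the gradient is uniformly bounded, $\|\nabla f(x)\|_{x,\ast} \le L$, and writes $\|\cdot\|_x$ for the induced local primal norm; this notation is my own shorthand, not taken from the paper under review.

% (RLC) => relative Lipschitz continuity, by convexity and Cauchy-Schwarz:
f(x) - f(y) \le \langle \nabla f(x),\, x - y \rangle
            \le \|\nabla f(x)\|_{x,\ast}\, \|x - y\|_{x}
            \le L\, \|x - y\|_{x}.

% Conversely, for the paper's example f(x) = x^2 on \mathbb{R}, where
% \nabla f(x) = 2x, choosing the metric so that \|v\|_{x,\ast} = |v|/|x| gives
\|\nabla f(x)\|_{x,\ast} = \frac{|2x|}{|x|} = 2,
% i.e. (RLC) holds with the constant L = 2.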