


How To Make the Gradients Small Stochastically: Even Faster Convex and Nonconvex SGD

Zeyuan Allen-Zhu

Neural Information Processing Systems

However, in terms of making the gradients small, the original SGD does not give an optimal rate, even when f(x) is convex. If f(x) is convex, to find a point with gradient norm at most ε, we design an algorithm SGD3 with a near-optimal rate Õ(ε^{-2}), improving the best known rate O(ε^{-8/3}) of [17].
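As a plain illustration of the quantity being bounded (not the SGD3 algorithm itself), the toy loop below runs vanilla SGD on a simple convex quadratic with noisy gradients and tracks the smallest true gradient norm seen. The objective, step size, noise level, and iteration count are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy convex objective f(x) = 0.5 * ||x - a||^2, whose true gradient at x
# is (x - a).  We feed SGD noisy gradients and record the smallest true
# gradient norm encountered.  This is vanilla SGD for illustration only,
# not the paper's SGD3.
a = rng.normal(size=10)

def stochastic_grad(x):
    return (x - a) + 0.01 * rng.normal(size=x.shape)

x = np.zeros(10)
eta = 0.1
best_grad_norm = float("inf")
for t in range(2000):
    x -= eta * stochastic_grad(x)
    best_grad_norm = min(best_grad_norm, np.linalg.norm(x - a))

print(best_grad_norm < 0.05)  # True: the gradient norm is driven small
```

Even this baseline drives the gradient norm down on a well-conditioned quadratic; the paper's point is about the optimal *rate* at which the norm shrinks, which the plain method does not achieve.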


f3d9de86462c28781cbe5c47ef22c3e5-Supplemental.pdf

Neural Information Processing Systems

The algorithm of [62] considers Algorithm 2 for the stochastic generalized linear bandit problem. Assume that θ is the true parameter of the reward model. Then we consider the lower bounds. For f_j(A) = (1/2)(e_{j1} e_{j2}^⊤ + e_{j2} e_{j1}^⊤) with j1 ≠ j2, f_j(A_i) is 1 only when i = j and 0 otherwise. With Claim D.12 and Claim D.11 we get that g ≤ C√ . To get 1), we write V_l = [v_1, …, v_l] ∈ R^{d×l} and V_l^⊥ = [v_{l+1}, …, v_k].
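The fragment above references an algorithm for generalized linear bandits. The sketch below is a hypothetical, linearized round-by-round loop with a UCB-style arm choice; the logistic link, arm set, horizon, exploration coefficient, and the ridge estimate are all assumptions for illustration, not a reproduction of the source's Algorithm 2:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical generalized-linear bandit loop with a UCB-style arm choice.
# theta_star plays the role of the true parameter θ of the reward model.
d, K, T = 3, 5, 500
theta_star = rng.normal(size=d)
theta_star /= np.linalg.norm(theta_star)
arms = rng.normal(size=(K, d))   # fixed finite arm set (assumption)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

V = np.eye(d)          # regularized design matrix
b = np.zeros(d)        # running sum of reward-weighted contexts
for t in range(T):
    theta_hat = np.linalg.solve(V, b)            # linearized ridge estimate
    V_inv = np.linalg.inv(V)
    # optimism bonus sqrt(x^T V^{-1} x) per arm
    bonus = np.sqrt(np.einsum("ki,ij,kj->k", arms, V_inv, arms))
    x = arms[np.argmax(arms @ theta_hat + 0.5 * bonus)]
    r = rng.binomial(1, sigmoid(x @ theta_star))  # Bernoulli GLM reward
    V += np.outer(x, x)
    b += x * r
```

The design matrix V grows along the chosen directions, so the bonus shrinks exactly where the estimate is already well determined; that trade-off is what the lower bounds in the fragment are measuring.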


And for any α > 0, the Laplacian satisfies ΔG_λ ≤ (c_1 + 2α) G_λ + 2Md, where M = c_2^3 α/c_1 + c_2^2.
2. If G(x) = ‖g(x)‖² is C¹, then ‖∇G(x)‖ ≤ c_2 ‖g(x)‖ and g(x)^⊤ ∇G(x) ≥ c_1 G(x).
Proof. Claim 1. Note that ∇G_λ = 2 ∇²F_λ g_λ and that (1/2) c_1 I ⪯ ∇²F_λ = Σ^m …

Neural Information Processing Systems

As a remark, using the Krylov–Bogoliubov existence theorem (see Corollary 11.8 of [6]), fixed points of (4) exist as long as one can show that {ρ_t, t ≥ 0} is tight. The learning rate is set differently for each task. Obviously, the HV indicator (Eq. (10)) can also be used as an objective function for optimizing solution sets. For example, [25, 7] greedily add new points to obtain the highest expected HV improvement. However, the landscape of the HV indicator is piecewise constant (similar to the 0-1 loss in classification) and is difficult to optimize with gradient descent. In particular, for all the dominated points in the solution set, their gradient is zero.
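To make the zero-gradient claim concrete, the sketch below computes the 2-D hypervolume of a small minimization solution set with a simple sweep; the point values and reference point are made up for illustration. Adding a dominated point leaves the HV unchanged, which is why its gradient with respect to the HV objective is zero:

```python
# 2-D hypervolume (HV) of a minimization solution set w.r.t. a reference
# point: the area dominated by the set.  The points and reference are
# illustrative; this is a simple sweep, not an optimized HV algorithm.
def hv_2d(points, ref):
    # keep the non-dominated front, sorted by the first objective
    front, best_y = [], float("inf")
    for x, y in sorted(points):
        if y < best_y:
            front.append((x, y))
            best_y = y
    # sum the rectangular slices between consecutive front points
    area = 0.0
    for i, (x, y) in enumerate(front):
        next_x = front[i + 1][0] if i + 1 < len(front) else ref[0]
        area += (next_x - x) * (ref[1] - y)
    return area

A = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
dominated = (2.5, 2.5)                     # dominated by (2.0, 2.0)
print(hv_2d(A, (4.0, 4.0)))                # 6.0
print(hv_2d(A + [dominated], (4.0, 4.0)))  # 6.0 again: zero HV contribution
```

Perturbing the dominated point slightly also leaves the HV at 6.0, so its gradient is exactly zero, matching the piecewise-constant landscape described above.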



210b7ec74fc9cec6fb8388dbbdaf23f7-Supplemental.pdf

Neural Information Processing Systems

Let Z be the set of all indices ℓ ∈ [n] such that α[ℓ] ≠ 0. Let Z̄ be the set of all indices ℓ ∈ Z such that the ℓ-th variable is constrained to be integral.
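A tiny illustration of the two index sets; the vector α, the set of integrality constraints, and the variable names below are all hypothetical:

```python
# Hypothetical example: Z is the support of alpha, and Z_bar keeps the
# indices of Z whose variables are constrained to be integral.
alpha = [0.0, 1.5, 0.0, 2.0, -3.0]
integral_indices = {3, 4}            # assumed integrality constraints

Z = {i for i, a in enumerate(alpha) if a != 0}        # support: {1, 3, 4}
Z_bar = {i for i in Z if i in integral_indices}       # {3, 4}
print(Z, Z_bar)
```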