Accelerated Stochastic Greedy Coordinate Descent by Soft Thresholding Projection onto Simplex

Neural Information Processing Systems 

…an algorithm named "SOft ThreshOlding PrOjection (SOTOPO)" is proposed to exactly solve a […]. To further improve the convergence rate and reduce the per-iteration cost, two important strategies from first-order methods are used: Nesterov's acceleration and stochastic optimization. Nesterov's acceleration refers to techniques that use an algebraic trick to accelerate first-order algorithms, while stochastic optimization refers to methods that sample one training example at each iteration instead of using the full training set.

This work is supported by the National Natural Science Foundation of China under grant Nos. […].
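The two ingredients named above — soft-thresholding (the proximal operator of the ℓ1-norm) and Nesterov's momentum — can be illustrated with a minimal, standard sketch. The code below implements FISTA-style accelerated proximal gradient descent on a small lasso problem; it is a generic textbook routine, not the paper's ASGCD/SOTOPO algorithm, and the function names and problem sizes are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    # Elementwise soft-thresholding: prox of lam * ||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def fista_lasso(A, b, lam, n_iter=200):
    # Nesterov-accelerated proximal gradient (FISTA) for
    #   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    # This is the classic accelerated scheme, shown for illustration only.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    y = x.copy()                           # extrapolated ("momentum") point
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)           # gradient at the extrapolated point
        x_next = soft_threshold(y - grad / L, lam / L)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Nesterov's "algebraic trick": extrapolate using the previous iterate.
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    x_true = np.array([1.0, 0.0, 0.0, -2.0, 0.0])   # sparse ground truth
    b = A @ x_true
    x_hat = fista_lasso(A, b, lam=0.1, n_iter=500)
    print(np.round(x_hat, 3))
```

A stochastic variant, as described in the abstract, would replace the full gradient `A.T @ (A @ y - b)` with a gradient computed from a single sampled row of `A` at each iteration.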
