Reviews: Block Coordinate Regularization by Denoising

Neural Information Processing Systems 

A recent trend in large-scale optimization, especially in the machine learning community, has been to replace full-gradient-based algorithms with their coordinate descent counterparts. The idea is to reduce the computational cost of each iteration while enjoying a similar rate of convergence. Often, the maximum a posteriori solution (estimated with a proximal algorithm) is hard to compute exactly when the prior is not directly available. In that case, the proximal step is replaced by a "denoised" step, in which the proximal operator of the prior is replaced by another adequate denoising operator. Such an algorithm is then based on full-vector updates, just like vanilla (proximal) gradient descent.
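To make the summarized scheme concrete, here is a minimal sketch of a "denoised" proximal gradient iteration: a gradient step on the data-fidelity term followed by a denoising step in place of the prior's proximal operator. The specific choices here (a least-squares data term, soft-thresholding as the stand-in denoiser, and all problem dimensions) are illustrative assumptions, not the paper's actual setup; in plug-and-play-style methods the denoiser could be any off-the-shelf operator.

```python
import numpy as np

def denoise(x, strength=0.01):
    # Stand-in denoiser (assumption): soft-thresholding, which happens to
    # be the proximal operator of an L1 prior. In general, the denoiser
    # need not correspond to the prox of any explicit prior.
    return np.sign(x) * np.maximum(np.abs(x) - strength, 0.0)

def denoised_gradient_descent(A, y, step, n_iter=200):
    """Full-vector update: gradient step on 0.5*||Ax - y||^2,
    then a denoising step replacing the proximal operator."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)        # gradient of the data-fidelity term
        x = denoise(x - step * grad)    # denoiser in place of the prox
    return x

# Toy sparse-recovery instance (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 1.5]
y = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the quadratic term
x_hat = denoised_gradient_descent(A, y, step)
```

A block coordinate variant, as the paper studies, would update only a subset of the entries of `x` per iteration instead of the full vector.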