Reviews: RSN: Randomized Subspace Newton
–Neural Information Processing Systems
The paper introduces a new family of randomized Newton methods, based on a prototypical Hessian sketching scheme that reduces memory and arithmetic costs. The idea of using a randomized sketch of the Hessian is not new; however, the paper extends the known results in several ways: the proposed method achieves a linear convergence rate under the relative smoothness and relative convexity assumptions, while remaining scale-invariant, and these results recover the known guarantees for Newton's method as a special case. The related work is adequately cited, and similar approaches from the existing literature, together with their weaknesses, are discussed in a short but concise passage.
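To make the sketching idea concrete, here is a minimal NumPy sketch of one randomized subspace Newton iteration, assuming a Gaussian sketch matrix and a known relative-smoothness constant `L_hat` (both choices, and the helper names `rsn_step`, `grad`, `hess`, are illustrative, not the paper's exact implementation): the full Hessian is never inverted; only a small `s x s` sketched system is solved, and the step stays in the range of the sketch.

```python
import numpy as np

def rsn_step(grad, hess, x, sketch_dim, L_hat=1.0, rng=None):
    """One randomized subspace Newton step (illustrative sketch).

    Projects the Hessian onto a random `sketch_dim`-dimensional
    subspace via a Gaussian sketch S, solves the small sketched
    Newton system, and steps within range(S), damped by the
    relative-smoothness constant `L_hat` (assumed known here).
    """
    rng = np.random.default_rng(rng)
    n = x.size
    S = rng.standard_normal((n, sketch_dim))  # Gaussian sketch matrix
    g = grad(x)
    HS = hess(x) @ S                 # n x s: only s Hessian-vector products
    SHS = S.T @ HS                   # s x s sketched Hessian
    lam = np.linalg.pinv(SHS) @ (S.T @ g)   # solve the small system
    return x - (1.0 / L_hat) * (S @ lam)    # update stays in range(S)

# Usage: minimize a well-conditioned convex quadratic
# f(x) = 0.5 x^T A x - b^T x, whose minimizer solves A x = b.
rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n))
A = A @ A.T / n + np.eye(n)          # SPD, moderate condition number
b = rng.standard_normal(n)
grad = lambda x: A @ x - b
hess = lambda x: A
x = np.zeros(n)
for _ in range(200):
    x = rsn_step(grad, hess, x, sketch_dim=10, rng=rng)
```

For a quadratic, the second-order model is exact, so `L_hat = 1.0` suffices and the gradient norm contracts linearly in expectation at a rate governed by the sketch dimension and the Hessian spectrum.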