Reviews: Beyond Alternating Updates for Matrix Factorization with Inertial Bregman Proximal Gradient Algorithms
–Neural Information Processing Systems
The Bregman Proximal Gradient (BPG) method was introduced in general form, with convergence theory for nonconvex optimization, by Bolte et al. in [8], and an inertial version with convex-concave backtracking (CoCaIn BPG) was introduced by Mukkamala et al. in [45]. In Section 6.6 of that paper, Mukkamala et al. apply CoCaIn BPG to structured matrix factorization and report good performance, but leave the theory and computational efficiency open. BPG without inertia was applied successfully to symmetric non-negative matrix factorization (SNMF) by Dragomir et al. in [20], which in its most recent version appears to also cover more general (symmetric) matrix completion.

The submitted paper applies BPG and CoCaIn BPG to the non-symmetric matrix factorization problem, essentially picking up where [45] left off and providing work complementary to [20]. The paper first restates the results of [8, 45] for the specific matrix factorization objective (1.1), then makes its two primary contributions. First, it introduces a kernel generating distance function h appropriate for matrix factorization, related to the universal function of [20] but extended non-trivially to the non-symmetric case.
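For context, a BPG step replaces the Euclidean proximal step with a Bregman one: z⁺ = argmin ⟨∇f(zᵏ), z⟩ + (1/τ) D_h(z, zᵏ), i.e. ∇h(z⁺) = ∇h(zᵏ) − τ∇f(zᵏ). For kernels of the quartic-plus-quadratic type used in this line of work, h(U, V) = α/4 (‖U‖² + ‖V‖²)² + β/2 (‖U‖² + ‖V‖²), inverting ∇h reduces to a scalar cubic, so the update is closed-form up to a one-dimensional root-find. The sketch below illustrates one such step for f(U, V) = ½‖A − UVᵀ‖²_F; the constants α, β and the step size τ are illustrative assumptions, not the paper's tuned choices.

```python
import numpy as np

def bpg_step(U, V, A, alpha=3.0, beta=1.0, tau=0.2):
    """One BPG step for f(U, V) = 0.5 * ||A - U V^T||_F^2 with the
    (assumed) kernel h = alpha/4 * s^2 + beta/2 * s, s = ||U||^2 + ||V||^2."""
    R = U @ V.T - A                        # residual
    gU, gV = R @ V, R.T @ U                # gradients of f w.r.t. U and V
    s = np.sum(U * U) + np.sum(V * V)      # squared norm of z = (U, V)
    scale = alpha * s + beta               # grad h(z) = scale * z
    pU = scale * U - tau * gU              # p = grad h(z) - tau * grad f(z)
    pV = scale * V - tau * gV
    pnorm = np.sqrt(np.sum(pU * pU) + np.sum(pV * pV))
    # Invert grad h: find r = ||z_new|| solving alpha*r^3 + beta*r = ||p||.
    # The cubic is monotone increasing, so Newton from an upper bound converges.
    r = (pnorm / alpha) ** (1.0 / 3.0) + pnorm / beta
    for _ in range(50):
        r -= (alpha * r**3 + beta * r - pnorm) / (3.0 * alpha * r**2 + beta)
    c = 1.0 / (alpha * r**2 + beta)        # then z_new = p / (alpha*r^2 + beta)
    return c * pU, c * pV
```

Note the full-matrix norm coupling of U and V in h: both factors are updated jointly in one step, which is exactly the "beyond alternating updates" aspect the title refers to.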
Jan-26-2025, 18:26:28 GMT