Reshaped Wirtinger Flow for Solving Quadratic System of Equations
Neural Information Processing Systems
Our work is along the line of the Wirtinger flow (WF) approach Candès et al. [2015], which solves the problem by minimizing a nonconvex loss function via a gradient algorithm that can be shown to converge to a global optimum under a good initialization. In contrast to the smooth loss function used in WF, we adopt a nonsmooth but lower-order loss function, and design a gradient-like algorithm (referred to as reshaped-WF). We show that for random Gaussian measurements, reshaped-WF enjoys geometric convergence to a global optimum as long as the number m of measurements is on the order of O(n), where n is the dimension of the unknown x. This improves the sample complexity of WF and matches the sample complexity of truncated-WF Chen and Candes [2015], but without truncation at the gradient step. Furthermore, reshaped-WF has a lower computational cost per iteration than WF, and runs faster numerically than both WF and truncated-WF. Because reshaped-WF bypasses higher-order terms in the loss function and truncation in the gradient loop, its analysis is also simplified.
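The sketch below illustrates the reshaped-WF idea on real Gaussian measurements: gradient-like updates on the nonsmooth amplitude loss l(z) = (1/2m) * sum_i (|a_i^T z| - y_i)^2. The spectral-style initialization, the step size mu, and the problem sizes used here are illustrative assumptions rather than the exact choices made in the paper.

import numpy as np

def reshaped_wf(A, y, num_iters=500, mu=0.8):
    """Gradient-like descent on the amplitude loss
    l(z) = (1/2m) * sum_i (|a_i^T z| - y_i)^2  (illustrative sketch).

    A : (m, n) measurement matrix with rows a_i
    y : (m,)   amplitude measurements y_i = |a_i^T x|
    mu: step size (illustrative choice, not tuned as in the paper)
    """
    m, n = A.shape
    # Simple spectral-style initialization (assumption): scaled leading
    # eigenvector of (1/m) * sum_i y_i^2 a_i a_i^T; the paper specifies
    # its own initialization procedure.
    Y = (A.T * (y ** 2)) @ A / m
    _, vecs = np.linalg.eigh(Y)
    z = vecs[:, -1] * np.sqrt(np.mean(y ** 2))

    for _ in range(num_iters):
        Az = A @ z
        # Gradient-like direction of the nonsmooth loss, using sign(a_i^T z).
        grad = A.T @ (Az - y * np.sign(Az)) / m
        z = z - mu * grad
    return z

# Usage on synthetic real Gaussian data.
rng = np.random.default_rng(0)
n, m = 100, 800          # m on the order of n
x = rng.standard_normal(n)
A = rng.standard_normal((m, n))
y = np.abs(A @ x)
z = reshaped_wf(A, y)
# Recovery is only possible up to a global sign in the real case.
err = min(np.linalg.norm(z - x), np.linalg.norm(z + x)) / np.linalg.norm(x)
print(f"relative error: {err:.2e}")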