Stochastic Variance-Reduced Iterative Hard Thresholding in Graph Sparsity Optimization
Derek Fox, Samuel Hernandez, Qianqian Tong
Stochastic optimization algorithms are widely used for large-scale data analysis due to their low per-iteration costs, but they often suffer from slow asymptotic convergence caused by inherent variance. Variance-reduction techniques have therefore been used to address this issue in structured sparse models based on sparsity-inducing norms or $\ell_0$-norms. However, these techniques are not directly applicable to complex (non-convex) graph sparsity models, which are essential in applications like disease outbreak monitoring and social network analysis. In this paper, we introduce two stochastic variance-reduced gradient-based methods to solve graph sparsity optimization: GraphSVRG-IHT and GraphSCSG-IHT. We provide a general framework for theoretical analysis, demonstrating that our methods enjoy linear convergence. Extensive experiments validate the effectiveness of our methods.
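To make the abstract's template concrete, below is a minimal pure-Python sketch of generic stochastic variance-reduced iterative hard thresholding (SVRG-IHT) for plain $\ell_0$-constrained least squares. This is an illustration of the general idea only, not the paper's GraphSVRG-IHT or GraphSCSG-IHT, which replace the plain top-$s$ projection with a projection onto graph-structured supports; all function and variable names here are hypothetical.

```python
import random

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x; zero out the rest."""
    keep = set(sorted(range(len(x)), key=lambda i: -abs(x[i]))[:s])
    return [xi if i in keep else 0.0 for i, xi in enumerate(x)]

def grad_i(A, b, x, i):
    """Gradient of the i-th squared-loss term 0.5 * (a_i . x - b_i)^2."""
    r = sum(a * xi for a, xi in zip(A[i], x)) - b[i]
    return [r * a for a in A[i]]

def svrg_iht(A, b, s, eta=0.1, outer=30, inner=20, seed=0):
    """SVRG-style IHT for min ||Ax - b||^2 subject to ||x||_0 <= s."""
    rng = random.Random(seed)
    n, d = len(A), len(A[0])
    x = [0.0] * d
    for _ in range(outer):
        snap = x[:]              # snapshot point for variance reduction
        full = [0.0] * d         # full gradient at the snapshot
        for i in range(n):
            for j, g in enumerate(grad_i(A, b, snap, i)):
                full[j] += g / n
        for _ in range(inner):
            i = rng.randrange(n)
            g = grad_i(A, b, x, i)
            gs = grad_i(A, b, snap, i)
            # variance-reduced estimate: grad_i(x) - grad_i(snap) + full grad
            v = [gi - gsi + fj for gi, gsi, fj in zip(g, gs, full)]
            # gradient step followed by hard-thresholding projection
            x = hard_threshold([xi - eta * vi for xi, vi in zip(x, v)], s)
    return x
```

The key point the abstract alludes to is visible in the inner loop: the control variate `grad_i(snap) - full` shrinks the variance of the stochastic gradient as `x` approaches `snap`, which is what enables linear convergence despite sampling only one term per step.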
Jul-23-2024