Supplemental Material: Simple and Scalable Sparse k-means Clustering via Feature Ranking
Kenneth Lange, Jason Xu
Department of Statistical Science, Duke University
Neural Information Processing Systems
We restate the results and provide their proofs below. We will also require some additional notation.

Proposition 1. Each iteration of Algorithms 1 and 2 monotonically decreases the objective

h(C, θ) = Σ_{j=1}^{k} Σ_{i ∈ C_j} ‖x_i − θ_j‖².

Now we are ready to show that the objective function decreases under newly assigned sparse centers when the labels are held fixed. Hence we arrive at Equation (2). Together with Equation (1), we thus conclude that the objective function h(C, θ) monotonically decreases at each iteration.

Proposition 2. Assume that for any neighborhood N of Θ … N almost surely whenever n > M. Because we have assumed selection consistency, the dimension of Θ …

Sparse clustering test. For this experiment, we follow the simulation setup of Brodinová et al. [2].
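The monotone-decrease argument can be illustrated numerically. The sketch below, a minimal assumption-laden rendering rather than the paper's exact pseudocode, alternates a label assignment with a sparse center update: each center is set to its cluster mean and then projected onto the sparsity set by keeping its s largest-magnitude coordinates (one concrete feature-ranking rule) and zeroing the rest. All function names, the data-generating setup, and the choice s = 2 are illustrative.

```python
import numpy as np

def objective(X, labels, centers):
    # h(C, theta): within-cluster sum of squared distances.
    return sum(np.sum((X[labels == j] - centers[j]) ** 2)
               for j in range(len(centers)))

def sparse_kmeans_step(X, centers, s):
    # (1) Label step: assign each point to its nearest current center.
    d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    labels = d.argmin(1)
    # (2) Center step: cluster means, then projection onto the sparsity
    #     set. Keeping the s largest-magnitude coordinates of the mean
    #     minimizes the within-cluster sum of squares over s-sparse
    #     centers, so h cannot increase.
    new_centers = np.vstack([
        X[labels == j].mean(0) if np.any(labels == j) else centers[j]
        for j in range(len(centers))])
    for j in range(len(new_centers)):
        keep = np.argsort(np.abs(new_centers[j]))[-s:]
        mask = np.zeros(new_centers.shape[1], dtype=bool)
        mask[keep] = True
        new_centers[j, ~mask] = 0.0
    return labels, new_centers

# Toy data: two groups in 10 dimensions, only the first 2 features informative.
rng = np.random.default_rng(0)
shift = np.concatenate([3 * np.ones(2), np.zeros(8)])
X = np.vstack([rng.normal(size=(50, 10)) + shift,
               rng.normal(size=(50, 10))])
centers = X[rng.choice(len(X), 2, replace=False)].copy()

labels, centers = sparse_kmeans_step(X, centers, s=2)
before = objective(X, labels, centers)
labels2, centers2 = sparse_kmeans_step(X, centers, s=2)
after = objective(X, labels2, centers2)
print(after <= before + 1e-9)
```

Running further iterations keeps producing a nonincreasing sequence of objective values, which is the content of the monotonicity claim above.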