
Clustering


Exact recovery and Bregman hard clustering of node-attributed Stochastic Block Model

Neural Information Processing Systems

However, in many scenarios, nodes also have attributes that are correlated with the clustering structure. Thus, network information (edges) and node information (attributes) can be jointly leveraged to design high-performance clustering algorithms. Under a general model for the network and node attributes, this work establishes an information-theoretic criterion for the exact recovery of community labels and characterizes a phase transition determined by the Chernoff-Hellinger divergence of the model.
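As a concrete illustration of the model class the abstract describes, the sketch below samples a toy node-attributed Stochastic Block Model: edges are Bernoulli with intra-/inter-community probabilities, and each node carries a Gaussian attribute whose mean depends on its community. All parameter names (`p_in`, `p_out`, `attr_sep`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sample_node_attributed_sbm(n=200, k=2, p_in=0.15, p_out=0.02,
                               attr_sep=2.0, seed=0):
    """Toy node-attributed SBM sampler (illustrative, not the paper's code)."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, size=n)            # ground-truth communities
    same = labels[:, None] == labels[None, :]      # same-community mask
    probs = np.where(same, p_in, p_out)            # edge probability per pair
    upper = np.triu(rng.random((n, n)) < probs, k=1)
    adj = (upper | upper.T).astype(int)            # symmetric, no self-loops
    # One-dimensional Gaussian attribute centred at attr_sep * label,
    # so attributes are correlated with the clustering structure
    attrs = attr_sep * labels + rng.standard_normal(n)
    return adj, attrs, labels

adj, attrs, labels = sample_node_attributed_sbm()
```

A clustering algorithm of the kind the paper studies would then use both `adj` (network information) and `attrs` (node information) to recover `labels`.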




Supplementary Material for Semantic Image Synthesis with Unconditional Generator JungWoo Chae

Neural Information Processing Systems

This process enables the value (feature maps) to be rearranged, through a weighted sum, to align with the form of the query, thereby reflecting their strong correspondence. The input noise is removed because its stochasticity slows down training. Given the need to balance high correspondence against image quality, we set the weights of our loss terms empirically. To demonstrate the influence of the additional losses introduced in our method, we provide quantitative and qualitative ablations in Figures S2 and S3, respectively. Nonetheless, caution is warranted when overly increasing the number of clusters.
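The "weighted sum that rearranges the value to align with the query" is the standard cross-attention operation; a minimal numpy sketch (illustrative only, not the paper's implementation) is:

```python
import numpy as np

def cross_attention(query, key, value):
    """Scaled dot-product cross-attention: softmax(Q K^T / sqrt(d)) V."""
    d = query.shape[-1]
    scores = query @ key.T / np.sqrt(d)            # (n_q, n_kv) similarities
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # rows sum to 1
    # Each output row is a weighted sum of value rows: the value features
    # are rearranged into the query's layout
    return weights @ value

rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))    # 4 query positions, dim 8
k = rng.standard_normal((16, 8))   # 16 key/value positions
v = rng.standard_normal((16, 8))
out = cross_attention(q, k, v)     # shape (4, 8): one row per query position
```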






Scalable Laplacian K-modes

Imtiaz Ziko, Eric Granger, Ismail Ben Ayed

Neural Information Processing Systems

Furthermore, we show that the density modes can be obtained as byproducts of the assignment variables via simple maximum-value operations whose additional computational cost is linear in the number of data points.
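The claimed linear-cost mode extraction reduces to one argmax over the assignment variables per cluster. A hypothetical sketch (variable names are assumptions, not the paper's notation):

```python
import numpy as np

def modes_from_assignments(points, z):
    """Read each cluster's mode off the assignment matrix z[i, c]
    (point i, cluster c) via a single maximum-value operation per
    cluster -- cost linear in the number of data points."""
    mode_idx = np.argmax(z, axis=0)    # best-assigned point per cluster
    return points[mode_idx]

rng = np.random.default_rng(0)
points = rng.standard_normal((100, 2))   # 100 data points in 2-D
z = rng.random((100, 3))                 # toy assignment scores, 3 clusters
modes = modes_from_assignments(points, z)  # shape (3, 2): one mode per cluster
```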