


Export Reviews, Discussions, Author Feedback and Meta-Reviews

Neural Information Processing Systems

This is a theory-heavy paper on structure learning of antiferromagnetic Ising models. It contains two main results. First, for the class of statistical algorithms introduced by Feldman et al., the authors prove a computational lower bound for learning general graphical models on p nodes with maximum degree d. Second, they show that a broad class of repelling models on general graphs can be learned by simple algorithms, even without the correlation decay property.
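For readers unfamiliar with the model class, the antiferromagnetic (repelling) Ising model referenced here is the standard pairwise graphical model over spins; the notation below is a generic sketch, not the paper's exact formulation:

```latex
p(x) \;\propto\; \exp\!\Big( \sum_{(i,j) \in E} \beta_{ij}\, x_i x_j \Big),
\qquad x \in \{-1,+1\}^p,\; \beta_{ij} < 0 .
```

The negative couplings $\beta_{ij}$ make neighboring spins prefer opposite values (hence "repelling"); structure learning means recovering the edge set $E$ from i.i.d. samples of $x$.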


Our GLN provides a general graphical model for the retrosynthesis problem, which is compatible with many reasonable …

Neural Information Processing Systems

We thank the reviewers for their insightful comments, which we will incorporate into the revised version. We adopt s2v in our paper since it satisfies these requirements; we will elaborate on the details in our revision. The results are presented in Table 2. Despite the noisiness of the full …, our GLN could be further improved with better design choices. We emphasize that the proposed GLN is general enough to be compatible with other parametrizations.


Review for NeurIPS paper: Factor Graph Neural Networks

Neural Information Processing Systems

Weaknesses: The proposed architecture is not particularly novel, and the experiments can be improved. While the theoretical analysis is quite interesting, it is not significant enough to outweigh the aforementioned issues (e.g., the analysis mainly relies on Lemma 1 of Kohli et al.). While the proposed factor graph neural network (FGNN) is guaranteed to express a family of higher-order interactions, in the end FGNN is an instance of an MPNN applied to a heterogeneous graph with two types of vertices (random variables and factors). I also think the considered experiments are limited, since they only cover the case where (1) training and evaluation are done on the same graph and (2) factors are easily expressed as representations of fixed dimension. In other words, the considered experiments are not very convincing in showing that the proposed FGNN works across general graphical models.
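The review's reduction of FGNN to "an MPNN on a bipartite variable/factor graph" can be made concrete with a minimal sketch. All names, dimensions, and the ReLU-sum message rule below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bipartite factor graph: variable nodes <-> factor nodes.
n_vars, n_factors, dim = 4, 2, 8
edges = [(0, 0), (1, 0), (1, 1), (2, 1), (3, 1)]  # (variable, factor) pairs

h_var = rng.normal(size=(n_vars, dim))             # variable-node features
h_fac = rng.normal(size=(n_factors, dim))          # factor-node features
W_vf = rng.normal(size=(dim, dim)) / np.sqrt(dim)  # variable -> factor weights
W_fv = rng.normal(size=(dim, dim)) / np.sqrt(dim)  # factor -> variable weights

def mp_round(h_var, h_fac):
    """One message-passing round: variables -> factors, then factors -> variables."""
    new_fac = h_fac.copy()
    for v, f in edges:                      # aggregate variable messages at factors
        new_fac[f] += np.maximum(h_var[v] @ W_vf, 0.0)
    new_var = h_var.copy()
    for v, f in edges:                      # aggregate factor messages at variables
        new_var[v] += np.maximum(new_fac[f] @ W_fv, 0.0)
    return new_var, new_fac

h_var, h_fac = mp_round(h_var, h_fac)
print(h_var.shape, h_fac.shape)  # (4, 8) (2, 8)
```

Because factors are just a second vertex type, any generic heterogeneous-graph MPNN framework can express this update, which is the substance of the reviewer's novelty concern.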


Efficient Structured Prediction with Latent Variables for General Graphical Models

Schwing, Alexander, Hazan, Tamir, Pollefeys, Marc, Urtasun, Raquel

arXiv.org Machine Learning

In this paper we propose a unified framework for structured prediction with latent variables which includes hidden conditional random fields and latent structured support vector machines as special cases. We describe a local entropy approximation for this general formulation using duality, and derive an efficient message passing algorithm that is guaranteed to converge. We demonstrate its effectiveness in the tasks of image segmentation as well as 3D indoor scene understanding from single images, showing that our approach is superior to latent structured support vector machines and hidden conditional random fields.
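One standard way such a unification is written is as a temperature-smoothed objective; the sketch below (with illustrative symbols $\Phi$, $\ell$, $\epsilon$, $C$ that may differ from the paper's notation) recovers the latent structured SVM as $\epsilon \to 0$ and a hidden-CRF-style log-loss at $\epsilon = 1$:

```latex
\min_{w}\; \frac{C}{2}\|w\|^2
+ \sum_{i} \Big[
  \epsilon \ln \sum_{y,h} \exp\!\Big( \tfrac{w^\top \Phi(x_i, y, h) + \ell(y, y_i)}{\epsilon} \Big)
  \;-\;
  \epsilon \ln \sum_{h} \exp\!\Big( \tfrac{w^\top \Phi(x_i, y_i, h)}{\epsilon} \Big)
\Big]
```

The inner log-sum-exp terms are what the abstract's local entropy approximation and convergent message-passing algorithm are designed to handle efficiently.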