treewidth

Neural Trees for Learning on Graphs

Neural Information Processing Systems

Graph Neural Networks (GNNs) have emerged as a flexible and powerful approach for learning over graphs. Despite this success, existing GNNs are constrained by their local message-passing architecture and are provably limited in their expressive power. In this work, we propose a new GNN architecture, the Neural Tree. The neural tree architecture does not perform message passing on the input graph, but on a tree-structured graph, called the H-tree, which is constructed from the input graph. Nodes in the H-tree correspond to subgraphs in the input graph, and they are organized hierarchically so that the parent of a node in the H-tree always corresponds to a larger subgraph in the input graph.
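The hierarchical aggregation described above can be illustrated with a minimal sketch. This is not the authors' implementation: the tree shape, node names, and the sum aggregator are illustrative stand-ins for the learned message functions of the actual Neural Tree, but they show how information flows from nodes representing small subgraphs up to nodes representing larger ones.

```python
# Illustrative sketch (NOT the Neural Tree implementation): one bottom-up
# message-passing sweep over a tree whose nodes stand for nested subgraphs
# of an input graph. A sum aggregator replaces the learned update functions.

def aggregate_up(tree, features, root):
    """Return the root's feature after a bottom-up pass: each node's
    aggregate is its own feature plus the sum of its children's aggregates.
    `tree` maps a node to the list of its children."""
    def visit(node):
        total = list(features[node])
        for child in tree.get(node, []):
            child_total = visit(child)
            total = [a + b for a, b in zip(total, child_total)]
        return total
    return visit(root)

# Toy hierarchy: the root covers the whole graph, internal nodes cover
# subgraphs, and leaves carry the initial node features.
tree = {"G": ["s1", "s2"], "s1": ["v1", "v2"], "s2": ["v3"]}
features = {"G": [0.0], "s1": [0.0], "s2": [0.0],
            "v1": [1.0], "v2": [2.0], "v3": [3.0]}
print(aggregate_up(tree, features, "G"))  # -> [6.0]
```

In the real architecture each parent node would apply a learned transformation to its children's messages rather than a plain sum, but the parent-covers-a-larger-subgraph invariant is the same.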


Learning Treewidth-Bounded Bayesian Networks with Thousands of Variables

Mauro Scanagatta, Giorgio Corani, Cassio P. de Campos, Marco Zaffalon

Neural Information Processing Systems

Parviainen et al. (2014) adopted an anytime integer linear programming (ILP) approach; if stopped before reaching the optimum, it returns a sub-optimal DAG with bounded treewidth. Nie et al. (2014) proposed an efficient anytime ILP approach with a polynomial number of constraints. Nie et al. (2015) proposed the method S2.