Appendix A: Expanded Related Work

Neural Information Processing Systems 

In the following, we review related work on graph kernels, GNNs, and their theory. More recently, developments in graph kernels have emphasized scalability. Notable instances of the GNN architecture include, e.g., […]. Sato et al. studied the limits of GNNs when applied to combinatorial problems. Finally, there is a recent line of work extending GNNs to hypergraphs; see, e.g., […].

We briefly describe the Weisfeiler-Leman algorithm and, along the way, introduce our notation. Let k be a fixed positive integer. The successive refinement steps are also called rounds or iterations.
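To make the refinement rounds concrete, the following is a minimal sketch of the algorithm in the special case k = 1 (color refinement), assuming graphs are given as adjacency lists; the function name `wl_refinement` and the integer recoloring scheme are illustrative choices, not the paper's notation.

```python
from collections import defaultdict

def wl_refinement(adj, labels, rounds=3):
    """Sketch of 1-dimensional Weisfeiler-Leman (color refinement).

    adj: dict mapping each vertex to a list of its neighbours.
    labels: dict mapping each vertex to an initial colour.
    Returns the colouring after the given number of rounds.
    """
    colors = dict(labels)
    for _ in range(rounds):
        signatures = {}
        for v in adj:
            # New signature: current colour plus the (sorted) multiset
            # of neighbour colours.
            signatures[v] = (colors[v],
                             tuple(sorted(colors[u] for u in adj[v])))
        # Compress signatures to small integers for the next round.
        palette = {s: i for i, s in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return colors
```

On a path graph 0–1–2 with uniform initial labels, one round already separates the middle vertex from the two endpoints, and the colouring is stable from then on; this stabilization is what the rounds (iterations) above converge to.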