
Neural Information Processing Systems

Our approach is purely based on 2D convolutions. Nevertheless, it outperforms or performs comparably to many more costly 3D models. We thank the reviewers for pointing out some related (or missing) references. The Timeception layers involve group convolutions at different time scales, while our TAM layers only use depthwise convolution. As a result, Timeception has significantly more parameters than the TAM (10% vs. 0.1% of the total model parameters).
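To illustrate why depthwise convolution is so much leaner than group convolution, here is a minimal parameter-count sketch. The channel count, kernel size, and number of groups below are illustrative assumptions, not the exact settings of either paper:

```python
def conv1d_weight_count(in_ch, out_ch, kernel, groups):
    # A 1D convolution stores out_ch * (in_ch // groups) * kernel weights
    # (bias terms omitted for simplicity).
    return out_ch * (in_ch // groups) * kernel

C, k = 512, 3  # illustrative channel count and temporal kernel size
depthwise = conv1d_weight_count(C, C, k, groups=C)  # TAM-style: one filter per channel
grouped = conv1d_weight_count(C, C, k, groups=8)    # Timeception-style group convolution
print(depthwise, grouped)  # 1536 98304
```

With these assumed settings the group convolution carries 64x the weights of the depthwise one, which is the qualitative gap the rebuttal points to.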






A Hand-Crafted Example

The code for our experiments is available at https://github.com/AndyShih12/HyperSPN. To examine the merits of HyperSPNs as discussed in Section 3, we construct a hand-crafted dataset to test the three types of models described in Figure 4: SPN-Large, SPN-Small, and HyperSPN. The hand-crafted dataset is procedurally generated with 256 binary variables and 10,000 instances, split into train/valid/test sets at 70/10/20%. The generation procedure is designed so that the correlation between variables i and j depends on the path length between leaves i and j of a complete binary tree over the 256 variables. The exact details can be found in our code.
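As a rough sketch of the tree-distance idea (the authors' exact generation procedure lives in their code; the decay function below is our own assumption, not theirs), the path length between two leaves of a complete binary tree can be computed from heap-style node indices:

```python
def leaf_path_length(i, j, n_leaves=256):
    """Number of edges on the path between leaves i and j of a
    complete binary tree with n_leaves leaves (heap node indexing)."""
    a, b = n_leaves + i, n_leaves + j
    dist = 0
    while a != b:
        if a > b:
            a //= 2  # walk the larger index up one level
        else:
            b //= 2
        dist += 1
    return dist

# Adjacent leaves share a parent (2 edges); the outermost leaves are 16 edges apart.
print(leaf_path_length(0, 1), leaf_path_length(0, 255))  # → 2 16

# A hypothetical correlation that decays with tree distance, so that nearby
# leaves are strongly correlated and distant ones nearly independent:
corr = lambda i, j: 0.9 ** leaf_path_length(i, j)
```

This captures the stated property, that correlation between variables i and j is a function of their leaf-to-leaf path length, under an assumed exponential decay.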