Reviews: Convolutional Neural Fabrics

Neural Information Processing Systems 

The idea is interesting and timely: eliminating some of the many hyperparameters that must be tuned in CNN design is a welcome development with potentially high impact. I like how Figure 5 demonstrates that the learning process is indeed capable of configuring the trellis as needed. It is somewhat unfortunate that all experiments in the paper are conducted with fabrics that have a constant number of channels per scale: this goes against common practice in CNN design and wastes capacity, since the size of the representation shrinks as the level of abstraction increases. In standard architectures this shrinkage is counteracted by using more channels at higher abstraction levels. The paper states that experiments with channel doubling are ongoing, but these results should really be part of the paper, as they are much more relevant.
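To make the capacity argument concrete, the following sketch (not from the paper; the input resolution of 32, five scales, and base channel count of 64 are assumed purely for illustration) compares the per-scale representation size under a constant channel count, as in the paper's experiments, versus the common practice of doubling channels at each coarser scale:

```python
def representation_sizes(input_hw=32, num_scales=5, base_channels=64, double=False):
    """Return the H*W*C representation size at each scale.

    Spatial resolution halves at each coarser scale; channels either
    stay constant (as in the paper's experiments) or double
    (common practice, e.g. VGG/ResNet-style architectures).
    """
    sizes = []
    for s in range(num_scales):
        hw = input_hw // (2 ** s)                      # side length at scale s
        c = base_channels * (2 ** s) if double else base_channels
        sizes.append(hw * hw * c)
    return sizes

constant = representation_sizes(double=False)  # shrinks 4x per scale
doubled = representation_sizes(double=True)    # shrinks only 2x per scale
```

With constant channels the representation size drops by a factor of four at every coarser scale, whereas doubling the channels halves that shrinkage, which is exactly why the channel-doubling experiments would be the more informative ones to report.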