Reviews: Densely Connected Attention Propagation for Reading Comprehension

Neural Information Processing Systems 

This paper proposes a new neural architecture for reading comprehension. Compared to many existing neural architectures, this model 1) densely connects all pairs of passage layers and question layers for encoding; 2) uses a component called Bidirectional Attention Connectors (BAC) to connect any P-layer and Q-layer, which employs a factorization machine (FM) layer on top of the commonly used bi-directional attention. The proposed architecture has been evaluated on four reading comprehension datasets and demonstrates strong empirical results. Overall, although the proposed ideas are potentially interesting, the presented results, in their current presentation format, are not convincing enough.
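To make the review concrete, the BAC component as I understand it can be sketched roughly as follows: compute a standard bi-directional attention between a passage layer and a question layer, then compress each token's aligned features through a factorization machine into scalar connector features. This is a minimal illustrative sketch, not the authors' exact formulation; the factor size `k`, the single FM unit per direction, and the random parameter initialization are all assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fm(z, w0, w, V):
    # Second-order factorization machine, one scalar per row of z:
    # w0 + <w, z> + sum_{i<j} <V_i, V_j> z_i z_j, using the standard
    # O(dk) identity 0.5 * ((zV)^2 - (z^2)(V^2)).sum().
    linear = w0 + z @ w
    inter = 0.5 * (((z @ V) ** 2).sum(-1) - ((z ** 2) @ (V ** 2)).sum(-1))
    return linear + inter

def bac_sketch(P, Q, rng):
    # P: [lp, d] passage layer; Q: [lq, d] question layer.
    E = P @ Q.T                        # affinity matrix [lp, lq]
    P2Q = softmax(E, axis=1) @ Q       # question-aware passage [lp, d]
    Q2P = softmax(E, axis=0).T @ P     # passage-aware question [lq, d]
    d = P.shape[1]
    k = 8                              # FM factor size (assumed)
    w0 = 0.0
    w = rng.standard_normal(2 * d) * 0.1
    V = rng.standard_normal((2 * d, k)) * 0.1
    # Concatenate each token with its attended counterpart, then
    # compress to a scalar connector feature via the FM.
    zp = np.concatenate([P, P2Q], axis=1)   # [lp, 2d]
    zq = np.concatenate([Q, Q2P], axis=1)   # [lq, 2d]
    return fm(zp, w0, w, V), fm(zq, w0, w, V)

rng = np.random.default_rng(0)
p_feat, q_feat = bac_sketch(rng.standard_normal((5, 16)),
                            rng.standard_normal((3, 16)), rng)
```

The appeal of such a connector, if I read the paper correctly, is that the FM compresses a 2d-dimensional aligned representation into a cheap scalar (or small vector), which is what makes densely connecting every P-layer/Q-layer pair tractable.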