Reviews: Connectionist Temporal Classification with Maximum Entropy Regularization

Neural Information Processing Systems 

This work presents a method for end-to-end sequence learning, specifically within the framework of Connectionist Temporal Classification (CTC). The paper has two main contributions:

- The first is a regularization of the CTC training objective aimed at reducing the over-confidence of the model. To achieve this, the authors propose a method based on conditional entropy; more specifically, the proposed regularization encourages the model to explore paths that are close to the dominant one. To make this tractable, they assume that consecutive elements of a sequence are equally spaced.
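To illustrate the idea of penalizing over-confidence, here is a minimal sketch of an entropy-regularized loss. It uses the per-frame Shannon entropy of the output posteriors as a simplified stand-in for the paper's conditional entropy over CTC alignment paths (which would require a forward-backward computation); the function name, `beta` weight, and shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(logits, axis=-1):
    # numerically stable softmax over the label dimension
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def entropy_regularized_loss(logits, ctc_loss, beta=0.1):
    # logits: (T, V) per-frame scores over the vocabulary
    probs = softmax(logits)
    # mean per-frame Shannon entropy -- a simplified proxy for the
    # conditional entropy over CTC paths used in the paper
    H = -np.sum(probs * np.log(probs + 1e-12), axis=-1).mean()
    # subtracting the entropy term penalizes peaky, over-confident
    # distributions and rewards spreading mass over nearby paths
    return ctc_loss - beta * H
```

Under this sketch, a model producing near-uniform posteriors incurs a lower regularized loss than an over-confident (peaky) one for the same CTC loss value, which is the qualitative effect the review describes.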