Review for NeurIPS paper: DAGs with No Fears: A Closer Look at Continuous Optimization for Learning Bayesian Networks

Weaknesses: The main problem with the paper is that it fails to show the actual scope of the new results, especially within the broader context of Bayesian network (BN) learning. In fact, their method apparently can only be applied to the continuous case: no mention is ever made of whether it can also work with categorical variables. This is reflected in the selected set of "state-of-the-art" methods against which they compare, which is a narrow subset of the whole literature on BN learning. Saying something like "As mentioned, this paper is most closely related to the fully continuous framework of ..." is definitely not enough: a more precise and thorough description of the limitations of this work, and of its position within the BN learning literature as a whole, is needed. The title and the abstract should be modified along the same lines.