Review for NeurIPS paper: Efficient Generation of Structured Objects with Constrained Adversarial Networks
Weaknesses: - The method section does not look self-contained and lacks descriptions of some key components. In particular: * What is Eq.(9) for? Why is "the SL is the negative logarithm of a polynomial in \theta" -- where is the "negative logarithm" in Eq.(9)? Its practical implementation seems to be discussed in the "Evaluating the Semantic Loss" part (L.140), which involves the Weighted Model Count (WMC) and knowledge compilation (KC). However, no details about KC are presented.
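[For reference, a minimal sketch of how the semantic loss relates to the WMC, assuming the standard Xu et al. formulation the paper builds on: the WMC sums the generator's Bernoulli probability mass over all assignments satisfying the constraint (a polynomial in the marginals), and the SL is its negative logarithm. This toy version enumerates assignments by brute force, which is exponential; the knowledge-compilation step the review asks about exists precisely to replace this enumeration with a compact circuit. Function names are illustrative, not from the paper.]

```python
import itertools
import math

def semantic_loss(constraint, probs):
    """Semantic loss: negative log of the weighted model count (WMC)
    of the constraint under independent Bernoulli marginals.

    constraint: function mapping a binary assignment (tuple of 0/1) -> bool
    probs: list of marginals P(x_i = 1) output by the generator
    """
    wmc = 0.0
    # Brute-force enumeration; knowledge compilation would replace this
    # loop with a single pass over a compiled arithmetic circuit.
    for x in itertools.product([0, 1], repeat=len(probs)):
        if constraint(x):  # sum only over satisfying assignments
            weight = 1.0
            for xi, pi in zip(x, probs):
                weight *= pi if xi else (1.0 - pi)
            wmc += weight
    return -math.log(wmc)  # the "negative logarithm" of the WMC polynomial

# Example constraint: exactly one of three binary variables is on.
exactly_one = lambda x: sum(x) == 1
loss = semantic_loss(exactly_one, [0.9, 0.05, 0.05])
```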
This work aims to estimate generative distributions over structured objects that satisfy certain semantic constraints (expressed in first-order logic). The authors achieve this by adding a "semantic loss" to the GAN's learning objective and using knowledge compilation (KC) to build a circuit that allows efficient evaluation. Experiments on game-level generation tasks and a molecule generation task support the proposed method. Strengths: i) incorporating structured constraints in GAN models is both intellectually and practically interesting; ii) the experiments are comprehensive and convincing in most cases; and iii) the paper is clearly written for the most part. The paper is recommended for acceptance.
Efficient Generation of Structured Objects with Constrained Adversarial Networks
Di Liello, Luca, Ardino, Pierfrancesco, Gobbi, Jacopo, Morettin, Paolo, Teso, Stefano, Passerini, Andrea
Generative Adversarial Networks (GANs) struggle to generate structured objects like molecules and game maps. The issue is that structured objects must satisfy hard requirements (e.g., molecules must be chemically valid) that are difficult to acquire from examples alone. As a remedy, we propose Constrained Adversarial Networks (CANs), an extension of GANs in which the constraints are embedded into the model during training. This is achieved by penalizing the generator proportionally to the mass it allocates to invalid structures. In contrast to other generative models, CANs support efficient inference of valid structures (with high probability) and allow the learned constraints to be turned on and off at inference time. CANs handle arbitrary logical constraints and leverage knowledge compilation techniques to efficiently evaluate the disagreement between the model and the constraints. Our setup is further extended to hybrid logical-neural constraints for capturing very complex constraints, like graph reachability. An extensive empirical analysis shows that CANs efficiently generate valid structures that are both high-quality and novel.
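[A minimal sketch of the objective the abstract describes, under simplifying assumptions: the generator's loss is the usual non-saturating GAN term plus a penalty that grows with the probability mass placed on invalid structures. All names and the scalar formulation are illustrative; the paper's actual objective operates on the generator's distribution via the semantic loss circuit.]

```python
import math

def generator_loss(disc_on_fakes, p_valid, lam=1.0):
    """Hypothetical CAN-style generator objective.

    disc_on_fakes: discriminator outputs D(G(z)) in (0, 1) for a batch
    p_valid: probability mass the generator assigns to valid structures
    lam: weight of the constraint penalty
    """
    # Non-saturating GAN term: push D(G(z)) toward 1.
    gan_term = -sum(math.log(d) for d in disc_on_fakes) / len(disc_on_fakes)
    # Semantic-loss penalty: -log P(constraint holds); zero when all
    # mass is on valid structures, growing as invalid mass increases.
    penalty = -math.log(p_valid)
    return gan_term + lam * penalty
```

Because the penalty is added to the training objective rather than applied as a rejection filter, the constraints are baked into the generator's weights, which is what allows valid structures to be sampled efficiently at inference time.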