A Diffusion Model Framework for Unsupervised Neural Combinatorial Optimization
Sebastian Sanokowski, Sepp Hochreiter, Sebastian Lehner
Learning to sample from intractable distributions over discrete sets without relying on corresponding training data is a central problem in a wide range of fields, including Combinatorial Optimization. Currently, popular deep learning-based approaches rely primarily on generative models that yield exact sample likelihoods. This work introduces a method that lifts this restriction and opens the possibility of employing highly expressive latent variable models such as diffusion models. Our approach is conceptually based on a loss that upper bounds the reverse Kullback-Leibler divergence and evades the requirement of exact sample likelihoods. We experimentally validate our approach in data-free Combinatorial Optimization and demonstrate that our method achieves a new state-of-the-art on a wide range of benchmark problems.
Jun-3-2024
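The core idea described in the abstract can be sketched as follows (a reconstruction from the abstract alone; the paper's exact notation and loss may differ). For a latent variable model such as a diffusion model with trajectory $x_{0:T}$, the marginal likelihood $q_\theta(x_0)$ of a generated sample is intractable, so the reverse KL divergence to a target $p(x_0)$ cannot be computed directly. However, the chain rule of the KL divergence yields an upper bound that involves only the tractable joint likelihood of the trajectory:

```latex
D_{\mathrm{KL}}\!\left(q_\theta(x_0)\,\|\,p(x_0)\right)
\;\le\;
D_{\mathrm{KL}}\!\left(q_\theta(x_{0:T})\,\|\,p(x_0)\,p(x_{1:T}\mid x_0)\right)
```

Here $p(x_0) \propto \exp\!\left(-\beta\, E(x_0)\right)$ would be a Boltzmann-style target encoding the combinatorial objective $E$, and $p(x_{1:T}\mid x_0)$ is a fixed noising process over the latents (both are assumptions for illustration, not taken from the paper). Minimizing the right-hand side trains the model toward the target distribution without ever evaluating the intractable marginal $q_\theta(x_0)$, which is what allows expressive latent variable models to be used in this data-free setting.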