A Diffusion Model Framework for Unsupervised Neural Combinatorial Optimization



[Submitted on 3 Jun 2024]

By Sebastian Sanokowski and 2 other authors

Abstract: Learning to sample from intractable distributions over discrete sets without relying on corresponding training data is a central problem in a wide range of fields, including Combinatorial Optimization. Currently, popular deep learning-based approaches rely primarily on generative models that yield exact sample likelihoods. This work introduces a method that lifts this restriction and opens the possibility to employ highly expressive latent variable models like diffusion models. Our approach is conceptually based on a loss that upper bounds the reverse Kullback-Leibler divergence and evades the requirement of exact sample likelihoods. We experimentally validate our approach in data-free Combinatorial Optimization and demonstrate that our method achieves a new state-of-the-art on a wide range of benchmark problems.
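As background for the restriction the abstract mentions: a reverse-KL objective E_q[log q(x) - log p(x)] can only be estimated directly when the model's sample likelihood q(x) is available in closed form. The sketch below (an illustration, not the paper's method) uses a mean-field Bernoulli model over binary variables and a toy Ising-style energy; all names and the energy function are assumptions for demonstration. Latent variable models such as diffusion models do not admit the exact `log_q` term computed here, which is exactly the requirement the paper's upper bound evades.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(s):
    # Toy Ising-style chain energy on spins s in {-1, +1}:
    # penalizes agreeing neighbors (illustrative choice only).
    return np.sum(s[..., :-1] * s[..., 1:], axis=-1)

def reverse_kl_estimate(theta, n_samples=2048, temperature=1.0):
    # Mean-field Bernoulli model q_theta with exact sample likelihoods.
    p = 1.0 / (1.0 + np.exp(-theta))                      # Bernoulli means
    x = (rng.random((n_samples, p.size)) < p).astype(float)
    log_q = np.sum(x * np.log(p) + (1 - x) * np.log1p(-p), axis=-1)
    # Monte Carlo estimate of E_q[log q(x) + E(x)/T], i.e. the reverse KL
    # to the Boltzmann target up to its (unknown) log-partition constant.
    return np.mean(log_q + energy(2.0 * x - 1.0) / temperature)

loss = reverse_kl_estimate(np.zeros(8))
print(loss)  # negative because the log-partition constant is dropped
```

Note that the estimate is only well-defined because `log_q` is exact here; swapping in a diffusion model breaks this computation, motivating the likelihood-free upper bound the paper proposes.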

Submission history

From: Sebastian Sanokowski [view email]
[v1]
Mon, 3 Jun 2024 17:55:02 UTC (3,702 KB)


