Simplified and Generalized Masked Diffusion for Discrete Data

By Jiaxin Shi and 4 other authors


Abstract: Masked (or absorbing) diffusion is actively explored as an alternative to autoregressive models for generative modeling of discrete data. However, existing work in this area has been hindered by unnecessarily complex model formulations and unclear relationships between different perspectives, leading to suboptimal parameterizations, training objectives, and ad hoc adjustments to counteract these issues. In this work, we aim to provide a simple and general framework that unlocks the full potential of masked diffusion models. We show that the continuous-time variational objective of masked diffusion models is a simple weighted integral of cross-entropy losses. Our framework also enables training generalized masked diffusion models with state-dependent masking schedules. When evaluated by perplexity, our models trained on OpenWebText surpass prior diffusion language models at GPT-2 scale and perform better on 4 out of 5 zero-shot language modeling tasks. Furthermore, our models vastly outperform previous discrete diffusion models on pixel-level image modeling, achieving 2.75 (CIFAR-10) and 3.40 (ImageNet 64×64) bits per dimension, better than autoregressive models of similar size. Our code is available at this https URL.
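The "weighted integral of cross-entropy losses" view admits a very short Monte Carlo estimator. The sketch below is a minimal illustration, not the paper's implementation: it assumes the linear masking schedule α_t = 1 − t (under which the weight α'_t / (1 − α_t) reduces to −1/t), and all names (`masked_diffusion_loss`, `logits_fn`, `MASK`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

V = 8          # vocabulary size (excluding the mask token)
MASK = V       # index of the absorbing mask state
L = 16         # sequence length

def masked_diffusion_loss(x, logits_fn, n_mc=256):
    """Monte Carlo estimate of the continuous-time masked-diffusion
    objective: a weighted cross-entropy over masked positions, with
    the linear schedule alpha_t = 1 - t, so the weight is 1/t."""
    total = 0.0
    for _ in range(n_mc):
        # Sample a diffusion time, clipped away from 0 to control
        # the variance of this toy estimator.
        t = rng.uniform(0.05, 1.0)
        # Forward process: each token is masked with probability 1 - alpha_t = t.
        mask = rng.uniform(size=L) < t
        z = np.where(mask, MASK, x)          # corrupted sequence
        logits = logits_fn(z)                # (L, V) predictions of clean tokens
        logp = logits - np.logaddexp.reduce(logits, axis=-1, keepdims=True)
        ce = -logp[np.arange(L), x]          # per-position cross-entropy
        total += (ce * mask).sum() / t       # masked positions only, weight 1/t
    return total / n_mc

x = rng.integers(0, V, size=L)
uniform_model = lambda z: np.zeros((L, V))   # placeholder for a trained network
loss = masked_diffusion_loss(x, uniform_model)
```

For a uniform predictor the estimate concentrates near L·log V nats, which is a quick sanity check that the 1/t weighting leaves the objective unbiased in t.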

Submission history

From: Jiaxin Shi
[v1] Thu, 6 Jun 2024 17:59:10 UTC (1,166 KB)
[v2] Wed, 4 Dec 2024 21:49:20 UTC (1,361 KB)
[v3] Thu, 26 Dec 2024 22:33:58 UTC (1,361 KB)
[v4] Thu, 16 Jan 2025 08:46:16 UTC (1,298 KB)


