Regularization can make diffusion models more efficient

Venue: ICLR 2026 (Reject)
Authors: (not listed)
OpenReview: https://openreview.net/forum?id=2AvjgGJg8U

Relevance

LLM score: 3/3 — The paper leverages sparsity to reduce the computational complexity of diffusion models, directly aligning with the Sutro Group's emphasis on sparsity and energy-efficient training. Keyword hits: sparsity

TLDR

(none provided)

Abstract

Diffusion models are one of the key architectures of generative AI. Their main drawback, however, is their computational cost. This study indicates that the concept of sparsity, well known especially in statistics, can provide a pathway to more efficient diffusion pipelines. Our mathematical guarantees prove that sparsity can reduce the input dimension's influence on the computational complexity to that of a much smaller intrinsic dimension of the data. Our empirical findings confirm that inducing sparsity can indeed lead to better samples at a lower cost.
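The abstract does not specify how sparsity is induced; one standard statistical route is an L1 (lasso-style) penalty on the model's weights, which drives many of them to zero. Below is a minimal, hypothetical sketch of that idea: a toy linear "score model" trained with a denoising score-matching loss plus an L1 regularizer. The model, dimensions, and penalty weight are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "score network": s(x) = x @ W.T. A hypothetical stand-in
# for a diffusion model's score network; the paper's architecture is
# not specified here.
d = 16   # ambient input dimension
n = 64   # batch size
W = rng.normal(size=(d, d)) * 0.1

def denoising_loss(W, x0, sigma=0.5):
    """Denoising score-matching loss for the linear score model.

    For Gaussian noise x = x0 + sigma * eps, the target score of the
    perturbation kernel is -(x - x0) / sigma**2.
    """
    noise = rng.normal(size=x0.shape)
    x = x0 + sigma * noise
    target = -(x - x0) / sigma**2
    pred = x @ W.T
    return np.mean((pred - target) ** 2)

def l1_penalty(W, lam=1e-2):
    # Sparsity-inducing regularizer: penalizes the sum of absolute
    # weights, pushing entries of W toward exactly zero.
    return lam * np.abs(W).sum()

x0 = rng.normal(size=(n, d))
base_loss = denoising_loss(W, x0)
total_loss = base_loss + l1_penalty(W)
```

Once many entries of `W` are exactly zero, the per-step matrix-vector products can exploit that sparsity, which is one concrete way a lower effective (intrinsic) dimension could translate into cheaper sampling.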

Keywords

diffusion models, regularization