PulseAugur
research · [6 sources]

New generative models leverage Wasserstein flows for faster, higher-quality outputs

Researchers are exploring new methods for generative modeling built on Wasserstein gradient flows, aiming to improve both sampling efficiency and sample quality. One approach, W-Flow, reports state-of-the-art one-step image generation with far lower sampling cost than traditional multi-step diffusion models. Other papers optimize over the outputs of generative models and study the theoretical underpinnings of score-difference flows, linking different generative modeling techniques and identifying obstructions to certain one-shot flows.
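For readers new to the objective these papers minimize: in one dimension the Wasserstein-2 distance between two equal-size empirical samples has a closed form, since the optimal coupling simply matches sorted samples. A minimal illustrative sketch (the function name and examples are ours, not from any of the papers):

```python
import numpy as np

def w2_empirical_1d(x, y):
    """Empirical Wasserstein-2 distance between two equal-size 1-D samples.

    In 1-D the optimal transport plan matches order statistics, so
    W2^2 = mean((sort(x) - sort(y))^2).
    """
    x, y = np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))
    return float(np.sqrt(np.mean((x - y) ** 2)))

# Shifting a sample by a constant c gives W2 exactly |c|.
print(w2_empirical_1d([0.0, 1.0, 2.0], [1.0, 2.0, 3.0]))  # → 1.0
```

In higher dimensions no such sorting shortcut exists, which is part of why the papers below work with gradient flows and learned transport maps instead of direct W2 evaluation.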

Summary written by gemini-2.5-flash-lite from 6 sources.

IMPACT Advances in Wasserstein gradient flows and one-step generation promise faster, cheaper sampling from generative models for complex tasks.

RANK_REASON Multiple arXiv papers detailing new theoretical and algorithmic approaches to generative modeling.
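For intuition on the "gradient flow" part: a textbook fact is that the Wasserstein-2 gradient flow of a potential-energy functional F(ρ) = ∫ V dρ moves each particle by dx/dt = −V′(x), so a whole particle cloud descends the potential together. A toy sketch of that fact (illustrative only, not any specific algorithm from the papers below):

```python
import numpy as np

def potential_grad(x, mu=2.0):
    # V(x) = 0.5 * (x - mu)^2, so V'(x) = x - mu; V is minimized at mu.
    return x - mu

def wasserstein_potential_flow(particles, steps=200, dt=0.05):
    """Explicit Euler discretization of dx/dt = -V'(x) per particle.

    This is the W2 gradient flow of the potential energy ∫ V dρ:
    the particle cloud drifts toward the minimizer of V.
    """
    x = np.asarray(particles, float)
    for _ in range(steps):
        x = x - dt * potential_grad(x)
    return x

cloud = wasserstein_potential_flow(np.linspace(-3.0, 3.0, 5))
print(np.round(cloud, 3))  # all particles ≈ 2.0, the minimizer of V
```

The papers below extend this picture to functionals involving the data distribution itself (e.g. the W2 loss to a target), where the velocity field must be estimated rather than written down.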


COVERAGE [6]

  1. arXiv stat.ML TIER_1 · Yu-Jui Huang, Zachariah Malik ·

    Generative Modeling by Minimizing the Wasserstein-2 Loss

    arXiv:2406.13619v4 Announce Type: replace Abstract: This paper develops a generative model by minimizing the second-order Wasserstein loss (the $W_2$ loss) through a distribution-dependent ordinary differential equation (ODE), whose dynamics involves the Kantorovich potential ass…

  2. arXiv stat.ML TIER_1 · Samuel Willis, Paul Duckworth, Jack Simons, Aleksandra Kalisz, Krisztina Sinkovics, Noam Ghenassia, Shikha Surana, Henry T. Oldroyd, Alexandru I. Stere, Dragos D Margineantu, Carl Henrik Ek, Henry Moss, Erik Bodin ·

    Sample-Efficient Optimisation over the Outputs of Generative Models

    arXiv:2509.23800v3 Announce Type: replace Abstract: Modern generative AI models, such as diffusion and flow matching models, can sample from rich data distributions. However, many applications, especially in science and engineering, require more than drawing samples from the mode…

  3. arXiv stat.ML TIER_1 · Romann M. Weber ·

    The Score-Difference Flow for Implicit Generative Modeling

    arXiv:2304.12906v4 Announce Type: replace-cross Abstract: Implicit generative modeling (IGM) aims to produce samples of synthetic data matching the characteristics of a target data distribution. Recent work (e.g. score-matching networks, diffusion models) has approached the IGM p…

  4. arXiv stat.ML TIER_1 · Jiaqi Han, Puheng Li, Qiushan Guo, Renyuan Xu, Stefano Ermon, Emmanuel J. Candès ·

    One-Step Generative Modeling via Wasserstein Gradient Flows

    arXiv:2605.11755v1 Announce Type: cross Abstract: Diffusion models and flow-based methods have shown impressive generative capability, especially for images, but their sampling is expensive because it requires many iterative updates. We introduce W-Flow, a framework for training …

  5. arXiv stat.ML TIER_1 · Emmanuel J. Candès ·

    One-Step Generative Modeling via Wasserstein Gradient Flows

    Diffusion models and flow-based methods have shown impressive generative capability, especially for images, but their sampling is expensive because it requires many iterative updates. We introduce W-Flow, a framework for training a generator that transforms samples from a simple …

  6. arXiv stat.ML TIER_1 · Panos Tsimpos, Daniel Sharp, Youssef Marzouk ·

    One-Shot Generative Flows: Existence and Obstructions

    arXiv:2604.15439v3 Announce Type: replace Abstract: We study dynamic measure transport for generative modeling, focusing on transport maps that connect a source measure $P_0$ to a target measure $P_1$ by integrating a velocity field of the form $v_t(x) = \mathbb{E}[\dot X_t \mid …
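The velocity field in the last abstract, $v_t(x) = \mathbb{E}[\dot X_t \mid X_t = x]$, is easy to instantiate when the coupling is deterministic. For the toy path $X_t = (1 - t + ta)\,X_0$ (so $X_1 = aX_0$), the conditional expectation collapses and $v_t(x) = (a-1)\,x / (1 - t + ta)$; integrating $\dot x = v_t(x)$ from $t=0$ to $1$ carries a source point $x_0$ to $a x_0$. A sketch with Euler steps (the specific path and all names are our illustration, not the paper's construction):

```python
A = 2.0  # target map X1 = A * X0; purely illustrative choice

def velocity(t, x, a=A):
    """v_t(x) = E[Xdot_t | X_t = x] for the deterministic path
    X_t = (1 - t + t*a) * X0, where Xdot_t = (a - 1) * X0."""
    return (a - 1.0) * x / (1.0 - t + t * a)

def transport(x0, steps=2000):
    """Euler-integrate dx/dt = v_t(x) from t = 0 to t = 1."""
    x, dt = float(x0), 1.0 / steps
    for k in range(steps):
        x += dt * velocity(k * dt, x)
    return x

print(transport(1.5))  # ≈ A * 1.5 = 3.0
```

The paper's "obstructions" concern when such a single integrable velocity field connecting $P_0$ to $P_1$ can exist at all; this toy case is one where it does.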