PulseAugur

Apple advances normalizing flows, researchers explore denoising and state estimation

Apple Machine Learning Research has introduced iTARFlow, an advance in Normalizing Flow generative models that retains a likelihood-based training objective while using an iterative denoising procedure for sampling. The method achieves competitive performance at ImageNet resolutions, positioning Normalizing Flows as a viable alternative to diffusion models. The work also characterizes the artifacts iTARFlow produces, which could guide future improvements.
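As background for the "likelihood objective plus denoising-step sampling" idea in the summary, here is a toy sketch, not Apple's iTARFlow: a one-dimensional affine flow with an exact change-of-variables likelihood, whose samples get a single Tweedie-style denoising refinement. The flow parameters, noise scale, and score function are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy flow: x = f(z) = a*z + b with a standard-normal base, so the model
# density is N(b, a^2) and the likelihood is exact via change of variables.
a, b = 2.0, 1.0

def log_likelihood(x):
    """log p(x) = log N(z; 0, 1) - log|det df/dz| with z = (x - b)/a."""
    z = (x - b) / a
    return -0.5 * (z**2 + np.log(2 * np.pi)) - np.log(abs(a))

def score(x):
    """Model score grad_x log p(x) for p = N(b, a^2)."""
    return -(x - b) / a**2

def sample(n, sigma=0.3):
    """Push base noise through the flow, then apply one Tweedie-style
    denoising refinement: x <- x + sigma^2 * score(x)."""
    x = a * rng.standard_normal(n) + b
    return x + sigma**2 * score(x)

xs = sample(10_000)
print(round(float(log_likelihood(np.array([1.0]))[0]), 3))  # -1.612
```

The refinement step is the classical denoising identity for Gaussian noise; whether and how iTARFlow applies something similar is not specified in this digest.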

Summary written by gemini-2.5-flash-lite from 7 sources. How we write summaries →

IMPACT Advances in generative models like iTARFlow could lead to more efficient and effective image synthesis and data denoising techniques.

RANK_REASON This cluster contains research papers detailing new methods in generative modeling and state estimation, including advancements in Normalizing Flows and denoising techniques.

Read on arXiv cs.LG →

COVERAGE [7]

  1. Apple Machine Learning Research TIER_1 ·

    Normalizing Flows with Iterative Denoising

    Normalizing Flows (NFs) are a classical family of likelihood-based methods that have received revived attention. Recent efforts such as TARFlow have shown that NFs are capable of achieving promising performance on image modeling tasks, making them viable alternatives to other met…

  2. arXiv cs.LG TIER_1 · Yu Wang, Arnab Ganguly ·

    Variational Smoothing and Inference for SDEs from Sparse Data with Dynamic Neural Flows

    arXiv:2605.05606v1 · Stochastic differential equations (SDEs) provide a flexible framework for modeling temporal dynamics in partially observed systems. A central task is to calibrate such models from data, which requires inferring latent trajectories…
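For readers unfamiliar with this setting, a minimal sketch of it (not the paper's inference method): simulating an Ornstein–Uhlenbeck SDE with the standard Euler–Maruyama scheme, then subsampling sparse, noisy observations of the path. All parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
theta, sigma, dt, T = 1.0, 0.5, 0.01, 10.0   # illustrative SDE parameters
steps = int(T / dt)

# Euler–Maruyama for dX_t = -theta * X_t dt + sigma dW_t
x = np.empty(steps + 1)
x[0] = 2.0
for t in range(steps):
    x[t + 1] = x[t] - theta * x[t] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Sparse, noisy observations: every 50th state plus measurement noise.
obs_idx = np.arange(0, steps + 1, 50)
y = x[obs_idx] + 0.1 * rng.standard_normal(obs_idx.size)
print(y.size)  # 21 observations from 1001 simulated states
```

Calibration then means recovering theta, sigma, and the latent path from `y` alone, which is what the paper's variational smoothing targets.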

  3. arXiv cs.LG TIER_1 · Rihuan Ke ·

    Learning-based Statistical Refinement for Denoising

    arXiv:2605.04332v1 · This work proposes a learning-based statistical refinement method for improving the denoising results of a given denoiser without knowing the precise noise distribution or accessing clean images or calibration data. While there are …

  4. arXiv cs.LG TIER_1 · Lennart Röstel, Berthold Bäuml ·

    Denoising Particle Filters: Learning State Estimation with Single-Step Objectives

    arXiv:2602.19651v2 · Learning-based methods commonly treat state estimation in robotics as a sequence modeling problem. While this paradigm can be effective at maximizing end-to-end performance, models are often difficult to interpret and expe…
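The classical baseline for this kind of state estimation is the bootstrap particle filter, sketched below; this is standard material, not the paper's learned single-step objective, and the random-walk model and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
T, N = 50, 1000              # time steps, particles
q, r = 0.1, 0.5              # process / observation noise std (assumed)

# Simulate a 1-D random-walk state and noisy observations of it.
x_true = np.cumsum(q * rng.standard_normal(T))
y = x_true + r * rng.standard_normal(T)

particles = np.zeros(N)
estimates = np.empty(T)
for t in range(T):
    particles = particles + q * rng.standard_normal(N)     # propagate
    w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)       # weight by likelihood
    w /= w.sum()
    estimates[t] = w @ particles                           # posterior mean
    particles = particles[rng.choice(N, size=N, p=w)]      # resample

rmse = np.sqrt(np.mean((estimates - x_true) ** 2))
print(f"filter RMSE: {rmse:.2f}")
```

The propagate/weight/resample loop is exactly the multi-step recursion the abstract contrasts with single-step training objectives.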

  5. arXiv cs.LG TIER_1 · Charles Fefferman, Aalok Gangopadhyay, Matti Lassas, Jonathan Marty, Hariharan Narayanan ·

    Denoising data using convex relaxations

    arXiv:2605.02327v1 · We study the problem of denoising observations \(Y_i=X_i+Z_i\), where the latent variables \(X_i\) are sampled from a low-dimensional manifold in \(\mathbb{R}^n\) and the noise variables \(Z_i\) are isotropic Gaussian. We propose …

  6. arXiv cs.LG TIER_1 · Hariharan Narayanan ·

    Denoising data using convex relaxations

    We study the problem of denoising observations \(Y_i=X_i+Z_i\), where the latent variables \(X_i\) are sampled from a low-dimensional manifold in \(\mathbb{R}^n\) and the noise variables \(Z_i\) are isotropic Gaussian. We propose a convex-relaxation estimator that first reduces d…
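The observation model \(Y_i=X_i+Z_i\) above has a classical linear analogue: when the latent points lie on a linear subspace rather than a general manifold, projecting onto the top principal directions already reduces the noise. A toy sketch of that analogue (not the paper's convex-relaxation estimator; dimensions and noise level are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 500, 50, 3          # samples, ambient dim, latent dim (assumed)

# Latent X_i on a random k-dim subspace of R^d, plus isotropic Gaussian noise.
U = np.linalg.qr(rng.standard_normal((d, k)))[0]   # orthonormal basis
X = rng.standard_normal((n, k)) @ U.T
Y = X + 0.5 * rng.standard_normal((n, d))

# PCA denoiser: project Y onto its top-k principal subspace.
Yc = Y - Y.mean(axis=0)
_, _, Vt = np.linalg.svd(Yc, full_matrices=False)
X_hat = Y.mean(axis=0) + Yc @ Vt[:k].T @ Vt[:k]

noisy_err = np.mean((Y - X) ** 2)
denoised_err = np.mean((X_hat - X) ** 2)
print(f"MSE noisy {noisy_err:.3f} -> denoised {denoised_err:.3f}")
```

The paper's contribution is handling curved manifolds, where this linear projection no longer applies and a convex relaxation is used instead.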

  7. arXiv stat.ML TIER_1 · Arnab Ganguly ·

    Variational Smoothing and Inference for SDEs from Sparse Data with Dynamic Neural Flows

    Stochastic differential equations (SDEs) provide a flexible framework for modeling temporal dynamics in partially observed systems. A central task is to calibrate such models from data, which requires inferring latent trajectories and parameters from sparse, noisy observations. C…