PulseAugur

Researchers develop neural networks for scalable Gaussian process covariance kernels

Researchers have developed a new framework for constructing scalable, flexible covariance kernels for Gaussian processes (GPs). The method learns the covariance structure directly with deep neural architectures, using a regression-type parameterization derived from Vecchia approximations. To improve training stability and data efficiency, the approach builds on permutation-preserving functions inspired by the permutation-equivariant structure of the Vecchia factorization.

Summary written by gemini-2.5-flash-lite from 2 sources.

IMPACT Introduces a novel framework for learning covariance structures in Gaussian processes, potentially improving scalability and data efficiency in machine learning applications.

RANK_REASON The cluster contains an arXiv preprint detailing a novel framework for Gaussian processes using deep neural networks.

Read on arXiv stat.ML →

COVERAGE [2]

  1. arXiv cs.LG TIER_1 · Jian Cao, Nian Liu, Ying Lin

    Permutation-preserving Functions and Neural Vecchia Covariance Kernels

    arXiv:2605.05523v1 (cross-listing). Abstract: We introduce a novel framework for constructing scalable and flexible covariance kernels for Gaussian processes (GPs) by directly learning the covariance structure under a regression-type parameterization induced by Vecchia approx…

  2. arXiv stat.ML TIER_1 · Ying Lin

    Permutation-preserving Functions and Neural Vecchia Covariance Kernels

    We introduce a novel framework for constructing scalable and flexible covariance kernels for Gaussian processes (GPs) by directly learning the covariance structure under a regression-type parameterization induced by Vecchia approximations, using deep neural architectures. Specifi…
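The paper's exact permutation-preserving architecture is not visible in these truncated abstracts. As a rough illustration of the underlying idea, a standard way to build a function whose output is unchanged when its inputs (e.g. a conditioning set of neighbors) are reordered is the sum-decomposition form f(X) = ρ(Σᵢ φ(xᵢ)), where a per-element feature map is pooled by summation before a readout. All weights and sizes below are arbitrary placeholders, not anything from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(3, 8))  # placeholder per-element weights
W2 = rng.normal(size=(8, 1))  # placeholder readout weights

def phi(x):
    # Per-element feature map (a tiny one-layer network).
    return np.tanh(x @ W1)

def f_set(X):
    """Permutation-invariant function f(X) = rho(sum_i phi(x_i)):
    summing the per-element features before the readout makes the
    output independent of the row order of X."""
    return np.tanh(phi(X).sum(axis=0)) @ W2

X = rng.normal(size=(5, 3))
perm = rng.permutation(5)
assert np.allclose(f_set(X), f_set(X[perm]))  # reordering rows changes nothing
```

Because the sum over rows is symmetric, any permutation of the input set yields the same pooled feature and hence the same output; architectures of this family are a natural fit for conditioning sets in a Vecchia factorization, whose neighbor sets have no intrinsic order.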