Gaussian Process
Also known as: Gaussian Process, Gaussian processes, gaussian process, Gaussian process regression
Facts (17)
Sources
Track: Poster Session 3 - AISTATS 2026 (virtual.aistats.org) · 14 facts
procedure: The authors developed a Gaussian Process-based approach that constructs intervals containing the true treatment effect with high probability, both inside and outside the support of the experimental data, under a smoothness assumption on the correction function.
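The widening of such intervals outside the data support can be illustrated with a generic GP regression sketch (this is an illustration of the general behaviour, not the authors' method; the data, kernel, and length scale are assumptions):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical 1-D data observed only on [0, 1] ("inside the support")
rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(30, 1))
y_train = np.sin(2 * np.pi * X_train).ravel() + 0.05 * rng.standard_normal(30)

# The smoothness assumption enters through the RBF kernel
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=0.05**2)
gp.fit(X_train, y_train)

X_test = np.array([[0.5], [2.0]])  # inside vs. outside the data support
mean, std = gp.predict(X_test, return_std=True)
lower, upper = mean - 1.96 * std, mean + 1.96 * std  # ~95% credible interval

# The interval at x=2.0 (extrapolation) is wider than at x=0.5
print(std[1] > std[0])
```

The interval remains well-defined outside the support, but its width grows with distance from the observed data, which is what makes extrapolated guarantees depend on the smoothness assumption.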
claim: Canonical Polyadic Decomposition (CPD)- and Tensor Train (TT)-constrained kernel machines converge, in the limit of large ranks, to the same Gaussian Process (GP) when appropriate i.i.d. priors are specified across their components.
claim: Tensor Train (TT)-constrained models converge to the Gaussian Process (GP) limit faster than Canonical Polyadic Decomposition (CPD) counterparts with the same number of model parameters.
claim: The Deep Additive Kernel (DAK) model incorporates an additive structure for the last-layer Gaussian Process and an induced prior approximation for each Gaussian Process unit, yielding a last-layer Bayesian neural network (BNN) architecture.
claim: The Gaussian process regression strategy proposed by Raphael Carpintero Perez et al. combines regularized optimal transport, dimension reduction techniques, and Gaussian processes indexed by graphs to enable signal prediction and the calculation of confidence intervals on node values, with applications in fluid dynamics and solid mechanics.
procedure: Yilin Xie, Shiqiang Zhang, Joel Paulson, and Calvin Tsay propose a method for global optimization of Gaussian process acquisition functions that constructs an effective search region consisting of multiple subspaces and optimizes the acquisition function within this region by focusing on important variables.
claim: The method proposed by Yilin Xie, Shiqiang Zhang, Joel Paulson, and Calvin Tsay for global optimization of Gaussian process acquisition functions achieves cumulative regret with a sublinear worst-case growth rate while maintaining computational efficiency.
claim: Classical Gaussian process (GP) training procedures can be interpreted as instantiations of the P2L algorithm, allowing them to inherit tight, self-certified bounds.
claim: Standard Gaussian Process (GP) models are limited to continuous variables because establishing correlation structures for categorical variables is difficult.
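One common workaround for the difficulty this fact describes is a simple "overlap" kernel that assigns full correlation to matching categories and a fixed weaker correlation otherwise; the sketch below is a minimal illustration of that idea (the kernel choice and the `rho` parameter are assumptions, not part of any cited work):

```python
import numpy as np

def overlap_kernel(a, b, rho=0.3):
    """k(a, b) = 1 if the categories match, rho otherwise (0 <= rho <= 1).

    This matrix is positive semi-definite, so it is a valid GP covariance
    over a categorical input.
    """
    return np.where(np.asarray(a)[:, None] == np.asarray(b)[None, :], 1.0, rho)

cats = np.array(["red", "green", "red"])
K = overlap_kernel(cats, cats)
print(K)
```

Such kernels sidestep the need for a metric on category labels, at the cost of a very coarse correlation structure.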
reference: The Piecewise-linear Kernel Mixed Integer Quadratic Programming (PK-MIQP) formulation, proposed by Yilin Xie, Shiqiang Zhang, Joel Paulson, and Calvin Tsay, introduces a piecewise-linear approximation for Gaussian process kernels and provides an MIQP representation of acquisition functions, applicable to uncertainty-based acquisition functions for any stationary or dot-product kernel.
claim: Petar Bevanda, Max Beier, Alexandre Capone, Stefan Sosnowski, Sandra Hirche, and Armin Lederer proposed a family of Gaussian processes for dynamical systems with linear time-invariant responses that are nonlinear only in the initial conditions, allowing tractable quantification of forecasting and representational uncertainty.
claim: Raphael Carpintero Perez, Sébastien da Veiga, Josselin Garnier, and Brian Staber proposed a Gaussian process regression strategy for inputs consisting of large, sparse graphs with continuous node attributes, where the outputs are signals defined on the nodes of the associated inputs.
claim: Deep Kernel Learning (DKL) faces computational challenges when the input dimension of the Gaussian Process layer is high.
procedure: The qPOTS method solves multiobjective optimization on Gaussian process (GP) posteriors using evolutionary approaches, and selects new candidates on the posterior GP Pareto frontier using a maximin distance criterion.
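A maximin distance criterion of the kind this fact mentions can be sketched in a few lines: among candidate points, pick the one whose minimum distance to the already-selected points is largest, which promotes spread. This is a generic illustration of the criterion, not the qPOTS implementation, and the data points are made up:

```python
import numpy as np

def maximin_select(candidates, selected):
    """Return the index of the candidate maximizing its minimum
    Euclidean distance to the already-selected points."""
    dists = np.linalg.norm(candidates[:, None, :] - selected[None, :, :], axis=-1)
    return int(np.argmax(dists.min(axis=1)))

selected = np.array([[0.0, 0.0], [1.0, 1.0]])
candidates = np.array([[0.1, 0.1], [0.5, 0.5], [2.0, 2.0]])
idx = maximin_select(candidates, selected)
print(idx)  # -> 2, the candidate farthest from both selected points
```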
A comprehensive overview on demand side energy management ... (link.springer.com, Mar 13, 2023) · 2 facts
reference: Weng and Rajagopal presented the paper 'Probabilistic baseline estimation via gaussian process' at the 2015 IEEE Power & Energy Society General Meeting.
reference: Weng and Rajagopal (2015) proposed a probabilistic baseline estimation method using Gaussian processes, presented at the 2015 IEEE Power & Energy Society General Meeting.
A Comprehensive Review of Neuro-symbolic AI for Robustness ... (link.springer.com, Dec 9, 2025) · 1 fact
reference: Gaussian Process Hybrid Neural Networks combine neural networks with Gaussian processes to estimate predictive uncertainty from sample density, with the Gaussian process component providing an uncertainty measure that increases as the test point moves further from the training data.
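The distance-dependent uncertainty behaviour this fact attributes to the GP component can be verified directly from the GP posterior variance formula; the sketch below assumes a noise-free GP with a unit RBF kernel (the training inputs and test points are arbitrary):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Unit-variance RBF kernel on 1-D inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

X = np.array([0.0, 0.5, 1.0])          # training inputs
K = rbf(X, X) + 1e-8 * np.eye(3)       # jitter for numerical stability

def posterior_var(x_star):
    """Posterior variance k(x*, x*) - k* K^{-1} k*^T of a noise-free GP."""
    k_star = rbf(np.array([x_star]), X)            # shape (1, 3)
    return float(1.0 - k_star @ np.linalg.solve(K, k_star.T))

variances = [posterior_var(x) for x in (1.0, 2.0, 4.0)]
print(variances[0] < variances[1] < variances[2])  # -> True
```

The variance is near zero at a training point and approaches the prior variance far from the data, which is exactly the density-sensitive uncertainty signal the hybrid architecture relies on.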