Channels - Finite Neural Networks as Mixtures of Gaussian Processes: From Provable Error Bounds to Prior Selection :: FRELIP Discovery
FRELIP Discovery Search
Open Access Catalog for African Scholarship
Finite Neural Networks as Mixtures of Gaussian Processes: From Provable Error Bounds to Prior Selection
Similar Items: Finite Neural Networks as Mixtures of Gaussian Processes: From Provable Error Bounds to Prior Selection
Mixtures of Gaussian Process Experts with SMC^2
Random ReLU Neural Networks as Non-Gaussian Processes
Regret Analysis for Randomized Gaussian Process Upper Confidence Bound
Bayesian Sparse Gaussian Mixture Model for Clustering in High Dimensions
Refined Risk Bounds for Unbounded Losses via Transductive Priors
How good is your Laplace approximation of the Bayesian posterior? Finite-sample computable error bounds for a variety of useful divergences
Posterior Concentrations of Fully-Connected Bayesian Neural Networks with General Priors on the Weights
Coding 1‐Bit Programmable Metasurface Based on Bayesian Learning With a Hierarchical Truncated Gaussian Mixture Prior Model
Neural Network Parameter-optimization of Gaussian Pre-marginalized Directed Acyclic Graphs
EFFECT OF BOOTSTRAPPING ON GAUSSIAN MIXTURE MODEL
Bayesian Scalar-on-Image Regression with a Spatially Varying Single-layer Neural Network Prior