bayesian-neural-networks
[[neural networks]] in a [[bayesian]] formalism. The general idea is that a prior distribution is placed over the network's weights (the usual initialization schemes can be read as draws from such a prior), and training then yields a [[posterior distribution]] over the weights rather than a single point estimate.
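A minimal sketch of the prior-to-posterior idea, using a single linear "layer" so the posterior has a closed form (full networks need approximations like variational inference or MCMC). The data, `alpha`, and `beta` values here are illustrative, not from any specific source:

```python
import numpy as np

# Bayesian treatment of one linear layer: Gaussian prior N(0, alpha^-1 I)
# over the weights, Gaussian likelihood with noise precision beta.
# For this conjugate case the posterior over weights is exactly Gaussian.

rng = np.random.default_rng(0)
alpha, beta = 1.0, 25.0               # prior precision, noise precision

# Toy data: y = 2x - 1 plus noise with std 0.2 (so beta = 1/0.2^2 = 25)
X = rng.uniform(-1, 1, size=(50, 1))
Phi = np.hstack([np.ones_like(X), X])  # bias feature + input feature
y = 2.0 * X[:, 0] - 1.0 + rng.normal(0, 0.2, size=50)

# Posterior: S^-1 = alpha I + beta Phi^T Phi,   m = beta S Phi^T y
S_inv = alpha * np.eye(2) + beta * Phi.T @ Phi
S = np.linalg.inv(S_inv)
m = beta * S @ Phi.T @ y

# Predictive distribution at a new input x = 0.5:
# mean from the posterior mean weights, variance = noise + weight uncertainty
phi_new = np.array([1.0, 0.5])
pred_mean = phi_new @ m
pred_var = 1.0 / beta + phi_new @ S @ phi_new
```

The point of the exercise: instead of one weight vector, we carry a distribution, so predictions come with a variance that reflects both observation noise and remaining uncertainty about the weights.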
Backlinks
accelerating-neural-architecture-exploration
^[1]: I guess it's more like a hypernetwork, where you have a generative model over architectures? From the text alone it's not clear which they're referring to; the hypernetwork reading would sit downstream of [[bayesian-neural-networks]].
variational autoencoder
A sub-category of [[bayesian-neural-networks]] in which the posterior is approximated via variational inference, usually in contrast to full [[MCMC]] sampling.
bayesian-network
- Not to be confused with [[bayesian-neural-networks]]: this type of model is a probabilistic graphical model, a directed acyclic [[graph]] encoding conditional dependencies between variables (superficially similar to a [[decision-tree]], but a different kind of object).
scalable-uncertainties-from-deep-ensembles
- A deep ensemble is supposed to be easier to implement and train than [[bayesian-neural-networks]], whether variational or MCMC-based.
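A minimal sketch of why deep ensembles are considered the easy alternative: train a few small networks from different random initializations on the same data, then use the spread of their predictions as an uncertainty estimate. The tiny NumPy MLP, sizes, and training settings below are illustrative assumptions, not from the paper:

```python
import numpy as np

def train_mlp(X, y, hidden=16, lr=0.05, steps=1000, seed=0):
    """Train a one-hidden-layer tanh MLP by full-batch gradient descent."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 1.0, (1, hidden));                 b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1.0 / np.sqrt(hidden), (hidden, 1)); b2 = np.zeros(1)
    n = len(X)
    for _ in range(steps):
        h = np.tanh(X @ W1 + b1)          # forward pass
        err = (h @ W2 + b2) - y           # residuals
        dh = (err @ W2.T) * (1 - h**2)    # backprop through tanh
        W2 -= lr * h.T @ err / n;  b2 -= lr * err.mean(0)
        W1 -= lr * X.T @ dh / n;   b1 -= lr * dh.mean(0)
    return W1, b1, W2, b2

def predict(params, X):
    W1, b1, W2, b2 = params
    return np.tanh(X @ W1 + b1) @ W2 + b2

rng = np.random.default_rng(42)
X = rng.uniform(-1, 1, (64, 1))
y = np.sin(3 * X)                         # toy regression target

# Ensemble = same data, different seeds; no priors or sampling machinery.
ensemble = [train_mlp(X, y, seed=s) for s in range(5)]
preds = np.stack([predict(p, X) for p in ensemble])   # (members, N, 1)
mean, std = preds.mean(axis=0), preds.std(axis=0)     # prediction + uncertainty
```

Everything here is ordinary supervised training; the only "Bayesian-flavored" output is the per-input standard deviation across ensemble members, which is what makes the approach so much simpler to implement than variational or MCMC posteriors.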