## Bayesian Statistics

### Summary

| | |
| --- | --- |
| EC | 8 |
| Location | Vrije Universiteit |
| Weeks | 6 - 21 |
| Lecture | Tuesday, 14:00 - 16:45 |
| Provider | Stochastics (Star) |
| Links | Course page (requires login) |

**Prerequisites**

The measure-theoretic probability lecture notes are available at https://staff.fnwi.uva.nl/s.g.cox/mtp_2016.pdf (the most relevant background is chapters 1-8, 12 and 14).

**Aim of the course**

Introduction to the theory and practice of Bayesian statistics.

Bayesian statistics is routinely used in many fields of science and applications, and is arguably experiencing a revival in popularity. In the Bayesian approach one first specifies a so-called prior probability distribution on the parameter space, which represents the initial belief or expert knowledge about the problem. Next, after obtaining data, one updates this prior to the conditional distribution of the parameter given the data, the so-called posterior distribution.
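The prior-to-posterior update described above is Bayes' rule. Writing \(\pi\) for the prior density and \(p_\theta\) for the density of the data \(X\) under parameter \(\theta\), it reads

```latex
\pi(\theta \mid X) \;=\; \frac{p_\theta(X)\,\pi(\theta)}{\int p_{\theta'}(X)\,\pi(\theta')\,\mathrm{d}\theta'},
```

so the posterior reweights the prior by how well each parameter value explains the observed data, normalised over the parameter space.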

The aim of this course is to give a rigorous introduction to Bayesian statistical procedures and to investigate their performance in the usual framework, in which it is assumed that the data are generated according to a given parameter. We shall be concerned with the question of whether the posterior distribution is able to reconstruct this parameter, for instance as the amount of data increases indefinitely, and whether it is reliable for uncertainty quantification. We shall study specific examples of prior distributions: both parametric ones (conjugate and objective) and nonparametric ones, such as the Dirichlet process and Gaussian processes. We shall also address the computation of posterior distributions, in particular methods based on simulating a Markov chain whose distribution approximates the posterior distribution (MCMC methods), and variational Bayes methods, in which the posterior is approximated using optimisation techniques.
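As a small illustration of two of the topics above, the sketch below computes a conjugate posterior in closed form and then approximates the same posterior with a minimal Metropolis sampler. The Beta prior hyperparameters and the data (14 successes in 20 Bernoulli trials) are made-up numbers chosen only for the example; the course's own treatment is of course more general.

```python
import math
import random

# Beta(a, b) prior on the success probability theta of a Bernoulli model.
a, b = 2.0, 2.0          # prior hyperparameters (assumed for illustration)
n, k = 20, 14            # hypothetical data: k successes in n trials

# Conjugacy: the posterior is again Beta, with hyperparameters updated in closed form.
post_a, post_b = a + k, b + (n - k)
post_mean = post_a / (post_a + post_b)   # analytic posterior mean

# Log of the (unnormalised) posterior density, used by the sampler below.
def log_post(theta):
    if not 0.0 < theta < 1.0:
        return -math.inf
    return (post_a - 1) * math.log(theta) + (post_b - 1) * math.log(1 - theta)

# A minimal random-walk Metropolis sampler targeting the same posterior,
# illustrating how MCMC approximates it when no closed form is available.
random.seed(0)
theta, samples = 0.5, []
for _ in range(50_000):
    proposal = theta + random.gauss(0.0, 0.1)   # symmetric proposal
    if math.log(random.random()) < log_post(proposal) - log_post(theta):
        theta = proposal                         # accept the move
    samples.append(theta)

kept = samples[5_000:]                           # discard burn-in
mcmc_mean = sum(kept) / len(kept)
print(f"analytic mean {post_mean:.3f}, MCMC mean {mcmc_mean:.3f}")
```

The Markov chain's empirical mean agrees with the analytic posterior mean to within Monte Carlo error, which is exactly the sense in which MCMC output "approximates the posterior distribution".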

**Lecturers**

Eduard Belitser, Botond Szabo