|Weeks||6 - 21|
|Lecture||Tuesday, 10:15 - 13:00|
|Links||Course page (requires login)|
Prerequisites
Basics of probability and statistics. Measure theory is necessary to understand the details of proofs, but not to follow the flow of the course. Results from asymptotic statistics are used, but briefly explained when needed. Fluency in analysis is necessary, but selected topics from Hilbert space theory, Fourier analysis, and complex analysis are explained in context. Preferably you will have taken courses in measure-theoretic probability and asymptotic statistics at master's level, and in real, complex and functional analysis at bachelor's level, but all of these topics will be explained to the level needed.
Aim of the course
A statistical time series is a sequence of random variables Xt, indexed by an integer t, which is referred to as "time". Thus a time series is a "discrete time stochastic process". Typically the variables are stochastically dependent. One aim is to predict the "future" given observations X1,..., Xn of the "past". Although the basic statistical concepts apply (such as likelihood, mean square errors, etc.), the dependence gives time series analysis a distinctive flavour. The statistical models are concerned with specifying the time relations, and the probabilistic tools (e.g. the central limit theorem) must go beyond results for independent random variables. Even issues such as the existence of a given type of process need to be addressed.
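To make the role of dependence concrete, here is a minimal sketch (in Python, with an AR(1) model and a coefficient of 0.8 chosen purely for illustration, not taken from the course) of how the past becomes informative about the future:

```python
import random

random.seed(1)

# Illustrative AR(1) model: X_t = phi * X_{t-1} + eps_t, eps_t i.i.d. N(0, 1).
phi = 0.8
n = 500
x = [0.0]
for _ in range(n):
    x.append(phi * x[-1] + random.gauss(0.0, 1.0))

# Because consecutive variables are dependent, observing X_n helps predict
# X_{n+1}: the best predictor in mean square is E[X_{n+1} | X_n] = phi * X_n.
prediction = phi * x[-1]
```

Under independence the conditional expectation would reduce to the unconditional mean and the observations would be useless for prediction; the dependence is exactly what makes the prediction problem interesting.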
This course is an introduction for mathematics students to the theory of statistical time series, including prediction theory, spectral (=Fourier) theory, and parameter estimation.
The variables Xt are typically the states of a system at times t, even though this index may just as well refer to space, or any other discrete label. Time series are central to econometrics, where they may refer to characteristics of financial assets over time (e.g. daily stock returns), or macro characteristics (e.g. yearly GDP). In this course we pay special attention to the most popular models in econometrics. Other examples of applications of time series are electrical or magnetic measurements, for instance of biological cell characteristics or of brain signals, characteristics of weather and climate, or speech and sound. Time series analysis is thus related to 'signal analysis'.
The course will only touch on applications, and will be theorem-proof oriented.
Among the time series models we discuss are the classical ARMA processes, and also the GARCH and stochastic volatility processes, which have become popular models for financial time series. We study the existence of stationary versions of these processes, and, if time allows, also the unit-root problem and co-integration. State space models include Markov processes and hidden Markov processes. We do not go into much detail on the probabilistic properties of such processes, but methods of parameter estimation apply and we may discuss prediction through the famous Kalman filter.
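The existence question can already be seen in the simplest case. For the AR(1) recursion X_t = phi * X_{t-1} + eps_t with |phi| < 1, iterating backwards yields the convergent moving-average representation X_t = sum over j >= 0 of phi^j * eps_{t-j}, the unique stationary solution, with variance 1/(1 - phi^2) for i.i.d. standard normal innovations. A small Python simulation (phi = 0.6 is an illustrative choice) checks this against the empirical variance:

```python
import random

random.seed(42)

# Stationary AR(1): X_t = phi * X_{t-1} + eps_t with |phi| < 1 guarantees
# a unique stationary solution with variance 1 / (1 - phi**2).
phi = 0.6
n = 200_000
burn_in = 1_000

x, xs = 0.0, []
for t in range(n + burn_in):
    x = phi * x + random.gauss(0.0, 1.0)
    if t >= burn_in:        # discard the transient so we sample the stationary law
        xs.append(x)

mean = sum(xs) / len(xs)
var = sum((v - mean) ** 2 for v in xs) / len(xs)
theoretical = 1.0 / (1.0 - phi ** 2)   # = 1.5625
```

For |phi| >= 1 no stationary solution exists (the unit-root case phi = 1 is the random walk), which is why stationarity of GARCH-type recursions requires its own analysis in the course.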
Within the context of nonparametric estimation we may discuss the ergodic theorem and extend the central limit theorem to dependent ("mixing") random variables. Thus the course is a mixture of probability and statistics, with some Hilbert space theory coming in to develop the spectral theory and the prediction problem.
Many of the procedures that we discuss are implemented in the statistical computer package R, and are easy to use. We recommend trying out these procedures, because they give additional insight that is hard to obtain from theory only. A few homework problems, with clear instructions, may go in this direction, but no serious programming or data analysis will be expected.
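As a taste of what such a procedure does under the hood, here is a sketch (in Python rather than R, with an illustrative true coefficient of 0.5) of the least-squares estimator of phi in an AR(1) model, which regresses X_t on X_{t-1}:

```python
import random

random.seed(7)

# Least-squares estimation in the AR(1) model X_t = phi * X_{t-1} + eps_t:
#   phi_hat = sum_t X_{t-1} * X_t / sum_t X_{t-1}**2.
phi_true = 0.5   # illustrative value, not from the course
n = 50_000

x = [0.0]
for _ in range(n):
    x.append(phi_true * x[-1] + random.gauss(0.0, 1.0))

num = sum(x[t - 1] * x[t] for t in range(1, len(x)))
den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
phi_hat = num / den   # consistent and asymptotically normal as n grows
```

In R the corresponding fit is a one-liner such as arima(x, order = c(1, 0, 0)); the estimators behind such calls, and their asymptotic properties, are among the topics of the course.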
Aad van der Vaart, firstname.lastname@example.org