Warsaw University
February - June 2023
Time Series: Course Information
Introduction
A time series is a series of data points indexed (or listed or graphed) in time order. Most commonly, a time series is a sequence taken at successive equally spaced points in time. Thus it is a sequence of discrete-time data. Examples of time series are heights of ocean tides, counts of sunspots, and the daily closing value of the Dow Jones Industrial Average.
The study of Time Series is a branch of statistics, used in signal processing, pattern recognition, econometrics, mathematical finance, weather forecasting, intelligent transport and trajectory forecasting, earthquake prediction, electroencephalography, control engineering, astronomy, communications engineering, and in almost any domain of applied science and engineering that involves temporal measurements.
Time series analysis comprises methods for analysing time series data in order to extract meaningful statistics and other characteristics of the data. Time series forecasting is the use of a model to predict future values based on previously observed values.
This course presents Time Series theory and data analysis; the R programming language is used. It is largely centred around the classical Box-Jenkins method and variants.
It is important to recognise the presence of seasonal components in the data and to be able to remove them, so that they are not confused with long-term trends; the better the stochastic model, the better the prediction. Time series models are used to separate (or filter) noise from signals.
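As a small illustration of this idea in R, the classical decomposition of a series into trend, seasonal, and random components can be carried out with the built-in decompose() function; the sketch below uses the AirPassengers dataset that ships with R:

```r
# Classical decomposition of the built-in AirPassengers series
# (monthly airline passenger counts, 1949-1960) into trend,
# seasonal, and random components.
data(AirPassengers)
dec <- decompose(AirPassengers, type = "multiplicative")

# The seasonal component repeats with period 12 (monthly data);
# dividing it out leaves trend plus noise.
deseasonalised <- AirPassengers / dec$seasonal
head(dec$seasonal, 12)
```

Plotting dec shows the observed series alongside its three estimated components, which makes the distinction between seasonality and long-term trend immediately visible.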
While covering a wide variety of applications, the course deals mainly with financial modelling.
Course Content
The course will cover the following topics:
- Time series decomposition - trend, seasonal, stationary components: Lag operators, difference equations, Holt-Winters filtering.
- Linear time series models: MA, AR, ARMA, ARIMA, generating polynomials, autocovariance, autocorrelation.
- Estimating the mean, autocovariance and autocorrelation for a linear stationary time series.
- Estimating parameters for the ARMA model: Yule-Walker equations, Innovations algorithm, Hannan-Rissanen, Maximum Likelihood and Least Squares, Order selection.
- Prediction: linear predictors and projections, the Durbin-Levinson and Innovations algorithms.
- Prediction (concluded): large numbers of observations. Partial correlation. Computing the ACVF of an ARMA.
- ARCH and GARCH models.
- Spectral Analysis, spectral representation of a time series, Orthogonal Increment Process, Interpolation and Detection.
- Estimating the Spectral Density.
- Multivariate Time Series.
- Cointegration.
- The Kalman Filter.
- Neural networks in Time Series.
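Several of the topics above (linear models, parameter estimation, prediction) can be tried out directly in R. As a minimal sketch, the following simulates an AR(2) process with arima.sim() and recovers its parameters with arima():

```r
# Simulate 500 observations from a stationary AR(2) process
# X_t = 0.5 X_{t-1} - 0.3 X_{t-2} + Z_t and fit an AR(2) model.
set.seed(1)
x <- arima.sim(model = list(ar = c(0.5, -0.3)), n = 500)

fit <- arima(x, order = c(2, 0, 0))
coef(fit)  # ar1 and ar2 estimates should be close to 0.5 and -0.3

# Twelve-step-ahead forecasts from the fitted model.
fc <- predict(fit, n.ahead = 12)$pred
```

Here arima() fits by maximum likelihood, one of the estimation methods covered in the course; the Yule-Walker alternative is available via ar(x, method = "yule-walker").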
Course Organisation
The teaching schedule consists of 14 lectures and 14 tutorial sessions.
The lectures are held on Mondays, 08.30 - 10.00, in room 3220. The tutorials are held on Mondays, 10.15 - 11.45; the computer labs take place in room 2044, while the sessions based on written exercises are held in room 2280.
The dates of the classes are:
February: 27th
March: 6th, 13th, 20th, 27th
April: 3rd, 10th, 24th
May: 8th, 15th, 22nd, 29th
June: 5th, 12th
Grading Policy
Assessment will be based on
- Participation in the tutorials
- A Data Analysis assignment, in which data are analysed using R. Assessment will be based on (a) the quality of the data analysis and (b) the clarity with which the results are communicated to a non-specialist (imagine that a paying customer has asked you, as a statistical consultant, to analyse the data).
- A written examination on the theoretical aspects of the course.
The examination will be a take-home exam consisting of theoretical questions about stationary processes. It will be possible to obtain a grade of 4 on the basis of the Data Analysis assignment alone (if it is exceptionally good); to obtain a higher grade (4.5, 5 or 5!), it is also necessary to submit a good examination paper.
Course Literature
The recommended texts for the course are:
Data
Click here for the directory containing the data files
Course Notes
The notes will be added week by week here.
(Last modified: 5th June 2023 by John M. Noble)