October 2022 - January 2023
Bayesian Networks: Course Information
This course deals with graphical models, a subject arising from the interaction between probability theory and graph theory. The topic provides a natural tool for dealing with a large class of problems involving uncertainty and complexity. These features occur throughout applied mathematics and engineering, and the material treated therefore has diverse applications in the engineering sciences. A complex model is built by combining simpler parts, an idea known as modularity. The uncertainty in the system is modelled using probability theory; the graph indicates independence structures that enable the probability distribution to be decomposed into smaller pieces.
Bayesian networks represent joint probability models over a given set of variables. Each variable is represented by a vertex in a graph. The direct dependencies between the variables are represented by directed edges between the corresponding nodes, and the conditional probabilities for each variable (that is, the probabilities conditioned on the various possible combinations of values of its immediate predecessors in the network) are stored in tables attached to the dependent nodes. Information about the observed value of a variable is propagated through the network to update the probability distributions over other variables that are not observed directly. These influences may also be identified in a ‘backwards’ direction, from dependent variables to their predecessors.
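As a minimal illustration of these ideas (the network and all numbers below are invented for this sketch, not course material), the following Python fragment stores the conditional probability tables of a two-node network Rain → WetGrass as dictionaries and performs the ‘backwards’ update by direct enumeration:

```python
# Hypothetical two-node network Rain -> WetGrass, with conditional
# probability tables stored as dictionaries.
P_rain = {True: 0.2, False: 0.8}
P_wet_given_rain = {
    True:  {True: 0.9, False: 0.1},   # P(WetGrass | Rain=True)
    False: {True: 0.2, False: 0.8},   # P(WetGrass | Rain=False)
}

def joint(r, w):
    # The joint distribution factorises along the graph: P(R, W) = P(R) P(W | R).
    return P_rain[r] * P_wet_given_rain[r][w]

# 'Backwards' inference: observing WetGrass=True updates the belief about Rain.
posterior_rain = joint(True, True) / sum(joint(r, True) for r in (True, False))
print(posterior_rain)  # 0.18 / 0.34, approximately 0.529
```

In larger networks this brute-force enumeration becomes infeasible, which is precisely why the course develops the graph-based algorithms (junction trees, message passing) listed below.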
The Bayesian approach to uncertainty ensures that the system as a whole remains consistent and provides a way to apply the model to data. Graph theory helps to illustrate and utilise independence structures within interacting sets of variables, hence facilitating the design of efficient algorithms.
The course places particular emphasis on causal models, where the directed arrows in the Directed Acyclic Graph represent direct cause-to-effect relations between the variables. A problem of particular importance is learning a causal effect from observational data alone. For example, in medical statistics, we may be interested in whether a new drug out-performs existing treatments. We would like to perform a controlled experiment whereby half of the participants are assigned to a control group (where they are not given the new treatment) and the other half are given the new treatment. It is a well-known phenomenon, though, that people are often unwilling to participate in an experiment carried out on that basis; those who agree to participate usually want the new treatment, so the sample of people who agree to participate in a controlled experiment is itself biased.
A question of importance, therefore, is inferring causality from observational data.
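The gap between observational and interventional quantities can be made concrete with a small numeric sketch (all variable names and numbers here are hypothetical, chosen only for illustration): a confounder Z influences both the choice of treatment X and the outcome Y, and the back-door adjustment formula recovers P(Y | do(X)) from observational quantities.

```python
# Hypothetical confounded model: Z -> X, Z -> Y, X -> Y.
P_z = {0: 0.5, 1: 0.5}
P_x1_given_z = {0: 0.2, 1: 0.8}           # sicker patients (Z=1) seek treatment more
P_y1_given_xz = {(0, 0): 0.3, (0, 1): 0.6,  # P(Y=1 | X=x, Z=z): recovery probabilities
                 (1, 0): 0.5, (1, 1): 0.8}

# Observational quantity P(Y=1 | X=1): mixes the treatment effect with
# the selection of who takes the treatment.
num = sum(P_y1_given_xz[(1, z)] * P_x1_given_z[z] * P_z[z] for z in (0, 1))
den = sum(P_x1_given_z[z] * P_z[z] for z in (0, 1))
observational = num / den

# Interventional quantity via back-door adjustment:
# P(Y=1 | do(X=1)) = sum_z P(Y=1 | X=1, Z=z) P(Z=z).
interventional = sum(P_y1_given_xz[(1, z)] * P_z[z] for z in (0, 1))

print(observational, interventional)  # 0.74 versus 0.65
```

Here the naive observational estimate overstates the treatment effect, because treated patients are disproportionately drawn from the Z=1 group; the intervention calculus developed in the course makes such corrections systematic.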
This is a ‘hot topic’ and was the subject of the 2021 prize for Economics in memory of Alfred Nobel (click here for the press release). A more complete scientific description, the pdf ‘Answering Causal Questions using Observational Data’, may be found by clicking on this link.
The teaching schedule consists of 13 lectures and 13 seminars. These take place on Wednesdays in Room 2270:
- October: 5th, 12th, 19th, 26th
- November: 9th, 16th, 23rd, 30th
- December: 7th, 14th
- January: 11th, 18th, 25th
- Lectures: I give 13 lectures (Wednesdays 08:30 - 10:00) based on the course content outlined below.
- Seminars: The seminars, given by students, are based on recent Bayesian Networks literature related to the lectures; in particular, we explore the key ideas for causal inference from observational data, which led to the Nobel prize for Card, Angrist and Imbens.
- Conditional independence, graphs and d-separation, Bayesian networks.
- Markov equivalence for graph structures, the essential graph.
- Causality and Intervention Calculus
- Evidence: hard evidence, soft evidence, virtual evidence, Jeffrey’s rule and Pearl’s method of Virtual Evidence.
- Parametrising the network and sensitivity to parameter changes.
- Model building and using computer software.
- Decomposable graphs, junction trees and probability updating.
- Factor graphs and the sum product algorithm.
- Bayesian inference, multinomial sampling and the Dirichlet integral; learning the conditional probability potentials for a given graph structure.
- Learning the graph structure; search and score methods, constraint based methods, hybrid methods. Particular attention to some of the most recent search and score algorithms.
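As a small taste of the Bayesian parameter learning topic above, the following sketch (with made-up counts) shows the conjugate Dirichlet-multinomial update used when learning a single row of a conditional probability table: a Dirichlet(alpha) prior combined with observed counts n yields a Dirichlet(alpha + n) posterior, whose mean gives the estimated probabilities.

```python
# Learning one CPT row for a variable with 3 possible values, given a
# fixed configuration of its parents. Prior and counts are hypothetical.
alpha = [1.0, 1.0, 1.0]      # uniform Dirichlet prior over the 3 outcomes
counts = [12, 5, 3]          # observed counts for this parent configuration

# Posterior is Dirichlet(alpha + counts); its mean is
# (alpha_k + n_k) / (sum(alpha) + sum(n)) for each outcome k.
total = sum(alpha) + sum(counts)
posterior_mean = [(a + n) / total for a, n in zip(alpha, counts)]
print(posterior_mean)  # [13/23, 6/23, 4/23]
```

The same Dirichlet machinery underlies the marginal likelihood scores (such as the Cooper-Herskovitz likelihood mentioned in the lecture list) used by the search and score methods for structure learning.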
The lectures are taken from the text that should become the second edition of the book ‘Bayesian Networks: An Introduction’, which can be found by clicking here.
- 2022-10-05: 08:30 - 10:00 Lecture 1 Bayesian Networks, Graphs, D-separation, Causal interpretation of DAGs, Bayes Ball pp 3 - 21 (Chapter 1)
- 2022-10-12: 08:30 - 10:00 Lecture 2 Graphical Models and Markov Equivalence (Chapter 2, Sections 2.1 and 2.2, pp 29 - 45)
- 2022-10-19: 08:30 - 10:00 Lecture 3 Intervention Calculus I
- 2022-10-26: 08:30 - 10:00 Lecture 4 Intervention Calculus II: Back Door Criterion, Identifiability (Chapter 11: Sections 11.6 - 11.8 pp 216 - 223)
- 2022-11-09: 08:30 - 10:00 Lecture 5: Learning the parameters and Introduction to Structure Learning (Chapter 12 - Bayesian learning and Cooper Herskovitz Likelihood, Chapter 14 - Structure learning introduction)
- 2022-11-16: 08:30 - 10:00 Lecture 6 Decomposable Graphs (Chapter 7)
- 2022-11-23: 08:30 - 10:00 Lecture 7 Exponential Families and Mean Field Approximation
- 2022-11-30: 08:30 - 10:00 Lecture 8 Graphical Models and E-M Algorithm
- 2022-12-07: 08:30 - 10:00 Lecture 9 Variational Bayes
- 2022-12-14: 08:30 - 10:00 Lecture 10 Junction Trees and Message Passing (Chapter 16)
- 2023-01-11: 08:30 - 10:00 Lecture 11 Search and Score, MCMC Learning Algorithms (Chapter 18)
- 2023-01-18: 08:30 - 10:00 Lecture 12 Message Passing and Conditional Gaussian Distributions (Chapters 8 and 10)
- 2023-01-25: 08:30 - 10:00 Lecture 13 Factor Graphs and the Sum Product Algorithm
- 2022-10-05: 10.15 - 11.45 Seminar 1: Pioneering Work of Jerzy Neyman, based on Neyman 1923 and Rubin 1990
- 2022-10-12: 10.15 - 11.45 Seminar 2: Can we learn a causal DAG? Based on Freedman and Humphreys (2000)(Radosław Jurczak)
- 2022-10-19: 10.15 - 11.45 Seminar 3: Using Bayesian Networks to discover relations between Genes, Environment and Disease. Based on Su, Andrew, Karagas, Borsuk (2013) and Predicting Hematological Malignancies with Bayesian Networks. Agrahari, Foroushani, Docking, Chang, Duns, Hudoba, Karsan, Zare(2018)(Maciej Sikora)
- 2022-10-26: 10.15 - 11.45 Seminar 4: Implementation of Bayesian Networks in Bioinformatics. Based on
Ronquist and Huelsenbeck,
Nascimento, Reis and Yang,
Huelsenbeck and Ronquist,
Huelsenbeck, Ronquist et al.,
Ronquist, Teslenko et al. (Maciej Sikora)
- 2022-11-09: 10.15 - 11.45 Seminar 5: Measurement bias and effect restoration I. Based on Kuroki and Pearl (2014) (Grzegorz Lojek and Szymon Stolarczyk)
- 2022-11-16: 10.15 - 11.45 Seminar 6: Measurement bias and effect restoration II. Based on Kuroki and Pearl (2014) (Grzegorz Lojek and Szymon Stolarczyk)
- 2022-11-23: 10.15 - 11.45 Seminar 7: A Bayesian foundation for individual learning under uncertainty. Based on Mathys et al. and Iglesias et al. (Jan Skorupski)
- 2022-11-30: 10.15 - 11.45 Seminar 8: Counterfactual Probabilities: Computational Methods, Bounds and Applications. Based on Balke and Pearl (1994) (Blanca Gomez Sanz)
- 2022-12-07: 10.15 - 11.45 Seminar 9: Differentiable Bayesian Structure Learning. Based on Lorch et al. and Zheng et al. (Mateusz Olko)
- 2022-12-14: 10.15 - 11.45 Seminar 10: A Gibbs Sampler for Learning DAGs. Based on Goudie and Mukherjee (2016)(Mateusz Przyborowski)
- 2023-01-11: 10.15 - 11.45 Seminar 11: Partition MCMC for inference on acyclic digraphs. Based on Kuipers and Moffa (2016)(Tomasz Jabłczyński)
- 2023-01-18: 10.15 - 11.45 Seminar 12: Exchangeable Random Graphs. Based on Orbanz and Roy (2013)(Daniel Murawski)
- 2023-01-25: 10.15 - 11.45 Seminar 13: Edge Exchangeable Graphs. Based on Svante Janson, On Edge Exchangeable Graphs
Assessment is based on the student seminars, which draw on the Bayesian Networks literature.
Data
Data files may be found here.
(Last modified: 6th October 2022 by John M. Noble)