The 2022/23 course “Bayesian Machine Learning and Information Processing” will start in November 2022 (Q2).
Course goals
This course provides an introduction to Bayesian machine learning and information processing systems. The Bayesian approach affords a unified and consistent treatment of many useful information processing systems.
Course summary
This course covers the fundamentals of a Bayesian (i.e., probabilistic) approach to machine learning and information processing systems. The Bayesian approach allows for a unified and consistent treatment of many model-based machine learning techniques. Initially, we focus on linear Gaussian systems and discuss many useful models and applications, including common regression and classification methods, Gaussian mixture models, hidden Markov models, and Kalman filters. We discuss important algorithms for parameter estimation in these models, including the Expectation-Maximization (EM) algorithm and Variational Bayes (VB). The Bayesian method also provides tools for comparing the performance of different information processing systems by estimating the Bayesian evidence for each model, and we discuss several methods for approximating this evidence. Next, we discuss intelligent agents that learn purposeful behavior from interactions with their environment. Such agents are used in applications such as self-driving cars and the interactive design of virtual and augmented realities. Throughout the course we relate synthetic Bayesian intelligent agents to natural intelligent agents such as the brain. You will be challenged to code Bayesian machine learning algorithms yourself and apply them to practical information processing problems.
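To give a flavor of this model-based approach, here is a minimal, self-contained sketch (not part of the official course materials) of a conjugate Bayesian update and a model comparison via Bayesian evidence. It assumes Julia with the Distributions.jl and SpecialFunctions.jl packages; the data, priors, and model names are purely illustrative.

```julia
# Minimal sketch: Beta-Bernoulli posterior update and Bayesian model comparison.
# Assumes the Distributions and SpecialFunctions packages are installed.
using Distributions, SpecialFunctions

# Illustrative data: 8 heads and 4 tails from a coin with unknown bias θ
heads, tails = 8, 4

# Two hypothetical models, differing only in their prior over θ
prior1 = Beta(1, 1)     # M1: uniform prior
prior2 = Beta(50, 50)   # M2: prior strongly favoring a fair coin

# Conjugacy: Beta prior + Bernoulli likelihood => Beta posterior
posterior(p::Beta, h, t) = Beta(p.α + h, p.β + t)
println("Posterior mean of θ under M1: ", mean(posterior(prior1, heads, tails)))

# Bayesian evidence of a Beta-Bernoulli model: p(D|M) = B(α+h, β+t) / B(α, β)
logevidence(p::Beta, h, t) = logbeta(p.α + h, p.β + t) - logbeta(p.α, p.β)

# The log Bayes factor quantifies how much better M1 predicted the data than M2
println("log Bayes factor (M1 vs M2): ",
        logevidence(prior1, heads, tails) - logevidence(prior2, heads, tails))
```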
News and Announcements
- The solution notebook for the Probabilistic Programming assignment can be downloaded here.
- Exam rules have been posted on Piazza.
- As much as possible, we use the Piazza course site for new announcements.
Instructors
- Prof.dr.ir. Bert de Vries (email: bert.de.vries@tue.nl) is the responsible instructor for this course and teaches all lectures with label B.
- Dr. Wouter Kouw (w.m.kouw@tue.nl) teaches all practical sessions on probabilistic programming with label W.
- Magnus Koudahl, Tim Nisslbeck, Sepideh Adamiat and Wouter Nuijten are the teaching assistants. Mr. Koudahl presents the “What is Life?” bonus lecture.
Materials
In principle, you can download all needed materials from the links below.
Books
Please consider downloading the following books/resources:
- Bert de Vries (2022), PDF bundle of all lecture notes for lessons B0 through B12.
- Wouter Kouw (2022), PDF bundle of all probabilistic programming lecture notes for lessons W1 through W4.
- Christopher M. Bishop (2006), Pattern Recognition and Machine Learning. You can also buy a hardcopy, e.g. at bol.com.
- Ariel Caticha (2012), Entropic Inference and the Foundations of Physics.
Software
- We will provide a web-based environment with all necessary software pre-installed and tested, which will allow you to execute code in the lesson notebooks and work on your probabilistic programming assignment. The invitation link will be posted on Piazza.
- If you prefer working on your own machine, we recommend installing Microsoft’s VS Code editor (download) and adding the Jupyter (tutorial) and Julia (tutorial) extensions. Note that we do not provide support for installing these tools on your own machine; a minimal check of a local Julia setup is sketched below.
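For a local installation, the following sketch installs the Jupyter kernel and launches the notebook server so the lesson notebooks can be run locally. It assumes a recent Julia install; the package selection is a common choice, not an official course requirement.

```julia
# Run in the Julia REPL. Installs the Jupyter kernel (IJulia) plus a basic
# probability/plotting stack; package choices are illustrative, not prescribed by the course.
using Pkg
Pkg.add(["IJulia", "Distributions", "Plots"])

using IJulia
notebook()   # launches Jupyter in the browser; open the lesson notebooks from there
```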
Lecture notes, videos and exercises
You can access all lecture notes, videos and exercises online through the links below:
| Date | Lesson | Video guides | Lecture notes | Exercises |
|---|---|---|---|---|
| 16-Nov-2022 | B0: Course Syllabus; B1: Machine Learning Overview | B1 | B0, B1 | |
| 18-Nov-2022 | B2: Probability Theory Review | B2.1, B2.2 | B2 | B2-ex, B2-sol |
| 23-Nov-2022 | B3: Bayesian Machine Learning | B3.1, B3.2 | B3 | B3-ex, B3-sol |
| 25-Nov-2022 | W1: ProbProg 1 - Introduction to Bayesian inference | | W1 | |
| 30-Nov-2022 | B4: Factor Graphs and the Sum-Product Algorithm | B4 | B4 | B4-ex, B4-sol |
| 02-Dec-2022 | B5: Continuous Data and the Gaussian Distribution | B5.1, B5.2, B5.3 | B5 | B5-ex, B5-sol |
| 07-Dec-2022 | B6: Discrete Data and the Multinomial Distribution | B6 | B6 | B6-ex, B6-sol |
| 09-Dec-2022 | W2: ProbProg 2 - Message passing on factor graphs | | W2 | |
| 14-Dec-2022 | B7: Regression | B7 | B7 | B7-ex, B7-sol |
| 16-Dec-2022 | B8: Generative Classification; B9: Discriminative Classification | B8, B9 | B8, B9 | B8-9-ex, B8-9-sol |
| 21-Dec-2022 | B10: Latent Variable Models and Variational Bayes | B10 | B10 | B10-ex, B10-sol |
| *break* | | | | |
| 11-Jan-2023 | W3: ProbProg 3 - Regression and classification | | W3 | |
| 13-Jan-2023 | B11: Dynamic Models | B11 | B11 | B11-ex, B11-sol |
| 18-Jan-2023 | B12: Intelligent Agents and Active Inference | B12 | B12 | B12-ex, B12-sol |
| 20-Jan-2023 | W4: ProbProg 4 - Mixture and dynamic models | | W4 | |
| | M1: Bonus Lecture: What is Life? | M1.1, M1.2 | M1 | |
| 02-Feb-2023 | written examination (13:30-16:30) | | | |
| 20-Apr-2023 | resit written examination (18:00-21:00) | | | |
- Furthermore, Q&A for each lesson can be accessed on the Piazza course site.
Exam Preparation
- Please consult the Course Syllabus (the lecture notes for the first class) for advice on how to study the materials.
- Each year there are two written exam opportunities. The exams are in multiple-choice format. Grading rules have been posted on Piazza. You cannot bring notes or books to the written exam sessions; all needed formulas are supplied on the exam sheet.
- In addition to the materials in the above table, we provide two representative practice written exams:
  - 2021-01-18: exam A, solutions A; exam B, solutions B
  - 2021-04-15: exam, solutions