**Seminar**:

Thursday 10.00-12.00, Arnimallee 3, SR 130. Language: English

Machine learning deals with searching for and generating patterns in data. Although it is traditionally considered a branch of computer science, it relies heavily on mathematical foundations. The primary goal of our seminar is therefore to understand these foundations. In doing so, we will mainly follow the classical monograph [1], with emphasis on the probabilistic viewpoint. This semester we will focus on techniques for approximating probability distributions based on samples drawn from them. These techniques are widely used in contemporary machine learning models, in particular in Bayesian neural networks.

Working through exercises (of which [1] offers an abundance) and programming are beyond the seminar's scope. However, students are very much encouraged to do both on their own and to present the results.

Although the topics of this semester cover advanced machine learning methods, acquaintance with the basics of probability theory [1, Chap. 2] combined with decent mathematical intuition should suffice to prepare a talk and follow the presentations. Knowledge of linear models for regression [1, Chap. 3] and classification [1, Chap. 4] as well as graphical models [1, Chap. 8] would be helpful.

The language of the seminar is English. The grades are based upon presentations and active participation.

In the list of topics below, the numbers in parentheses refer to the corresponding sections of [1]. As a complement, the monographs [2, 3] are recommended.

**Latent variables:**

1. Mixtures of Gaussians and the expectation-maximization (EM) algorithm for them (9.2); implementation and visualization of EM by Valentin Wolf

2. General EM algorithm: part 1 (9.3.0, 9.3.1, 9.3.3)

3. General EM algorithm: part 2 (9.4)
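For orientation, the EM iteration for a mixture of Gaussians (topic 1) can be sketched in a few lines of NumPy. This is a minimal illustration of the E and M steps from [1, Sec. 9.2] for univariate data, not part of the seminar materials; the function name and initialization scheme are our own choices.

```python
import numpy as np

def em_gmm(X, K, n_iter=100):
    """EM for a mixture of K univariate Gaussians (cf. [1], Sec. 9.2)."""
    N = X.shape[0]
    # Initialize means at spread-out quantiles, unit variances, uniform weights.
    mu = np.quantile(X, (np.arange(K) + 0.5) / K)
    var = np.ones(K)
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E step: responsibilities gamma[n, k] proportional to
        # pi_k * N(x_n | mu_k, var_k), computed in log space for stability.
        log_p = (np.log(pi)
                 - 0.5 * np.log(2 * np.pi * var)
                 - 0.5 * (X[:, None] - mu) ** 2 / var)
        log_p -= log_p.max(axis=1, keepdims=True)
        gamma = np.exp(log_p)
        gamma /= gamma.sum(axis=1, keepdims=True)
        # M step: re-estimate parameters from responsibility-weighted data.
        Nk = gamma.sum(axis=0)
        mu = (gamma * X[:, None]).sum(axis=0) / Nk
        var = (gamma * (X[:, None] - mu) ** 2).sum(axis=0) / Nk
        pi = Nk / N
    return pi, mu, var

# Two well-separated clusters; EM should recover means near 0 and 5.
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(5.0, 1.0, 500)])
pi, mu, var = em_gmm(X, K=2)
```

Each iteration increases the log-likelihood (or leaves it unchanged), which is the key property established in Section 9.4.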


**Approximate inference:**

4. Variational inference: part 1 (10.1.0-10.1.4); the code presented during the talk is available via the FU's GitLab

5. Variational inference: part 2 (10.1.0-10.1.4)

6. Variational mixture of Gaussians: part 1 (10.2); comparing EM and variational inference for mixtures of Gaussians

7. Variational mixture of Gaussians: part 2 (10.2)

8. Variational linear regression (10.3)

9. Local variational methods (10.5)

10. Variational logistic regression (10.6)

11. Expectation propagation: part 1 (10.7)

12. Expectation propagation: part 2 (10.7)
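As a taste of topics 4-5, here is a minimal sketch of factorized variational inference for the standard introductory example of [1, Sec. 10.1.3]: a univariate Gaussian with unknown mean and precision, with conjugate Gaussian-Gamma priors and the factorization q(mu, tau) = q(mu) q(tau). The coordinate updates below follow the closed-form expressions of that section; the function name and default hyperparameters are our own.

```python
import numpy as np

def vi_gaussian(X, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, n_iter=50):
    """Factorized variational inference q(mu)q(tau) for a univariate Gaussian
    with unknown mean mu and precision tau (cf. [1], Sec. 10.1.3).
    Returns the parameters of q(mu) = N(mu_N, 1/lam_N), q(tau) = Gam(a_N, b_N)."""
    N = X.shape[0]
    xbar, xsum, xsq = X.mean(), X.sum(), (X ** 2).sum()
    a_N = a0 + (N + 1) / 2           # fixed by the model, never changes
    E_tau = a0 / b0                  # initial guess for E[tau]
    for _ in range(n_iter):
        # Update q(mu) given the current E[tau].
        mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
        lam_N = (lam0 + N) * E_tau
        E_mu, E_mu2 = mu_N, mu_N ** 2 + 1 / lam_N
        # Update q(tau) given the current moments of q(mu).
        b_N = b0 + 0.5 * (xsq - 2 * E_mu * xsum + N * E_mu2
                          + lam0 * (E_mu2 - 2 * E_mu * mu0 + mu0 ** 2))
        E_tau = a_N / b_N
    return mu_N, lam_N, a_N, b_N

# Data from N(2, 0.5^2), i.e. true precision tau = 4; the posterior means
# E[mu] = mu_N and E[tau] = a_N / b_N should land near 2 and 4.
rng = np.random.default_rng(0)
X = rng.normal(2.0, 0.5, 2000)
mu_N, lam_N, a_N, b_N = vi_gaussian(X)
```

Because each update is available in closed form here, the iteration converges quickly; the general case, where such conjugate updates are not available, is exactly what the variational machinery of Chapter 10 addresses.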

- [1] Christopher M. Bishop, Pattern Recognition and Machine Learning, 2006.
- [2] Kevin P. Murphy, Machine Learning: A Probabilistic Perspective, 2012.
- [3] Ian Goodfellow, Yoshua Bengio, Aaron Courville, Deep Learning, 2016.