Overview

Recent developments in neural networks (aka “deep learning”) have drastically advanced the performance of machine perception systems in a variety of areas, including computer vision, robotics, and human shape modeling. This course is a deep dive into deep learning algorithms and architectures, with applications to a variety of perceptual and generative tasks.


Announcements

24.04.2024
The mock exam is now available here.
05.02.2024
Project descriptions have been added here!
03.01.2024
More info coming soon!

Learning Objectives

Students will learn about the fundamental aspects of modern deep learning approaches for perception and generation. They will learn to implement, train, and debug their own neural networks and gain a detailed understanding of cutting-edge research in learning-based computer vision, robotics, and shape modeling. The optional final project involves training a complex neural network architecture and applying it to a real-world dataset.
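
To give a flavor of the hands-on work, the sketch below shows a minimal PyTorch training loop for a small MLP on toy random data. It is purely illustrative (the model, data, and hyperparameters are placeholders, not official course code), but it captures the forward pass, loss computation, backpropagation, and optimizer step that the tutorials walk through.

    import torch
    import torch.nn as nn

    # Toy data: 256 samples with 10 features and binary labels (placeholder for a real dataset).
    x = torch.randn(256, 10)
    y = torch.randint(0, 2, (256,))

    # A small MLP classifier; layer sizes are illustrative.
    model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(10):
        optimizer.zero_grad()          # reset gradients from the previous step
        logits = model(x)              # forward pass
        loss = loss_fn(logits, y)      # compute the training loss
        loss.backward()                # backpropagation
        optimizer.step()               # gradient-based parameter update
        print(f"epoch {epoch}: loss = {loss.item():.4f}")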

The core competency acquired through this course is a solid foundation in deep learning algorithms for processing and interpreting human-centric signals. In particular, students should be able to develop systems that recognize people in images, detect and describe body parts, infer their spatial configuration, and perform action/gesture recognition from still images or image sequences, possibly drawing on multi-modal data.

We will focus on teaching how to set up machine perception problems, the relevant learning algorithms and network architectures, and advanced deep learning concepts, in particular probabilistic deep learning models.
The course covers the following main areas:
I) Foundations of Deep Learning.
II) Advanced topics like probabilistic generative modeling of data (latent variable models, generative adversarial networks, auto-regressive models, invertible neural networks).
III) Deep learning in computer vision, human-computer interaction, and robotics.


Lecture Notes

You can download the lecture notes here (you will need to log in with your ETH LDAP).

These lecture notes are provided as a draft version for educational purposes only. The content presented herein is subject to change and may contain inaccuracies or errors. Grading for the course will be based on the slides and lecture materials.


Schedule

Subject to change. Materials are only available from within the ETH network.

Wk. Date Content Material Exercise Session
1 21.02.
Deep Learning Introduction

Class content & admin

slides
1 22.02.
-- No Class --
2 28.02.
29.02.
Training Neural Networks

Backpropagation
Feedforward Networks,
Representation Learning

slides pt. I
slides pt. II

Perceptron Visualization Notebook

Tutorial Implement your own MLP

slides
XOR Notebook
XOR Solutions
Eye-Gaze Notebook
Eye-Gaze Solutions

Tutorial Linear Regr.

slides
Linear Regression Notebook

Pen & Paper Backprop.

exercise
exercise solution
3 06.03.
07.03.
Convolutional Neural Networks

slides pt. I
slides pt. II
slides pt. III

Additional material:
RSVP
Cortical Neuron

Tutorial CNNs in Pytorch

slides
CNN Notebook

Pen & Paper CNN

exercise
exercise solution
4 14.03.
Recurrent Neural Networks

LSTM, GRU, Backpropagation through time

slides

Tutorial RNNs in Pytorch

slides
RNN Notebook

Pen & Paper RNN

exercise
exercise solution
5 20.03.
21.03.
Generative Models Pt. I: Latent Variable Models

Variational Autoencoders, etc.

slides pt. I
slides pt. II

Class Tips for Training I

slides

Pen & Paper VAE

exercise
exercise solution
6 27.03.
28.03.
Generative Models Pt. II: Autoregressive Models

NADE, MADE, PixelCNN, PixelRNN, WaveNet

slides pt. I
slides pt. II

Pen & Paper AR

exercise
exercise solution
7 03.04.
04.04.
-- No Class or Tutorials (Easter) --
8 10.04.
11.04.
Generative Models Pt. III: Normalizing Flows and Invertible Neural Networks

slides NF Pt. I
slides NF Pt. II

Class Tips for Training II

slides

Pen & Paper NF

exercise
exercise solution
9 17.04.
18.04.
Generative Models Pt. IV: Implicit Models

Generative Adversarial Networks & Co

slides GAN Pt. I
slides GAN Pt. II

Tutorial Exercise Discussion and Cluster

slides Backprop&CNN
slides cluster

Pen & Paper GAN

exercise
exercise solution
10 24.04.
25.04.
Generative Models Pt. V: GAN Applications and Diffusion Models
slides GAN Pt. III
slides Diffusion Models

Tutorial RNN&VAE

slides RNN&VAE

Pen & Paper class Diffusion Models

exercise
exercise solution
11 02.05.
Reinforcement Learning
slides RL pt. I

Tutorial AR&NF

slides AR&NF

Pen & Paper RL

exercise
exercise solutions

Exercise Sessions

Please refer to the schedule above for an overview of the planned exercise slots. We will have three different types of activities in the exercise sessions:

  1. Tutorial: Interactive programming tutorial in Python taught by a TA. Code will be made available.
  2. Class: Lecture-style class taught by a TA to give you some tips on how to train your neural network in practice.
  3. Pen & Paper: Pen & paper exercises that are not graded but help you prepare for the written exam. Solutions will be published on the website a week after release and discussed in the exercise session if desired.


Project

Overview

There will be a multi-week project that gives you the opportunity to gain hands-on experience with training a neural network for a concrete application.

The project grade will be determined by two factors: 1) a competitive part based on how well your model fares compared to your fellow students' models, and 2) the idea/novelty/innovativeness of your approach, based on a written report to be handed in by the project deadline. For each project, baselines will be available that guarantee a certain grade for the competitive part if you surpass them. The competition will be hosted on an online platform.

Check out the project descriptions here (you will need to log in with your ETH LDAP).


Exam

To give you a rough idea of what to expect in the exam, we have released a mock exam, which you can download here:

Cheat sheet policy for the final exam: 2 double-sided sheets of A4 paper (4 sides in total) OR 4 single-sided sheets (still 4 sides in total). Notes can be in digital (printed) or handwritten form. It is not allowed to copy images or similar content into digital notes, or to make digital notes excessively small; the font size of digital notes should be no smaller than 11pt.


Registration as Non-primary Target Group

Registrations have been closed.