This website is no longer updated and is only kept for archiving purposes. Please visit the new site at
https://roettgerlab.science

Spring 2018 / DM863
Deep Learning

General Information

Machine learning has become a part of our everyday lives, from simple product recommendations to personal electronic assistants to self-driving cars. More recently, through the advent of potent hardware and cheap computational power, “Deep Learning” has become a popular and powerful tool for learning from complex, large-scale data.

In this course, we will discuss the fundamentals of deep learning and its application to a variety of fields. We will learn about the power, but also the limitations, of deep neural networks. By the end of the course, students will have gained significant familiarity with the subject and will be able to apply the learned techniques to a broad range of fields.

Mainly, the following topics will be covered:

  • feedforward neural networks
  • recurrent neural networks
  • convolutional neural networks
  • backpropagation algorithm
  • regularization
  • factor analysis

Lectures

#  | Date            | Content                        | Slides | Comments
1  | Tue, 06.02.2018 | Introduction                   | here   |
2  | Thu, 08.02.2018 | Recap: Math                    | here   |
3  | Tue, 13.02.2018 | Machine Learning Basics        | here   |
4  | Thu, 15.02.2018 | Feed Forward Networks: Part I  | here   |
5  | Tue, 20.02.2018 | Continuation of last lecture   | --     |
6  | Thu, 22.02.2018 | Feed Forward Networks: Part II | here   |
7  | Tue, 27.02.2018 | Regularization                 | here   |
8  | Thu, 01.03.2018 | Cancelled                      |        |
9  | Tue, 06.03.2018 | Convolutional Networks         | here   |
10 | Thu, 08.03.2018 | Continuation of CNN            | --     |
11 | Tue, 13.03.2018 | RNN                            | here   |
12 | Thu, 15.03.2018 | Optimization Strategies        | here   |
13 | Tue, 20.03.2018 | Completion of last lecture     | --     | --
14 | Thu, 22.03.2018 | Cancelled                      |        |

Exercises

Mode of the Exercises

You will receive a total of 3 mini-projects. You are expected to work on each project in teams of up to 4 people and to solve it within two weeks. After one week, you will have a Q&A session with your TA. The week after, please send your solutions to the TA before the exercise session. During that session you will discuss the solutions together with the TA, and the next mini-project will be handed out.

The first exercise session (Wed, 14.02.2018) will be a general introduction to Theano and is not part of the mini-projects.
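
To give an idea of what the mini-projects involve, below is a minimal, illustrative sketch of logistic regression in Theano, the topic of the first mini-project. The data dimensions, learning rate, and variable names are assumptions chosen for illustration and are not taken from the project description.

    import numpy as np
    import theano
    import theano.tensor as T

    # symbolic inputs: a minibatch of examples and their integer class labels
    X = T.matrix('X')
    y = T.ivector('y')

    # model parameters as shared variables (sizes are illustrative)
    n_in, n_out = 784, 10
    W = theano.shared(np.zeros((n_in, n_out), dtype=theano.config.floatX), name='W')
    b = theano.shared(np.zeros(n_out, dtype=theano.config.floatX), name='b')

    # class probabilities and negative log-likelihood of the true labels
    p_y = T.nnet.softmax(T.dot(X, W) + b)
    nll = -T.mean(T.log(p_y)[T.arange(y.shape[0]), y])

    # gradients and a compiled function performing one plain gradient-descent step
    lr = 0.1
    g_W, g_b = T.grad(nll, [W, b])
    train = theano.function(inputs=[X, y], outputs=nll,
                            updates=[(W, W - lr * g_W), (b, b - lr * g_b)])
    predict = theano.function(inputs=[X], outputs=T.argmax(p_y, axis=1))

Repeatedly calling train on minibatches then performs gradient descent on W and b, and predict returns the most likely class for each input row.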

#  | Date            | Questions                                  | Download                                   | Solutions
1  | Wed, 14.02.2018 | Small Introduction to Theano               | --                                         | --
2  | Wed, 21.02.2018 | Logistic Regression in Theano: Q&A session | Project Description, Template              | --
3  | Wed, 28.02.2018 | Logistic Regression in Theano: Discussion  | Solutions                                  | --
4  | Wed, 07.03.2018 | Convolutional Networks with Keras: Q&A     | Project Description, Template, Keras       |
5  | Wed, 14.03.2018 | Convolutional Networks: Discussion         | Solution                                   |
6  | Wed, 21.03.2018 | RNN & LSTM                                 | Project Description, Template, FASTA File  |
7  | Tue, 03.04.2018 | RNN & LSTM: Discussion                     | Solutions                                  | solution
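
For orientation on the second mini-project (convolutional networks with Keras), here is a minimal, illustrative Keras sketch of a small CNN classifier. The input shape, layer sizes, and number of classes are assumptions chosen for illustration; the actual specification is in the downloadable project description above.

    from keras.models import Sequential
    from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

    # two convolution/pooling blocks followed by a dense classifier
    model = Sequential([
        Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
        MaxPooling2D(pool_size=(2, 2)),
        Conv2D(64, (3, 3), activation='relu'),
        MaxPooling2D(pool_size=(2, 2)),
        Flatten(),
        Dense(128, activation='relu'),
        Dense(10, activation='softmax'),
    ])

    # categorical cross-entropy for one-hot labels; Adam as the optimizer
    model.compile(optimizer='adam', loss='categorical_crossentropy',
                  metrics=['accuracy'])
    # model.fit(x_train, y_train, batch_size=64, epochs=5, validation_split=0.1)

The convolution/pooling stack extracts local features while the final dense layers perform the classification; in the project, the input shape and output size of course have to match the data set provided.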

Procedure of the oral exam

The exam will last about 15-20 minutes. At the beginning, one topic from the list below will be drawn at random. For each topic, the examinee should be prepared to give a short presentation of about 5 minutes. It is allowed to bring one page of hand-written notes (DIN A4 or US-Letter, one-sided) for each of the topics. The examinee will have 2 minutes to briefly study the notes for the drawn topic before the presentation. The notes may be consulted during the presentation if needed, but doing so will negatively influence the evaluation of the examinee's performance. During the presentation, only the blackboard may be used (you cannot use overhead transparencies, for instance).

After the short presentation, additional questions will be asked, both about the topic of the presentation and about other topics in the curriculum.

Below is the list of possible topics together with some suggested content. The listed items are only suggestions; they are not necessarily complete, nor must everything be covered in the short presentation. It is the examinee's responsibility to gather and select the relevant information for each topic from the course material. On the course website you can find suggested readings for each of these topics. For topic 2, a small illustrative sketch of backpropagation is given after the list.

Topics for the Oral Exam:

  1. Feed-Forward Networks
    • Function Principle
    • Output Units
    • Hidden Units
    • ...
  2. Backpropagation
    • Function Principle
    • Computational Graphs
    • Backpropagation through time
    • ...
  3. Regularization
    • Over/Underfitting & Model Capacity
    • Parameter Penalties
    • Bagging
    • Dropout
    • ...
  4. Convolutional Neural Networks
    • Function Principle
    • Pooling
    • Initialization of the kernels
    • ...
  5. Recurrent Neural Networks
    • Function Principle
    • Problems with long term memory
    • Long Short Term Memory
    • ...
  6. Optimization for Neural Networks
    • Parameter Initialization
    • Adaptive Learning
    • Batch Normalization
    • Pre-training
    • ...
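
As a study aid for topic 2, here is a minimal numpy sketch of backpropagation through a tiny computational graph (one tanh hidden layer, squared-error loss). All sizes and values are illustrative; the derivation is the standard chain-rule computation covered in the course.

    import numpy as np

    rng = np.random.RandomState(0)

    # tiny network: x -> h = tanh(W1 x + b1) -> yhat = W2 h + b2, squared-error loss
    x, t = rng.randn(4), rng.randn(2)              # input and target (illustrative)
    W1, b1 = 0.1 * rng.randn(3, 4), np.zeros(3)
    W2, b2 = 0.1 * rng.randn(2, 3), np.zeros(2)

    # forward pass: every intermediate value is a node of the computational graph
    a1 = W1 @ x + b1
    h = np.tanh(a1)
    yhat = W2 @ h + b2
    loss = 0.5 * np.sum((yhat - t) ** 2)

    # backward pass: chain rule, node by node, starting from the loss
    d_yhat = yhat - t                       # dL/dyhat
    d_W2 = np.outer(d_yhat, h)              # dL/dW2
    d_b2 = d_yhat                           # dL/db2
    d_h = W2.T @ d_yhat                     # dL/dh
    d_a1 = d_h * (1.0 - np.tanh(a1) ** 2)   # tanh'(a) = 1 - tanh(a)^2
    d_W1 = np.outer(d_a1, x)                # dL/dW1
    d_b1 = d_a1                             # dL/db1

    # one gradient-descent step
    lr = 0.1
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

Each line of the backward pass applies the chain rule at one node of the computational graph; frameworks such as Theano automate exactly this bookkeeping.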

Materials

  • All lecture slides are relevant for the exam.
  • All readings noted in the lecture list are relevant for the exam.
  • The Deep Learning Book