Memory protocol (Gedächtnisprotokoll) for Deep Learning WS18/19
Overview of the lecture
Let’s start with the basics: what is the Rosenblatt perceptron?
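As a quick refresher (my own minimal sketch, not from the exam; all names are illustrative), the Rosenblatt perceptron predicts sign(wᵀx + b) and nudges the weights toward misclassified samples:

```python
import numpy as np

def perceptron_train(X, y, epochs=10, lr=1.0):
    """Rosenblatt perceptron. X: (n_samples, n_features), y in {-1, +1}."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            if y_i * (np.dot(w, x_i) + b) <= 0:  # misclassified sample
                w += lr * y_i * x_i              # classic perceptron update
                b += lr * y_i
    return w, b
```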
Universal approximation theorem:
- write down the equation (the activation function appears in it); see the reconstruction after this list
- how many neurons do we need if we want the loss to be exactly zero: infinitely many
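The one-hidden-layer approximator from the theorem is presumably the usual Cybenko/Hornik form (my reconstruction of the equation asked for above; φ is the activation function):

```latex
F(x) = \sum_{i=1}^{N} v_i \,\varphi\!\left(w_i^\top x + b_i\right),
\qquad
\sup_{x}\, \lvert F(x) - f(x) \rvert < \varepsilon
```

The theorem only guarantees that for every ε > 0 some finite N suffices; driving the error to exactly zero in general requires N → ∞, which matches the “infinity” answer.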
Activation functions:
- what functions are there
- the problem of the Rosenblatt perceptron (the sign activation function: its gradient is zero almost everywhere)
- how to fix the vanishing gradient (ReLU, then leaky ReLU); see the sketch after this list
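A small sketch of the activations discussed above (my own illustration): sign has zero gradient almost everywhere, so backpropagation learns nothing; ReLU restores a useful gradient for positive inputs; leaky ReLU also keeps a small slope for negative inputs so neurons cannot die completely.

```python
import numpy as np

def sign(x):
    return np.where(x >= 0, 1.0, -1.0)    # gradient 0 almost everywhere

def relu(x):
    return np.maximum(0.0, x)             # gradient 1 for x > 0, else 0

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)  # gradient alpha for x < 0
```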
Loss function
- how to derive the softmax (cross-entropy) loss / how to derive the L2 loss (assumptions, equations); see the sketch below
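Both derivations presumably follow the standard maximum-likelihood argument (my reconstruction): pick a probabilistic model for the labels, then minimize the negative log-likelihood.

```latex
% Gaussian noise assumption  =>  L2 loss
y = f(x; w) + \epsilon, \quad \epsilon \sim \mathcal{N}(0, \sigma^2)
\;\Rightarrow\;
-\log p(y \mid x) \;\propto\; \tfrac{1}{2}\,\lVert y - f(x; w) \rVert_2^2

% Categorical model with softmax outputs  =>  cross-entropy (softmax) loss
p(y = k \mid x) = \frac{e^{z_k}}{\sum_j e^{z_j}}
\;\Rightarrow\;
L = -\log p(y \mid x) = -z_y + \log \sum_j e^{z_j}
```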
Backpropagation
- chain rule, and how it actually works: ∂L/∂w = ∂L/∂f3 · ∂f3/∂f2 · ∂f2/∂w (see the numeric sketch below)
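A tiny numeric version of that product (entirely my own sketch): run the forward pass, then multiply the local derivatives back to front.

```python
import numpy as np

# L = f3(f2(f1(w))) with a scalar weight w
w  = 0.5
f1 = 2.0 * w          # df1/dw  = 2
f2 = f1 ** 2          # df2/df1 = 2*f1
f3 = np.tanh(f2)      # df3/df2 = 1 - f3**2
L  = f3               # dL/df3  = 1

# backward pass: chain rule, multiplied back to front
dL_dw = 1.0 * (1.0 - f3 ** 2) * (2.0 * f1) * 2.0
```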
Optimization
- Gradient descent equation
- Momentum equation (both equations are written out below)
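The two equations asked for are presumably the standard ones (my notation; η is the learning rate, μ the momentum coefficient):

```latex
% vanilla gradient descent
w_{t+1} = w_t - \eta \,\nabla_w L(w_t)

% gradient descent with momentum
v_{t+1} = \mu \, v_t - \eta \,\nabla_w L(w_t), \qquad
w_{t+1} = w_t + v_{t+1}
```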
Architectures
- Recurrent Neural Network; see the sketch after this list
- Inception block in GoogLeNet
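For the RNN question, the natural starting point is the vanilla recurrence h_t = tanh(W_x x_t + W_h h_{t-1} + b); a minimal sketch (all names are mine):

```python
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    # same weights reused at every time step; h carries state forward
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

rng = np.random.default_rng(0)
d_in, d_h = 3, 4
W_x = rng.normal(size=(d_h, d_in))
W_h = rng.normal(size=(d_h, d_h))
b, h = np.zeros(d_h), np.zeros(d_h)
for x_t in rng.normal(size=(5, d_in)):  # toy sequence of length 5
    h = rnn_step(x_t, h, W_x, W_h, b)
```

For the Inception block, the key point is that 1×1, 3×3 and 5×5 convolutions plus max pooling run in parallel and their outputs are concatenated along the channel dimension, with 1×1 convolutions used beforehand to reduce the channel count.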
Reinforcement Learning
- Markov decision process
- what do we do in reinforcement learning
- state-value function V
- action-value function Q
- how to update Q; see the sketch after this list
- how to find the optimal policy (acting greedily with respect to Q)
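The update asked about is presumably the tabular Q-learning / temporal-difference rule; a sketch (the Q table and its indexing are illustrative):

```python
import numpy as np

def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.99):
    # Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
    td_target = r + gamma * np.max(Q[s_next])
    Q[s, a] += alpha * (td_target - Q[s, a])

def greedy_policy(Q):
    # optimal policy once Q has converged: act greedily w.r.t. Q
    return np.argmax(Q, axis=1)
```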
Object detection
- Could you tell me about segmentation?
- how do we do segmentation?
- which network do we use for segmentation? (see the note after this list)
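The architectures usually expected here are fully convolutional networks (FCN) and U-Net: an encoder downsamples, a decoder upsamples back to full resolution, and (in U-Net) skip connections copy encoder features into the decoder. A toy encoder-decoder in PyTorch (my own sketch, not the lecture’s code, and without skip connections):

```python
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Per-pixel classification: output has one score map per class."""
    def __init__(self, in_ch=3, n_classes=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                     # downsample /2
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),  # upsample x2
            nn.Conv2d(16, n_classes, 1),                         # 1x1 conv -> logits
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

logits = TinySegNet()(torch.randn(1, 3, 64, 64))  # shape (1, 2, 64, 64)
```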
The atmosphere is very relaxed, and Prof. Maier pays attention to you the whole time. Both the professor and the assistant are nice and friendly. If you don’t know the answer to a question, just tell him you are not familiar with it, or try to say something related to it. Prof. Maier will then help you a little, rephrasing the question and giving hints so that you can understand it better. I would say it’s a fair exam; you won’t be punished for some specific point you don’t know about.