
Deep Learning - Summer Term 2018

Examiner: Prof. Maier

  • Give an overview of the topics of the lecture (I drew a mindmap of all lecture topics)
  • What problems can't be solved by a single perceptron? → XOR, because it is not linearly separable.
  • What can we do to solve it? (An MLP with one hidden layer works; see the XOR/backpropagation sketch after this list.)
  • Universal approximation theorem. How well can we approximate? How can we approximate better?
  • Why do we need deeper models when we can already model any function with one layer?
  • Explain backpropagation (a worked example on XOR is sketched after this list)
  • What loss functions did we use? → Cross-entropy loss (classification, multinoulli distribution), L2 loss (regression, Gaussian distribution)
  • A short derivation of the loss functions was wanted (see the maximum-likelihood sketch after this list).
  • We talked about hinge loss in the lecture. Can you say something about this? → SVM (also included in the loss sketch below)
  • Can you explain some different optimizers? (I talked about Momentum, NAG and Adam; their update rules are sketched after this list)
  • Name some milestone architectures and say what new ingredient they introduced (I talked about/mentioned LeNet, AlexNet, Network in Network, VGG, ResNet)
  • Then he wanted me to explain ResNet in detail and asked some more questions about it (a residual block is sketched after this list)
  • Explain LSTM. He wanted me to draw the block diagram (the gate equations are given below)
  • Explain YOLO.
  • How does segmentation work with deep learning?
  • Explain autoencoders
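
The following sketches are my own supplementary notes, not the answers given in the exam. First, a minimal NumPy example for the XOR and backpropagation questions: a single perceptron cannot separate XOR, but a small MLP with one hidden layer trained by hand-derived backpropagation can (all variable names and hyperparameters below are my own choices).

```python
import numpy as np

# XOR data: not linearly separable, so a single perceptron cannot fit it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # backward pass (cross-entropy + sigmoid gives the simple error term p - y)
    dz2 = (p - y) / len(X)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0, keepdims=True)
    dz1 = (dz2 @ W2.T) * h * (1 - h)                  # sigmoid derivative
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0, keepdims=True)

    # gradient descent step
    W2, b2 = W2 - lr * dW2, b2 - lr * db2
    W1, b1 = W1 - lr * dW1, b1 - lr * db1

print(np.round(p, 2))  # should approach [[0], [1], [1], [0]]
```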
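
For the loss-function derivation, a short maximum-likelihood sketch (notation is my own): cross-entropy follows from the negative log-likelihood of a multinoulli distribution over one-hot labels, the L2 loss from a Gaussian likelihood with fixed variance, and the hinge loss is the SVM margin loss.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% one-hot label y, softmax output \hat{y}; Gaussian with fixed variance \sigma^2;
% for the hinge loss the label is y \in \{-1,+1\} with score f(x)
\begin{align*}
  L_{\mathrm{CE}}    &= -\log \prod_k \hat{y}_k^{\,y_k} = -\sum_k y_k \log \hat{y}_k \\
  L_{2}              &= -\log \mathcal{N}\!\left(y;\, \hat{y},\, \sigma^2 I\right)
                        = \tfrac{1}{2\sigma^2}\,\lVert y - \hat{y} \rVert_2^2 + \mathrm{const} \\
  L_{\mathrm{hinge}} &= \max\bigl(0,\; 1 - y\, f(x)\bigr)
\end{align*}
\end{document}
```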
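
For the optimizer question, a rough sketch of the update rules for Momentum, NAG and Adam (pure NumPy; `grad` is an assumed callable returning the gradient at the given parameters, and the state variables are carried by the caller).

```python
import numpy as np

def sgd_momentum(w, v, grad, lr=0.01, mu=0.9):
    # classic momentum: accumulate a velocity and step along it
    v = mu * v - lr * grad(w)
    return w + v, v

def nag(w, v, grad, lr=0.01, mu=0.9):
    # Nesterov accelerated gradient: gradient is evaluated at the look-ahead point
    v = mu * v - lr * grad(w + mu * v)
    return w + v, v

def adam(w, m, s, t, grad, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: bias-corrected first and second moment estimates of the gradient
    g = grad(w)
    m = beta1 * m + (1 - beta1) * g
    s = beta2 * s + (1 - beta2) * g * g
    m_hat = m / (1 - beta1 ** t)          # t starts at 1
    s_hat = s / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(s_hat) + eps)
    return w, m, s
```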
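
For the ResNet discussion, a minimal sketch of a basic residual block, assuming PyTorch (my own toy code, not a reference implementation): the skip connection lets the block learn only the residual and keeps gradients flowing through very deep networks.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    """Basic ResNet block: two 3x3 convolutions plus an identity skip connection."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # identity shortcut: output = F(x) + x, then the nonlinearity
        return F.relu(out + x)

# quick shape check
block = ResidualBlock(64)
print(block(torch.randn(1, 64, 32, 32)).shape)  # torch.Size([1, 64, 32, 32])
```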
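
For the LSTM block diagram, the standard gate equations (one way to annotate the drawing; W, U, b are learned parameters, \sigma the logistic sigmoid, \odot the element-wise product).

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
  f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f)          && \text{forget gate} \\
  i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i)          && \text{input gate} \\
  o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o)          && \text{output gate} \\
  \tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c)   && \text{candidate cell state} \\
  c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t    && \text{cell state update} \\
  h_t &= o_t \odot \tanh(c_t)                         && \text{hidden state / output}
\end{align*}
\end{document}
```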

Prof. Maier was very friendly. For preparation I wrote my own summary, which was quite helpful.