Exam protocol pr-2018-02-22

Examiner: Prof. Nöth

Atmosphere: Friendly.

Questions:

K Nearest Neighbor (from exercises)

  • Q: He drew a coordinate system with some points. What step is necessary before applying KNN itself?
  • A: Normalize the data, e.g. to [-1, 1], so that no single feature dominates the distance computation.
  • Q: How does it work?
  • A: Classify a sample by the majority label among its k nearest training samples, measured with some distance (e.g. Euclidean).
  • Q: How did we implement it? (see the sketch after this list)
  • Q: We learned of something that is related to KNN …
  • A: Bayes Classifier.
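
A minimal sketch of the normalization step and KNN classification (function names and the choice of k are illustrative, not the exercise code):

    import numpy as np

    def normalize(X):
        """Scale each feature to [-1, 1] (min-max normalization)."""
        lo, hi = X.min(axis=0), X.max(axis=0)
        return 2 * (X - lo) / (hi - lo) - 1

    def knn_predict(X_train, y_train, x, k=3):
        """Majority vote among the k training samples closest to x."""
        dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
        nearest = np.argsort(dists)[:k]               # indices of the k closest
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        return labels[np.argmax(counts)]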

Bayes

  • Q: Formulate the Bayes rule and name all parts.
  • A: p(y|x) = p(x|y) p(y) / p(x), i.e. posterior = class-conditional pdf (likelihood) × prior / evidence.
  • Q: Formulate the decision function.
  • A: Decide for the class with the maximum posterior: y* = argmax_y p(y|x).
  • Q: Relation to KNN:
  • A: The asymptotic error rate of the nearest-neighbour classifier is at most twice the Bayes error rate, i.e. the error of the optimal classifier (see http://cseweb.ucsd.edu/~elkan/151/nearestn.pdf for more info).
  • A: A Gaussian Bayes classifier with identity covariance matrices (and equal priors) reduces to a nearest-mean classifier: decide for the class whose mean is closest in Euclidean distance.
  • Q: How can you calculate the class-conditional pdf?
  • A: Assume a Gaussian distribution and estimate its mean and covariance via Maximum Likelihood (see the sketch after this list).
  • Q: What do you do if the distribution is not known?
  • A: Model it as a Gaussian Mixture Model (GMM).
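
A short sketch of the ML estimate and the resulting Bayes decision rule (the dict-based parameter layout is just an assumption for illustration):

    import numpy as np
    from scipy.stats import multivariate_normal

    def fit_gaussian_ml(X):
        """ML estimates of a Gaussian: sample mean and (biased, 1/N) covariance."""
        return X.mean(axis=0), np.cov(X, rowvar=False, bias=True)

    def bayes_decide(x, params, priors):
        """Pick the class with maximal posterior, via log p(x|y) + log p(y).
        params: {class: (mu, sigma)}, priors: {class: p(y)}."""
        scores = {y: multivariate_normal.logpdf(x, mu, sigma) + np.log(priors[y])
                  for y, (mu, sigma) in params.items()}
        return max(scores, key=scores.get)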

Gaussian Mixture Models

  • Q: What are the parameters?
  • A: For every component: mean, covariance matrix and mixture weight (the relative portion of samples belonging to it).
  • Q: How can you initialize the parameters?
  • A: Draw random subsets of the samples and estimate the parameters of each component from them.
  • Q: What problem can this cause?
  • A: The training (EM algorithm) may converge only to a local maximum of the likelihood.
  • Q: What can you do then?
  • A: Restart with other random initializations and keep the solution with the highest likelihood found.
  • Q: Is there a better way to initialize the parameters?
  • A: Apply k-means clustering first and estimate the parameters from every resulting cluster; this usually yields initial values close to the correct parameters (a minimal EM sketch follows this list).
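
A compact EM sketch for fitting a GMM, using the random-sample initialization discussed above (illustrative only; a real implementation would add a convergence check and the k-means initialization):

    import numpy as np
    from scipy.stats import multivariate_normal

    def gmm_em(X, k, n_iter=50, seed=0):
        rng = np.random.default_rng(seed)
        n, d = X.shape
        # Init: k random samples as means, global covariance, uniform weights
        mu = X[rng.choice(n, size=k, replace=False)]
        sigma = np.array([np.cov(X, rowvar=False)] * k)
        w = np.full(k, 1.0 / k)
        for _ in range(n_iter):
            # E-step: responsibilities gamma[i, j] = p(component j | x_i)
            gamma = np.column_stack([w[j] * multivariate_normal.pdf(X, mu[j], sigma[j])
                                     for j in range(k)])
            gamma /= gamma.sum(axis=1, keepdims=True)
            # M-step: re-estimate weights, means and covariances
            nj = gamma.sum(axis=0)
            w = nj / n
            mu = (gamma.T @ X) / nj[:, None]
            for j in range(k):
                diff = X - mu[j]
                sigma[j] = (gamma[:, j, None] * diff).T @ diff / nj[j]
                sigma[j] += 1e-6 * np.eye(d)  # regularize against singular covariances
        return w, mu, sigma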

SVM

  • Q: Formulate the optimization problem of hard-margin and soft-margin SVM.
  • A: Hard margin: min over w, b of ½‖w‖² s.t. y_i (wᵀx_i + b) ≥ 1 for all i. Soft margin: min ½‖w‖² + C Σ_i ξ_i s.t. y_i (wᵀx_i + b) ≥ 1 − ξ_i and ξ_i ≥ 0 (see the sketch after this list).
  • Q: What are the slack variables for?
  • A: Slack variables ξ_i relax the margin constraints so the problem stays feasible for non-separable data: ξ_i measures how far a sample would have to be moved to reach the margin border. Samples with ξ_i > 0 violate the margin and become support vectors.
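
A direct transcription of the soft-margin primal into code, here as a sketch with cvxpy (assumes labels y_i ∈ {−1, +1}; names are illustrative):

    import cvxpy as cp
    import numpy as np

    def soft_margin_svm(X, y, C=1.0):
        """Solve min 1/2 ||w||^2 + C * sum(xi)  s.t.  y_i (w^T x_i + b) >= 1 - xi_i, xi_i >= 0."""
        n, d = X.shape
        w, b, xi = cp.Variable(d), cp.Variable(), cp.Variable(n)
        objective = cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi))
        constraints = [cp.multiply(y, X @ w + b) >= 1 - xi, xi >= 0]
        cp.Problem(objective, constraints).solve()
        return w.value, b.value, xi.value  # samples with xi > 0 are margin violators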