Backpropagation

Disclaimer: This thread was imported from the old forum, so some formatting may not display correctly. The original thread begins in the second post of this thread.

Backpropagation
Hi :slight_smile:

I’m trying to recap the meaning of the backpropagation formula. I know in general how backpropagation works, but I just don’t get the formula from the script. Can somebody maybe explain it to me?

Thank you!


Dennis uploaded a few of my slides (to StudOn, I guess, but I am not sure); they will probably help you understand it!


Hi :slight_smile:

just to clarify, did I get the terms on slide 42 right?

- DeltaBefore: the loss in the layer before (right side)
- WeightsBefore: the weights of the layer we are currently updating, before the update
- errorBefore: the gradient of the loss with respect to the input
- activationOut': the derivative of the activation function
- in: the activations of the layer on the left side
- in*delta: the gradient of the loss with respect to the weights

Did I get everything right?
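For what it's worth, the relation between those terms can be sketched numerically. This is only an illustrative sketch, not the formula from the script: the loss (squared error), the activation (sigmoid), and all variable names are my own assumptions, chosen to mirror the terms in the post.

```python
import numpy as np

# Sketch of one backprop step for a single dense layer, out = sigmoid(in @ W).
# Roughly: delta = errorBefore * activationOut'(z), dL/dW = in^T @ delta
# (the "in*delta" term), and dL/d(in) = delta @ W^T is what gets passed on
# to the layer before. Loss and activation here are assumptions for the demo.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 3))        # "in": activations of the layer on the left
W = rng.normal(size=(3, 2))        # weights of the layer we are updating

z = x @ W                          # pre-activation
out = sigmoid(z)                   # forward pass through the activation

# Assume a squared-error loss L = 0.5 * sum((out - t)^2) for some target t.
t = np.zeros((1, 2))
error = out - t                    # dL/d(out): the incoming error signal

delta = error * out * (1 - out)    # times activationOut'; sigmoid'(z) = out*(1-out)
grad_W = x.T @ delta               # dL/dW: the "in*delta" term
grad_x = delta @ W.T               # dL/d(in): propagated to the previous layer

# Sanity check: compare grad_W against finite differences of the loss.
eps = 1e-6
num = np.zeros_like(W)
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        Wp = W.copy(); Wp[i, j] += eps
        Wm = W.copy(); Wm[i, j] -= eps
        Lp = 0.5 * np.sum((sigmoid(x @ Wp) - t) ** 2)
        Lm = 0.5 * np.sum((sigmoid(x @ Wm) - t) ** 2)
        num[i, j] = (Lp - Lm) / (2 * eps)

print(np.allclose(grad_W, num, atol=1e-6))
```

The finite-difference check at the end is a handy way to verify any backprop derivation: if the analytic gradient matches the numerical one, the chain-rule bookkeeping is consistent.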


Anybody?