Last answered: 23 Mar 2020
Posted on: 23 Mar 2020
Confused about terms in backpropagation calculation

On the Backpropagation Mathematics tab, it goes into the derivation of the update rule for the output layer. Right before section 5, I was wondering how it got from (y_j - t_j)\, y_j(1 - y_j)\, h_i to \delta_j h_i (not sure if we can input equations here). I think I'm unclear on how \delta_j is defined: I assumed it was the error y_j - t_j, but then there's an extra factor of y_j(1 - y_j). Thanks!

3 answers (0 marked as helpful)
Instructor
Posted on: 23 Mar 2020

Hi Emily,
What you are referring to is the derivative of the L2-norm, I believe. Either way, sending screenshots of the exact problems would be much better!
Best,
The 365 Team
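
For reference, here is a sketch of where \delta_j comes from, assuming (as the form (y_j - t_j)\, y_j(1 - y_j)\, h_i suggests) a sigmoid output activation and a squared-error loss; the course's exact notation may differ slightly.

With L = \frac{1}{2} \sum_j (y_j - t_j)^2, \quad y_j = \sigma(z_j), \quad z_j = \sum_i w_{ij} h_i, the chain rule gives

\frac{\partial L}{\partial w_{ij}} = \frac{\partial L}{\partial y_j} \cdot \frac{\partial y_j}{\partial z_j} \cdot \frac{\partial z_j}{\partial w_{ij}} = (y_j - t_j) \cdot y_j(1 - y_j) \cdot h_i.

Defining \delta_j = (y_j - t_j)\, y_j(1 - y_j), i.e. the error multiplied by the sigmoid derivative rather than the raw error y_j - t_j, collapses the gradient to \delta_j h_i.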
Posted on: 23 Mar 2020
https://imgur.com/a/GRQEtiw

^ Here is a link to a screenshot. I understood the separate components, but not how they fit together.
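
Below is a minimal numerical sketch of the same output-layer step, assuming a sigmoid activation and squared-error loss; the variable names (h, w, t, delta) are illustrative and not taken from the course material:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
h = rng.random(4)                    # hidden-layer activations h_i
w = rng.random((4, 3))               # output-layer weights w_ij
t = np.array([0.0, 1.0, 0.0])        # targets t_j

y = sigmoid(h @ w)                   # output activations y_j

# delta_j = (y_j - t_j) * y_j * (1 - y_j): error times the sigmoid derivative
delta = (y - t) * y * (1 - y)

# dL/dw_ij = delta_j * h_i, computed for all i, j at once as an outer product
grad_w = np.outer(h, delta)

learning_rate = 0.1
w -= learning_rate * grad_w          # gradient-descent update for the output layer

The point of the delta notation is that the same factor delta_j is reused when the error is propagated back to the hidden layer, so writing the gradient as delta_j h_i keeps the backpropagation equations compact.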