The 365 Data Science team is proud to invite you to our community forum: a well-built system to support your queries and questions, and to give you the chance to share your knowledge and help others on their path to becoming Data Science specialists.
Ask
Anybody can ask a question
Answer
Anybody can answer
Vote
The best answers are voted up and moderated by our team

Confused about terms in backpropagation calculation


0
Votes
3
Answers

On the Backpropagation Mathematics tab, it goes into the derivation of the update rule for the output layer. Right before section 5, I was wondering how it got from (y_j - t_j) y_j (1 - y_j) h_i to \delta_j h_i (not sure if we can input equations here). I think I'm unclear on how \delta_j is defined. I assumed it was the error y_j - t_j, but then there's an extra factor y_j (1 - y_j). Thanks!

3 Answers

365 Team
0
Votes

Hi Emily,
The extra factor y_j(1 - y_j) is the derivative of the sigmoid activation. \delta_j is not the error alone: it is the derivative of the L2 loss, (y_j - t_j), multiplied by the derivative of the sigmoid, y_j(1 - y_j), so \delta_j = (y_j - t_j) y_j (1 - y_j).
Either way, sending screenshots of the exact problems would be much better!
Best,
The 365 Team
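A minimal numerical sketch may help here. It assumes a sigmoid output unit and an L2 loss (consistent with the course derivation); the activation values below are made up for illustration, not taken from the lesson. It computes \delta_j as the loss derivative times the sigmoid derivative, then the weight gradient \delta_j h_i:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative values (assumed, not from the course):
h = [0.5, 0.2, 0.9]      # hidden-layer activations h_i
w = [0.1, -0.3, 0.8]     # weights w_ij into output unit j
t = 1.0                  # target t_j

net = sum(wi * hi for wi, hi in zip(w, h))
y = sigmoid(net)         # output y_j

# delta_j folds together the L2-loss derivative (y_j - t_j)
# and the sigmoid derivative y_j * (1 - y_j):
delta = (y - t) * y * (1 - y)

# gradient of the loss w.r.t. each weight w_ij is delta_j * h_i:
grad = [delta * hi for hi in h]
```

You can check this against a finite-difference estimate of the gradient of 0.5 * (y_j - t_j)^2: the two agree, which confirms that the y_j(1 - y_j) factor genuinely belongs in \delta_j.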

0
Votes

View post on imgur.com

^Here is a link to a screenshot of the image. I understood the separate components, but not how they combine.
