Why is the validation loss always less than the previous loss?


https://colab.research.google.com/drive/1DzDru4OYy_BixKr4xUWmG6FQXM0oEI_W
 
Hello,
Please find the link attached. In the notebook, the Iris dataset is used with the code provided for the business case, except for one step (i.e., balancing the dataset), because all three types of flowers are equally distributed.
I noticed that the validation loss is always less than the previous loss, and because of that the model runs for the maximum number of epochs. Is this correct?
Best Regards,
Keerthi 
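
For reference, a rough, self-contained sketch of the setup described above (Iris dataset, no balancing step, early stopping on the validation loss, a maximum number of epochs). It assumes TensorFlow/Keras and scikit-learn, and the layer sizes and variable names are made up for illustration; the actual code is in the linked Colab notebook.

```python
import tensorflow as tf
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Iris data set: 150 samples, 3 equally represented classes,
# which is why the balancing step can be skipped.
inputs, targets = load_iris(return_X_y=True)
train_x, val_x, train_y, val_y = train_test_split(
    inputs, targets, test_size=0.2, random_state=42, stratify=targets)

# A small classifier, roughly in the spirit of the business-case model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dense(3, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Early stopping: training ends as soon as the validation loss stops improving.
early_stopping = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=0)

model.fit(train_x, train_y,
          epochs=100,                          # maximum number of epochs
          validation_data=(val_x, val_y),
          callbacks=[early_stopping],
          verbose=2)
```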
 

1 Answer

365 Team

Hi Keerthi,
The reason is that training stops as soon as the validation loss starts increasing (i.e., stops improving). Therefore, for as long as training is still running, the validation loss is always lower than it was on the previous epoch. So great observation!
Best,
Iliya
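
To make the stopping rule concrete, here is a minimal, self-contained sketch of the logic in plain Python (the loss values are made up purely for illustration): each epoch's validation loss is compared with the previous one, and training stops the first time it is not lower, which is why every validation loss recorded during training is below its predecessor.

```python
# Simulated validation losses, one per epoch (made-up values for illustration).
simulated_val_losses = [0.90, 0.62, 0.45, 0.38, 0.41, 0.35]

previous_loss = float('inf')
for epoch, val_loss in enumerate(simulated_val_losses, start=1):
    if val_loss > previous_loss:
        # The validation loss went up, so training would stop here.
        print(f"Epoch {epoch}: validation loss rose "
              f"({val_loss:.2f} > {previous_loss:.2f}), stop training.")
        break
    print(f"Epoch {epoch}: validation loss {val_loss:.2f} (still decreasing)")
    previous_loss = val_loss
```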
