Why is the validation loss always less than the previous loss?
https://colab.research.google.com/drive/1DzDru4OYy_BixKr4xUWmG6FQXM0oEI_W
Hello,
Please find the link attached. In the notebook, the Iris dataset is used with the code provided for the business case, except for one step (i.e., balancing the dataset), because all three types of flowers are already equally distributed.
I noticed that the validation loss is always less than the previous loss, and because of that the model runs for the maximum number of epochs. Is this correct?
Best Regards,
Keerthi
1 answer (0 marked as helpful)
Hi Keerthi,
The reason is that once the validation loss stops decreasing (i.e., starts to increase), training stops. Therefore, while training is still running, the validation loss is always lower than in the previous epoch. So great observation!
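In case it helps to see the mechanism concretely, here is a minimal, self-contained sketch of early stopping on the Iris data. It is not taken from your notebook; the layer sizes, patience value, and train/validation split are just illustrative assumptions. Training may run for up to the maximum number of epochs, but it stops as soon as the validation loss stops improving.

```python
# A minimal sketch (not the course notebook itself) of early stopping based on
# the validation loss, assuming TensorFlow and scikit-learn are available as in
# the linked Colab environment.
import tensorflow as tf
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Load the Iris data and split off a validation set.
inputs, targets = load_iris(return_X_y=True)
x_train, x_val, y_train, y_val = train_test_split(
    inputs, targets, test_size=0.2, random_state=42)

# A small classifier for the 3 flower classes (sizes chosen for illustration).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu'),
    tf.keras.layers.Dense(3, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Stop once the validation loss fails to decrease (patience=2 tolerates two
# "bad" epochs); otherwise training runs for the full 100 epochs.
early_stopping = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=2)

model.fit(x_train, y_train,
          epochs=100,
          validation_data=(x_val, y_val),
          callbacks=[early_stopping],
          verbose=2)
```

With a dataset as easy as Iris, the validation loss can keep decreasing for a long time, so training may indeed reach the maximum number of epochs, which is what you observed.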
Best,
Iliya