With the softmax, must the last and second-to-last layers have the same width?
Good morning. If the last layer uses a softmax, then since softmax takes a whole vector as input, does that mean the number of neurons in the last layer must equal the number in the second-to-last layer? For example, 3 neurons in the last layer and 3 neurons in the second-to-last layer. We couldn't have, say, 4 neurons in the second-to-last layer and 3 neurons in the last layer (or vice versa), otherwise the outputs couldn't sum to 1. Am I right? Thank you.
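For context on the sum-to-1 reasoning above, here is a minimal Python sketch of softmax (the logit values are made up for illustration): it takes the vector produced by the last layer and returns a probability vector of the same length that sums to 1.

```python
import math

def softmax(logits):
    """Numerically stable softmax: maps a vector of logits to a
    same-length vector of probabilities that sums to 1."""
    m = max(logits)                              # subtract max for stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical outputs of a 3-neuron last layer
probs = softmax([2.0, 1.0, 0.1])
print(probs)        # three probabilities, same length as the input
print(sum(probs))   # sums to 1
```

Note that softmax itself does not constrain the previous layer's width: its input length is simply the number of neurons in the layer it is applied to.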
0 answers (0 marked as helpful)