Posted on: 13 Mar 2023


With softmax, must the last and second-to-last layers have the same width?

Good morning. If the last layer uses the softmax, which takes a whole vector as input, does that mean the last layer must have the same number of neurons as the second-to-last layer? For example, 3 neurons in the last layer and 3 neurons in the second-to-last layer. It seems we could not have 4 neurons in the second-to-last layer and 3 in the last (or vice versa), since then the outputs would not sum to 1. Am I right? Thank you
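
For context, here is a minimal NumPy sketch of the shapes involved (the 4- and 3-neuron sizes are taken from the example above; all names and values are illustrative). Softmax is applied to the logits that the last layer itself produces, and the last layer's weight matrix maps from the penultimate width to the output width:

import numpy as np

# Softmax: normalizes a vector of logits so its entries sum to 1.
def softmax(z):
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()

rng = np.random.default_rng(0)

# Illustrative second-to-last layer with 4 neurons.
penultimate = rng.standard_normal(4)

# Dense last layer with 3 neurons: its weight matrix is 3x4, so it
# maps the 4 penultimate activations down to 3 logits.
W = rng.standard_normal((3, 4))
b = rng.standard_normal(3)
logits = W @ penultimate + b   # shape (3,)

probs = softmax(logits)        # shape (3,)
print(probs, probs.sum())      # three probabilities summing to 1.0

Running this prints three probabilities that sum to 1 even though the second-to-last layer has 4 neurons: the 3x4 weight matrix absorbs the change in width before the softmax is applied.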

0 answers (0 marked as helpful)
