Posted on: 07 Dec 2021

How do you know what width to make your hidden layers?

Why are the hidden layers 9 wide? Is that just an arbitrary width that you chose, or is there some significance to that number? How do you know what width to make your hidden layers?

1 answer (0 marked as helpful)
Posted on: 07 Dec 2021

It is chosen completely arbitrarily. As you will see later in the course, we will actually use hidden layers that are 50 or 100 units wide; in one exercise, the width is 5000. The number of hidden units in the hidden layers is a hyperparameter. Similar to the learning rate (which you have seen in the minimal example), we preset the hyperparameters and see what the algorithm learns. Based on that, we later fine-tune them (change them, trying to get a better model).
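To make this concrete, here is a minimal sketch (not the exact course code) of how the width shows up as a single number you pick when defining a simple feed-forward model with tf.keras. The values hidden_layer_size = 50, the 10 input features, and the 2 output classes are placeholders for illustration, not values from the course.

import tensorflow as tf

# The width of the hidden layers is just a number we pick up front (a hyperparameter).
hidden_layer_size = 50  # could be 9, 50, 100, 5000 ... nothing forces a particular value

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),                            # 10 input features (placeholder)
    tf.keras.layers.Dense(hidden_layer_size, activation='relu'),   # first hidden layer
    tf.keras.layers.Dense(hidden_layer_size, activation='relu'),   # second hidden layer
    tf.keras.layers.Dense(2, activation='softmax'),                # 2 output classes (placeholder)
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.summary()  # the layer shapes change as you vary hidden_layer_size

In practice, you would train the model with a few different widths (say 9, 50, 100), compare validation performance, and keep the width that works best.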

Once you reach the coding lectures, you will see the choice of width in practice and it will all be clear. For now, just know that 9 is completely arbitrarily chosen (so that it fits the screen nicely when we explain neural networks).
