Last answered: 05 Mar 2023

Posted on: 03 Jan 2023


Dimensions of vectors and matrices in the model

In the lecture, it is said that the dimensions of every object in the model for multiple inputs and outputs, y = xw + b, are:
y -> n x m,
xw -> n x m,
b -> 1 x m.
In linear algebra we cannot add two matrices with different dimensions, therefore b should be n x m too.
Am I wrong?
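
For reference, here is a minimal NumPy sketch of the shapes as I understand them (the sizes n, k, m below are just example values I picked):

```python
import numpy as np

n, k, m = 4, 3, 2          # n samples, k input features, m outputs (example sizes)

x = np.random.rand(n, k)   # inputs:  n x k
w = np.random.rand(k, m)   # weights: k x m
b = np.random.rand(1, m)   # biases:  1 x m  <- the shape my question is about

y = x @ w + b              # xw is n x m, b is 1 x m, and NumPy still accepts the sum
print(y.shape)             # (4, 2), i.e. n x m as stated in the lecture
```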

1 answer (0 marked as helpful)
Posted on: 05 Mar 2023


The equation y = xw + b refers to a single linear model with one feature (x). If you have more features (x1, x2), then you have w1, w2, but always only one b.

If you have multiple equations in one model (y1, y2, i.e. outputs), then the same logic extends: they are different equations working with the same feature set.

So if you ask about the weight matrix, its size is: the number of features (number of xs) x the number of equations (number of ys).

If you ask about the bias matrix, its size is equal to the number of equations (ys), since every equation has only one bias.

Therefore, the number of ys = the number of bs.
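
Here is a minimal NumPy sketch of that idea (the sizes are just example values): the weight matrix has one column per equation, there is one bias per equation, and the single 1 x m bias row is simply reused (broadcast) for every one of the n observations, so it never needs to be stored as an n x m matrix:

```python
import numpy as np

n, k, m = 4, 3, 2                # n observations, k features, m outputs (example sizes)

x = np.random.rand(n, k)         # feature set:   n x k
w = np.random.rand(k, m)         # weight matrix: k x m (No of features x No of equations)
b = np.random.rand(1, m)         # bias row:      1 x m (one bias per equation/output)

y_broadcast = x @ w + b                   # NumPy reuses the single bias row for every observation
y_expanded  = x @ w + np.tile(b, (n, 1))  # the explicit n x m bias the question expected

print(np.allclose(y_broadcast, y_expanded))  # True: both give the same n x m output
```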


It is that simple, no matter how many letters of the alphabet are involved in the explanation :)

