Last answered: 26 Aug 2024
Posted on: 22 Aug 2024

Resolved: Number of PCA components

Why did you choose only 2 components for PCA? Choosing only 2 means the retained variance is below 80%, so how is this a fair comparison between the two methods?
3 answers (1 marked as helpful)
Instructor
Posted on: 22 Aug 2024

Hi Doaa!
Thanks for reaching out!
In the lesson Analysis of LDA, we find that the variance retained by the first two eigenvectors is around 95%, well above the required 80%. That's why we opt for these two eigenvectors: we project the data onto the two linear discriminants for LDA and, to keep the comparison consistent, use two principal components for PCA as well.
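For reference, here is a minimal sketch of how the variance retained by the two linear discriminants can be inspected. It is not the lesson's exact code: the wine dataset and the variable names are placeholders for the lesson's data.

```python
# Minimal sketch (placeholder data, not the lesson's code): check how much
# between-class variance the first two linear discriminants retain.
from sklearn.datasets import load_wine
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_scaled = StandardScaler().fit_transform(X)

lda = LinearDiscriminantAnalysis(n_components=2)
X_lda = lda.fit_transform(X_scaled, y)

print(lda.explained_variance_ratio_)                 # ratio per discriminant
print("Total retained:", lda.explained_variance_ratio_.sum())
```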
Hope this helps.
Best,
Ivan
Posted on: 22 Aug 2024

Thank you for replying, but I still don't get it. When I used PCA() without any predetermined arguments and then checked the explained variance ratio to see how many components are needed to retain at least 80% of the variance, the answer was 6, not 2.
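As an illustration, the check described above might look like the following sketch (the wine dataset stands in for the lesson's data):

```python
# Minimal sketch (placeholder data): fit PCA with no fixed number of
# components and count how many are needed to reach 80% explained variance.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, _ = load_wine(return_X_y=True)
X_scaled = StandardScaler().fit_transform(X)

pca = PCA().fit(X_scaled)
cumulative = np.cumsum(pca.explained_variance_ratio_)
print(cumulative)
print("Components needed for >= 80% variance:", np.argmax(cumulative >= 0.80) + 1)
```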
Instructor
Posted on: 26 Aug 2024

Hi Doaa!
I see where the confusion might be coming from. Let me clarify.
We compare the classifier's training and testing times, as well as its accuracy, using both PCA and LDA with two components each. This lets us evaluate the classifier's performance when the data is reduced to the same number of dimensions by both methods.
While it's true that two principal components don't capture 80% of the variance, our goal was to see how the classifier performs in a reduced two-dimensional space for both PCA and LDA. This approach shows how well each method retains the features that matter for classification and how each affects training and testing speed. Reducing the data to the same number of dimensions with both methods is what makes the comparison fair.
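To make the setup concrete, here is a minimal sketch of that kind of comparison. It is not the course's exact code: the wine dataset, the train/test split, and the logistic regression classifier are placeholder choices.

```python
# Minimal sketch (placeholder dataset and classifier): reduce the data to two
# dimensions with PCA and with LDA, then compare training time and accuracy.
import time
from sklearn.datasets import load_wine
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

for name, reducer in [("PCA", PCA(n_components=2)),
                      ("LDA", LinearDiscriminantAnalysis(n_components=2))]:
    X_tr = reducer.fit_transform(X_train, y_train)   # PCA ignores the labels
    X_te = reducer.transform(X_test)
    clf = LogisticRegression(max_iter=1000)
    start = time.perf_counter()
    clf.fit(X_tr, y_train)
    train_time = time.perf_counter() - start
    print(f"{name}: accuracy={clf.score(X_te, y_test):.3f}, "
          f"train time={train_time:.4f} s")
```

Note that PCA ignores the class labels while LDA uses them, which is why LDA can pack more class-relevant information into just two dimensions even when two principal components fall short of 80% of the total variance.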
Hope this helps.
Best,
Ivan
