The teacher explains too fast and does not give you time to process the information. Also, I think eigenvectors, eigenvalues, and linear span deserve a more thorough explanation.
Linear Algebra and Feature Selection is the course that provides you with the knowledge you need to grasp the math behind machine learning algorithms for dimensionality reduction. Mastering the fundamentals of linear algebra will help you develop in-demand practical skills, such as building your own algorithms or choosing the most appropriate existing ones for the task you need to solve. The techniques you will learn, feature extraction and feature selection, will enable you to handle high-dimensional data efficiently. In addition, you will get familiar with the mathematical concepts behind PCA and LDA, and practice applying these types of analysis using the corresponding Python libraries.
Working with machine learning is not only about applying algorithms; it's also about understanding how they work under the hood. This course gives you insight into the dimensionality reduction algorithms PCA and LDA and explains the math behind them.
Here, we'll cover the linear algebra concepts behind the machine learning algorithms for dimensionality reduction. We'll learn about vectors and matrices, linear equations, eigenvalues and eigenvectors, and more.

Lessons:
- What Does the Course Cover (Free)
- Why Linear Algebra? (Free)
- Solving Quadratic Equations (Free)
- Vectors (Free)
- Matrices (Free)
- The Transpose of Vectors and Matrices, the Identity Matrix (Free)
- Linear Independence and Linear Span of Vectors (Free)
- Basis of a Vector Space, Determinant of a Matrix, Inverse of a Matrix (Free)
- Solving Equations of the Form A*x=b (Free)
- The Gauss Method (Free)
- Other Solutions to the Equation A*x=b (Free)
- Determining Linear Independence of a Random Set of Vectors
- Eigenvalues and Eigenvectors
- Calculating Eigenvalues
- Calculating Eigenvectors
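The core ideas listed above, eigenvalues, eigenvectors, and solving systems of the form A*x=b, can be tried out directly in NumPy. This is a minimal illustrative sketch (not course material): the matrix and vector here are made up for the example.

```python
import numpy as np

# A small symmetric matrix chosen for the example: its eigenvalues
# are real and its eigenvectors are orthogonal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition: for each pair, A @ v = lambda * v.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining equation for the first eigenpair.
v = eigenvectors[:, 0]
lam = eigenvalues[0]
assert np.allclose(A @ v, lam * v)

# Solving A*x = b, the kind of system the Gauss method lessons cover.
# NumPy uses an LU factorization internally rather than hand elimination.
b = np.array([3.0, 3.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
```

For this particular matrix the eigenvalues work out to 1 and 3, and the system 2x + y = 3, x + 2y = 3 has the solution x = y = 1.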
This section explains the intricacies of the dimensionality reduction process and clarifies why this technique is essential when working with large datasets.

Lessons:
- Feature Selection, Feature Extraction, and Dimensionality Reduction
- The Curse of Dimensionality
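One way to see the curse of dimensionality that this section discusses: as the number of dimensions grows, pairwise distances between random points concentrate, so "near" and "far" become hard to distinguish. A small NumPy sketch (illustrative only, not from the course):

```python
import numpy as np

rng = np.random.default_rng(0)

def distance_contrast(n_points, n_dims):
    """Relative spread (max - min) / min of distances from the origin
    for random points in the unit hypercube."""
    points = rng.random((n_points, n_dims))
    dists = np.linalg.norm(points, axis=1)
    return (dists.max() - dists.min()) / dists.min()

# In low dimensions, some points are much closer than others.
low = distance_contrast(1000, 2)

# In high dimensions, distances concentrate around a common value,
# so the relative contrast shrinks dramatically.
high = distance_contrast(1000, 1000)

assert high < low
```

This distance concentration is one reason distance-based methods degrade on high-dimensional data, motivating dimensionality reduction.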
In this part of the course, we explore Principal Component Analysis (PCA), one of the most widely used algorithms for dimensionality reduction. We'll demonstrate a practical example combining both feature extraction and feature selection techniques to achieve the desired goal: reducing the number of dimensions in our dataset.

Lessons:
- Principal Component Analysis – Overview
- A Step-by-Step Explanation of PCA on California Estates – Example
- The Theory Behind PCA
- PCA Covariance Matrix in Jupyter – Analysis and Interpretation
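The workflow taught here, standardize, examine the covariance matrix, keep the components that explain most of the variance, can be sketched with scikit-learn. The synthetic data below stands in for the course's California estates example, which isn't reproduced here:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic data: two strongly correlated features plus one
# independent one (a stand-in for a real dataset).
rng = np.random.default_rng(42)
latent = rng.normal(size=(500, 1))
X = np.hstack([latent + 0.1 * rng.normal(size=(500, 1)),
               latent + 0.1 * rng.normal(size=(500, 1)),
               rng.normal(size=(500, 1))])

# Standardize first: PCA is sensitive to feature scale.
X_std = StandardScaler().fit_transform(X)

# The covariance matrix is what PCA diagonalizes; its eigenvectors
# become the principal components.
cov = np.cov(X_std, rowvar=False)

# Keep just enough components to explain 95% of the variance.
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X_std)

# The two correlated features collapse onto a single principal
# component, so fewer than 3 dimensions remain.
assert X_reduced.shape[1] < X.shape[1]
```

Passing a float to `n_components` tells scikit-learn's `PCA` to choose the smallest number of components whose cumulative explained variance reaches that fraction.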
In this section, we'll cover another dimensionality reduction technique called Linear Discriminant Analysis (LDA). Here, we'll go through another practical example, showing the methodology behind LDA and its efficiency. We'll also compare this algorithm with PCA, introducing the advantages of both approaches.

Lessons:
- Overall Mean and Class Means
- Linear Discriminant Analysis – Overview
- LDA: Calculating Within- and Between-Class Scatter Matrices
- A Step-by-Step Explanation of LDA on a Wine Quality Dataset – Example
- Calculating the Within- and Between-Class Scatter Matrices
- Calculating Eigenvectors and Eigenvalues for the LDA
- Analysis of LDA
- LDA vs. PCA
- Setting Up the Classifier to Compare LDA and PCA
- Coding the Classifier for LDA and PCA
- Analysis of the Training and Testing Times for the Classifier and Its Accuracy
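The LDA-vs-PCA comparison described above can be sketched with scikit-learn. This uses the wine dataset bundled with scikit-learn (13 features, 3 classes) as a stand-in; the course's own wine quality example may use different data and a different classifier:

```python
from sklearn.datasets import load_wine
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, random_state=0, stratify=y)

# LDA is supervised: it maximizes between-class scatter relative to
# within-class scatter, and with 3 classes yields at most 2 components.
lda_clf = make_pipeline(StandardScaler(),
                        LinearDiscriminantAnalysis(n_components=2),
                        LogisticRegression(max_iter=1000))

# PCA is unsupervised: its components maximize variance and ignore
# the class labels entirely.
pca_clf = make_pipeline(StandardScaler(),
                        PCA(n_components=2),
                        LogisticRegression(max_iter=1000))

lda_acc = lda_clf.fit(X_train, y_train).score(X_test, y_test)
pca_acc = pca_clf.fit(X_train, y_train).score(X_test, y_test)
print(f"LDA accuracy: {lda_acc:.3f}, PCA accuracy: {pca_acc:.3f}")
```

Because LDA uses the labels, its 2-D projection tends to separate the classes better than PCA's, though the exact accuracies depend on the split and the downstream classifier.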
with Aleksandar Samsiev and Ivan Manov