12.07.2022

The teacher explains too fast and does not give you time to process the information. Also, I think there should be a more thorough explanation of eigenvectors, eigenvalues, and linear span.

with Aleksandar Samsiev and Ivan Manov

Providing you with the theoretical and practical foundations you need to apply machine learning techniques with confidence and understanding.

3 hours · 32 lessons

Linear Algebra and Feature Selection is the course that provides you with the knowledge you need to grasp the mathematics behind machine learning algorithms for dimensionality reduction. Mastering the fundamentals of linear algebra will help you develop in-demand practical skills, such as building your own algorithms or choosing the most appropriate existing one for the task you need to solve. The techniques you will learn, feature extraction and feature selection, will enable you to handle high-dimensional data efficiently. In addition, you will become familiar with the mathematical concepts behind PCA and LDA and practice applying both types of analysis using the corresponding Python libraries.

32 High Quality Lessons

0 Practical Tasks

3 Hours of Video

Certificate of Achievement

Working with machine learning is not only about applying algorithms; it’s also about understanding how they work. This course gives you insight into the dimensionality reduction algorithms PCA and LDA and explains the mathematics behind them.

Understand the math behind machine learning models

Solve systems of linear equations

Calculate eigenvalues and eigenvectors

Become familiar with basic and advanced linear algebra notions

Determine the linear independence of a set of vectors

Carry out Principal Component Analysis

Perform Linear Discriminant Analysis

Perform Dimensionality Reduction in Python

Compare the performance of PCA and LDA for classification with SVMs
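The last outcome, comparing PCA and LDA as preprocessing steps for an SVM classifier, can be sketched in a few lines. This is a minimal illustration assuming scikit-learn and its built-in wine dataset, not necessarily the data or exact setup used in the course:

```python
from sklearn.datasets import load_wine
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

scores = {}
for name, reducer in [("PCA", PCA(n_components=2)),
                      ("LDA", LinearDiscriminantAnalysis(n_components=2))]:
    # Standardize, reduce to 2 dimensions, then classify with an SVM.
    model = make_pipeline(StandardScaler(), reducer, SVC(kernel="rbf"))
    model.fit(X_tr, y_tr)
    scores[name] = model.score(X_te, y_te)  # test-set accuracy
```

Because LDA uses the class labels when choosing its projection, it often yields higher accuracy than unsupervised PCA at the same number of dimensions, though the outcome depends on the dataset.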

- Linear Algebra Essentials: 15 lessons, 76 min
Here, we’ll cover the linear algebra concepts behind the machine learning algorithms for dimensionality reduction. We'll learn about vectors and matrices, linear equations, eigenvalues and eigenvectors, and more.
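To make these concepts concrete, here is a small sketch of solving A*x=b and computing eigenvalues and eigenvectors, assuming NumPy (the matrix and vector below are arbitrary illustrative values, not taken from the course):

```python
import numpy as np

# A small 2x2 system A*x = b, solved directly.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])
x = np.linalg.solve(A, b)  # x satisfies A @ x == b

# Eigenvalues and eigenvectors of A.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column v of `eigenvectors` satisfies A @ v == eigenvalue * v.
v = eigenvectors[:, 0]
```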

What Does the Course Cover (Free)
Why Linear Algebra? (Free)
Solving Quadratic Equations (Free)
Vectors (Free)
Matrices (Free)
The Transpose of Vectors and Matrices, the Identity Matrix (Free)
Linear Independence and Linear Span of Vectors (Free)
Basis of a Vector Space, Determinant of a Matrix, Inverse of a Matrix (Free)
Solving Equations of the Form A*x=b (Free)
The Gauss Method (Free)
Other Solutions to the Equation A*x=b (Free)
Determining Linear Independence of a Random Set of Vectors
Eigenvalues and Eigenvectors
Calculating Eigenvalues
Calculating Eigenvectors

- Dimensionality Reduction Motivation: 2 lessons, 7 min
This section explains the intricacies of the dimensionality reduction process and clarifies why this technique is essential when working with large datasets.
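One way to see the "curse of dimensionality" in action: as the number of dimensions grows, distances between random points become nearly indistinguishable, which degrades distance-based methods. A hedged NumPy illustration (the point counts and dimensions are arbitrary choices, not the course's):

```python
import numpy as np

rng = np.random.default_rng(0)

def distance_spread(dim, n_points=100):
    # Ratio of std to mean of pairwise distances between random points;
    # it shrinks as dimensionality grows ("distance concentration").
    points = rng.random((n_points, dim))
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    upper = dists[np.triu_indices(n_points, k=1)]  # each pair once
    return upper.std() / upper.mean()

low = distance_spread(2)     # 2 dimensions: distances vary a lot
high = distance_spread(200)  # 200 dimensions: distances concentrate
```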

Feature Selection, Feature Extraction, and Dimensionality Reduction
The Curse of Dimensionality

- Principal Component Analysis (PCA): 4 lessons, 31 min
In this part of the course, we explore Principal Component Analysis (PCA), one of the most widely used algorithms for dimensionality reduction. We'll demonstrate a practical example combining both feature extraction and feature selection techniques to achieve the desired goal: reducing the number of dimensions in our dataset.
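As a minimal sketch of what applying PCA in Python can look like, assuming scikit-learn and its built-in iris dataset (not the California estates data used in the course):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Standardize the 4 iris features, then project onto 2 principal components.
X = load_iris().data
X_std = StandardScaler().fit_transform(X)

pca = PCA(n_components=2)
X_2d = pca.fit_transform(X_std)

# Fraction of the total variance each retained component explains.
variance_retained = pca.explained_variance_ratio_.sum()
```

Internally, PCA diagonalizes the covariance matrix of the standardized data: the principal components are its eigenvectors, ordered by eigenvalue, which is exactly the covariance-matrix analysis the lessons below walk through.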

Principal Component Analysis – Overview
A Step-by-Step Explanation of PCA on California Estates – Example
The Theory Behind PCA
PCA Covariance Matrix in Jupyter – Analysis and Interpretation

- Linear Discriminant Analysis (LDA): 11 lessons, 60 min
In this section, we'll cover another dimensionality reduction technique called Linear Discriminant Analysis (LDA). We'll go through another practical example, showing the methodology behind LDA and its efficiency. We'll also compare this algorithm with PCA and outline the advantages of each approach.
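A brief sketch of LDA in Python, assuming scikit-learn and its built-in wine dataset (which is not necessarily the wine quality data used in the course's example):

```python
from sklearn.datasets import load_wine
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# LDA is supervised: unlike PCA, it uses the class labels y to find the
# directions that maximize between-class scatter relative to within-class
# scatter. It can produce at most (n_classes - 1) components.
X, y = load_wine(return_X_y=True)  # 13 features, 3 classes

lda = LinearDiscriminantAnalysis(n_components=2)
X_2d = lda.fit_transform(X, y)
```

Under the hood this solves the generalized eigenproblem for the within- and between-class scatter matrices that the lessons below compute step by step.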

Overall Mean and Class Means
Linear Discriminant Analysis – Overview
LDA: Calculating Within- and Between-Class Scatter Matrices
A Step-by-Step Explanation of LDA on a Wine Quality Dataset – Example
Calculating the Within- and Between-Class Scatter Matrices
Calculating Eigenvectors and Eigenvalues for the LDA
Analysis of LDA
LDA vs. PCA
Setting Up the Classifier to Compare LDA and PCA
Coding the Classifier for LDA and PCA
Analysis of the Training and Testing Times for the Classifier and Its Accuracy

