
Linear Algebra and Feature Selection
with Aleksandar Samsiev and Ivan Manov

Providing you with the theoretical and practical foundations you need to apply machine learning techniques with confidence and understanding.

3 hours 32 lessons

32 High-Quality Lessons

0 Practical Tasks

3 Hours of Content

Certificate of Achievement

Linear Algebra and Feature Selection is the course that provides you with the knowledge you need to grasp the mathematical processes behind machine learning algorithms for dimensionality reduction. Mastering the fundamentals of linear algebra will help you develop in-demand practical skills, such as building your own algorithms or choosing the most appropriate existing ones for a specific task. The techniques you will learn, feature extraction and feature selection, will enable you to handle high-dimensional data efficiently. In addition, you will get familiar with the mathematical concepts behind PCA and LDA and practice applying these types of analysis using the corresponding Python libraries.

Working with machine learning is not only about applying algorithms; it’s about understanding how they work under the hood. This course gives you insight into the dimensionality reduction algorithms PCA and LDA and explains the mathematics behind them.

Understand the math behind machine learning models

Become capable of solving linear equations

Calculate eigenvalues and eigenvectors

Become familiar with basic and advanced linear algebra notions

Determine the linear independence of a set of vectors

Carry out Principal Component Analysis

Perform Linear Discriminant Analysis

Perform Dimensionality Reduction in Python

Compare the performance of PCA and LDA for classification with SVMs
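The last objective above can be sketched in code. Below is a hedged example of comparing PCA and LDA as preprocessing steps for an SVM classifier; the dataset (scikit-learn's built-in wine data) and the hyperparameters are illustrative assumptions, not the course's exact setup:

```python
# Compare PCA and LDA as dimensionality reduction steps before an SVM.
# Dataset and hyperparameters are illustrative, not the course's own.
from sklearn.datasets import load_wine
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)

results = {}
for name, reducer in [("PCA", PCA(n_components=2)),
                      ("LDA", LinearDiscriminantAnalysis(n_components=2))]:
    # Scale first (both reducers are scale-sensitive), reduce, then classify
    model = make_pipeline(StandardScaler(), reducer, SVC(kernel="rbf"))
    model.fit(X_train, y_train)
    results[name] = model.score(X_test, y_test)
    print(f"{name}: test accuracy = {results[name]:.3f}")
```

Note that LDA uses the class labels during the reduction step while PCA does not, which is why LDA often separates the classes better with the same number of components.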

- Linear Algebra Essentials: 15 Lessons, 76 Min
Here, we’ll cover the linear algebra concepts behind the machine learning algorithms for dimensionality reduction. We'll learn about vectors and matrices, linear equations, eigenvalues and eigenvectors, and more.
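As a taste of what these lessons cover, here is a minimal NumPy sketch of solving a linear system and computing eigenvalues and eigenvectors; the matrices are made up for illustration, not taken from the course:

```python
# Solving A*x = b and finding eigenpairs with NumPy (illustrative values).
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# Solve the linear system A*x = b (A is invertible, so the solution is unique)
x = np.linalg.solve(A, b)
print(x)                      # [0.8 1.4]
assert np.allclose(A @ x, b)  # check: A @ x reproduces b

# Eigenvalues and eigenvectors of A; each column v of eigvecs
# satisfies A @ v = lambda * v for the matching eigenvalue lambda
eigvals, eigvecs = np.linalg.eig(A)
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```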

What Does the Course Cover (Free)
Why Linear Algebra? (Free)
Solving Quadratic Equations (Free)
Vectors (Free)
Matrices (Free)
The Transpose of Vectors and Matrices, the Identity Matrix (Free)
Linear Independence and Linear Span of Vectors (Free)
Basis of a Vector Space, Determinant of a Matrix, Inverse of a Matrix (Free)
Solving Equations of the Form A*x=b (Free)
The Gauss Method (Free)
Other Solutions to the Equation A*x=b (Free)
Determining Linear Independence of a Random Set of Vectors
Eigenvalues and Eigenvectors
Calculating Eigenvalues
Calculating Eigenvectors

- Dimensionality Reduction Motivation: 2 Lessons, 7 Min
This section explains the intricacies of the dimensionality reduction process and clarifies why this technique is essential when working with large datasets.
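One way to see why dimensionality reduction matters is the "curse of dimensionality": as the number of dimensions grows, pairwise distances between random points concentrate, so "near" and "far" neighbours become hard to tell apart. A small illustrative simulation (not from the course):

```python
# As dimensionality grows, the ratio of the nearest to the farthest
# distance from a reference point approaches 1 (distance concentration).
import numpy as np

rng = np.random.default_rng(0)
ratios = {}
for d in (2, 10, 100, 1000):
    X = rng.random((500, d))              # 500 random points in the unit cube
    dists = np.linalg.norm(X[0] - X[1:], axis=1)
    ratios[d] = dists.min() / dists.max() # approaches 1 as d grows
    print(f"d={d:5d}  min/max distance ratio = {ratios[d]:.2f}")
```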

Feature Selection, Feature Extraction, and Dimensionality Reduction
The Curse of Dimensionality

- Principal Component Analysis (PCA): 4 Lessons, 31 Min
In this part of the course, we explore the Principal Component Analysis (PCA) - one of the most widely used algorithms for dimensionality reduction. We'll demonstrate a practical example combining both feature extraction and feature selection techniques to achieve the desired goal - reducing the number of dimensions in our dataset.
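In scikit-learn, the core PCA workflow looks roughly like the sketch below. The California estates example from the course is not reproduced here; the iris dataset stands in purely for illustration:

```python
# PCA sketch: standardize, project onto principal components, and
# inspect how much variance each component explains.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, _ = load_iris(return_X_y=True)
X_std = StandardScaler().fit_transform(X)   # PCA is scale-sensitive

pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X_std)        # 4 features -> 2 components

print(X_reduced.shape)                      # (150, 2)
print(pca.explained_variance_ratio_)        # roughly [0.73, 0.23]
```

The `explained_variance_ratio_` attribute is what guides the feature selection step: you keep only as many components as needed to retain most of the variance.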

Principal Component Analysis – Overview
A Step-by-Step Explanation of PCA on California Estates – Example
The Theory Behind PCA
PCA Covariance Matrix in Jupyter – Analysis and Interpretation

- Linear Discriminant Analysis (LDA): 11 Lessons, 60 Min
In this section, we'll cover another dimensionality reduction technique called Linear Discriminant Analysis (LDA). Here, we'll go through another practical example, showing the methodology behind LDA and its efficiency. We'll also make a comparison between this algorithm and PCA, introducing the advantages of both approaches.
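A quick sketch of LDA as a supervised reduction step is shown below. Scikit-learn's built-in wine dataset (13 features, 3 classes) stands in for the course's wine quality example; unlike PCA, LDA uses the class labels and can produce at most (number of classes − 1) components:

```python
# LDA sketch: supervised dimensionality reduction with scikit-learn.
# load_wine is a stand-in for the course's wine quality dataset.
from sklearn.datasets import load_wine
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_wine(return_X_y=True)

# 3 classes -> at most 2 discriminant components
lda = LinearDiscriminantAnalysis(n_components=2)
X_reduced = lda.fit_transform(X, y)   # 13 features -> 2 discriminants

print(X_reduced.shape)                # (178, 2)
```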

Overall Mean and Class Means
Linear Discriminant Analysis – Overview
LDA: Calculating Within- and Between-Class Scatter Matrices
A Step-by-Step Explanation of LDA on a Wine Quality Dataset – Example
Calculating the Within- and Between-Class Scatter Matrices
Calculating Eigenvectors and Eigenvalues for the LDA
Analysis of LDA
LDA vs. PCA
Setting Up the Classifier to Compare LDA and PCA
Coding the Classifier for LDA and PCA
Analysis of the Training and Testing Times for the Classifier and Its Accuracy
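The within- and between-class scatter matrices from the lessons above can be computed by hand with NumPy. This is a minimal sketch, again using scikit-learn's wine dataset as a stand-in: S_W sums the scatter of each class around its own mean, while S_B measures how far the class means sit from the overall mean.

```python
# Within- (S_W) and between-class (S_B) scatter matrices behind LDA.
import numpy as np
from sklearn.datasets import load_wine

X, y = load_wine(return_X_y=True)
overall_mean = X.mean(axis=0)

n_features = X.shape[1]
S_W = np.zeros((n_features, n_features))   # within-class scatter
S_B = np.zeros((n_features, n_features))   # between-class scatter

for c in np.unique(y):
    X_c = X[y == c]                        # samples of class c
    mean_c = X_c.mean(axis=0)              # class mean
    S_W += (X_c - mean_c).T @ (X_c - mean_c)
    diff = (mean_c - overall_mean).reshape(-1, 1)
    S_B += len(X_c) * (diff @ diff.T)

# Sanity check: within- plus between-class scatter equals total scatter
S_T = (X - overall_mean).T @ (X - overall_mean)
print(np.allclose(S_W + S_B, S_T))         # True
```

LDA's discriminant directions are then the leading eigenvectors of the matrix inv(S_W) @ S_B, which is why the curriculum pairs the scatter matrices with an eigenvalue calculation.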

Student feedback


“Linear algebra and feature selection have shaped the world of machine learning and can be extremely useful in identifying the most efficient approach to manipulating data. This course will help you understand the mathematical concepts behind ML and how to integrate them into the feature selection activity.”

Worked at 365
