21.07.2023
“The content is really good, but it should be presented with more detailed explanations, more examples, and at a slower pace. The presenter simply goes too fast and skips some steps.”
This course provides you with the theoretical and practical foundations you need to apply machine learning techniques with confidence and understanding.
Linear Algebra and Feature Selection is the course that gives you the knowledge you need to grasp the math behind machine learning algorithms for dimensionality reduction. Mastering the fundamentals of linear algebra will help you develop in-demand practical skills, such as building your own algorithms or choosing the most appropriate existing one for a specific task. The techniques you will learn, feature extraction and feature selection, will enable you to handle high-dimensional data efficiently. In addition, you will get familiar with the mathematical concepts behind PCA and LDA and practice applying both types of analysis using the corresponding Python libraries.
Working with machine learning is not only about applying algorithms. It's about understanding how they work under the hood. This course gives you insight into the dimensionality reduction algorithms PCA and LDA and explains the math behind them.
Here, we’ll cover the linear algebra concepts behind the machine learning algorithms for dimensionality reduction. We'll learn about vectors and matrices, linear equations, eigenvalues and eigenvectors, and more.
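To illustrate the kind of concept this section covers, here is a minimal sketch of eigenvalues and eigenvectors computed with NumPy. The specific matrix is an arbitrary example, not one from the course.

```python
import numpy as np

# A small symmetric matrix, similar in shape to a covariance matrix
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# np.linalg.eigh handles symmetric matrices and returns
# eigenvalues in ascending order plus the matching eigenvectors
eigenvalues, eigenvectors = np.linalg.eigh(A)

# Verify the defining property A v = lambda v for each pair
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(eigenvalues)
```

The eigenvectors of a covariance matrix point along the directions of greatest variance in the data, which is exactly why this piece of linear algebra reappears later in PCA.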
This section explains the intricacies of the dimensionality reduction process and clarifies why this technique is essential when working with large datasets.
In this part of the course, we explore Principal Component Analysis (PCA), one of the most widely used algorithms for dimensionality reduction. We'll demonstrate a practical example combining both feature extraction and feature selection techniques to achieve the desired goal: reducing the number of dimensions in our dataset.
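The workflow described above can be sketched in a few lines with scikit-learn. The Iris dataset and the `PCA` and `StandardScaler` classes used here are illustrative assumptions, not necessarily the exact example from the course.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Load a small 4-dimensional dataset (150 samples, 4 features)
X = load_iris().data

# Standardize features so each contributes equally to the variance
X_scaled = StandardScaler().fit_transform(X)

# Extract the two principal components capturing the most variance
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X_scaled)

print(X_reduced.shape)                # (150, 2)
print(pca.explained_variance_ratio_)  # variance captured per component
```

Inspecting `explained_variance_ratio_` is how you decide, in practice, how many components are worth keeping.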
In this section, we'll cover another dimensionality reduction technique called Linear Discriminant Analysis (LDA). Here, we'll go through another practical example, showing the methodology behind LDA and its efficiency. We'll also compare this algorithm with PCA, outlining the advantages of each approach.
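A minimal sketch of the PCA-versus-LDA contrast, again assuming scikit-learn and the Iris dataset rather than the course's own example:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# PCA is unsupervised: it maximizes variance and ignores class labels
X_pca = PCA(n_components=2).fit_transform(X)

# LDA is supervised: it uses the labels y to maximize class separability,
# and can produce at most (number of classes - 1) components
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

print(X_pca.shape, X_lda.shape)  # (150, 2) (150, 2)
```

The key design difference is the extra `y` argument: LDA needs labeled data, which is also why it is limited to one fewer component than the number of classes.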
Student feedback
“Linear algebra and feature selection have shaped the world of machine learning and can be extremely useful in identifying the most efficient approach to manipulating data. This course will help you understand the mathematical concepts behind ML and how to integrate them into the feature selection activity.”
Worked at 365
Linear Algebra and Feature Selection
with Aleksandar Samsiev and Ivan Manov