The Machine Learning Algorithms A-Z

with Jeff Li and Ken Jee
4.7/5
(582)

Master the core concepts of popular ML algorithms: understand when and how to apply different machine learning techniques effectively

5 hours of content · 8,282 students

$99.00

Lifetime access

Buy now
14-Day Money-Back Guarantee

What you get:

  • 5 hours of content
  • 108 Interactive exercises
  • World-class instructor
  • Closed captions
  • Q&A support
  • Future course updates
  • Course exam
  • Certificate of achievement

What You Learn

  • Acquire machine learning skills that bridge theoretical knowledge and practical application
  • Gain insight into the strength and limitations of various machine learning models
  • Build confidence in selecting the optimal machine learning algorithm needed for specific use cases
  • Understand fundamental machine learning theory that will help you achieve outstanding real-world results
  • Acquire the business intuition to discern whether a problem requires machine learning or can be solved through simpler analytical methods
  • Improve your career prospects with in-demand machine learning skills, essential for your success in an AI-driven world

Top Choice of Leading Companies Worldwide

Industry leaders and professionals globally rely on this top-rated course to enhance their skills.

Course Description

Looking to break into machine learning? Need to review the ins and outs of each algorithm? Preparing for an interview? Curious to see how these algorithms are applied to business? For ML practitioners, the true value of ML is not memorizing complicated formulas, nor using the latest, greatest deep learning architecture. It’s knowing when to use an algorithm and how to maximize the impact of that model. It’s knowing how to use these algorithms to solve REAL business problems. The true value of ML is not ML itself; it’s solving important business problems. Whether you need to build a forecasting model that forms the backbone of an ads business, a recommender system powering millions of purchases, or a fraud detection system that catches bad apples, this course will arm you with both the ML knowledge AND the know-how to apply it to your business problem. We don’t want you to leave this course just knowing ML. We want you to leave it as an ML practitioner.

Learn for Free

  • 1.1 Introduction (2 min)
  • 1.2 ML Algorithms course - GitHub repository (1 min)
  • 1.3 How to Use this Course (1 min)
  • 1.4 Types of ML Problems (1 min)
  • 1.6 Additional Resources (1 min)
  • 2.1 Linear Regression (1 min)

Curriculum

  • 1. Course Introduction
    5 Lessons 6 Min

    In this introductory section, you’ll learn how to use the course, the types of ML problems businesses need to solve, and where to find additional resources from the course authors.

    Introduction
    2 min
    ML Algorithms course - GitHub repository
    1 min
    How to Use this Course
    1 min
    Types of ML Problems
    1 min
    Additional Resources
    1 min
  • 2. Linear Regression
    17 Lessons 24 Min

    Linear regression is one of the most versatile models we review. It’s an exceptional framework for making predictions and for extracting insight into the relationships between variables (a short linear regression sketch follows the curriculum).

    Linear Regression
    1 min
    Real World Business Problems
    1 min
    Example: Linear Regression
    1 min
    Intuition: Linear Regression
    1 min
    Training Step-by-Step: Linear Regression
    2 min
    Prediction: Linear Regression
    1 min
    Assumptions: Linear Regression
    1 min
    Assumption #1: Model is linear in coefficients and error terms
    1 min
    Assumption #2: Homoscedasticity
    1 min
    Assumption #3: Multicollinearity
    3 min
    Assumption #4: Independence/Autocorrelation
    1 min
    Assumption #5: Normally Distributed Error Terms
    1 min
    Assumption #6: Outliers
    1 min
    Inference - Interpreting Output
    3 min
    AB Testing Example
    1 min
    ML Process: Linear Regression
    2 min
    Pros & Cons, When to Use
    2 min
  • 3. Ridge, Lasso, Elastic Net
    12 Lessons 14 Min

    One way to limit some of the drawbacks of linear regression is regularization. We can regularize a linear regression model using the ridge, lasso, and elastic net algorithms (see the regularization sketch after the curriculum).

    Ridge, Lasso, Elastic Net
    1 min
    Intuition: Ridge, Lasso, Elastic Net
    1 min
    Plain Definition: Ridge, Lasso, Elastic Net
    1 min
    Shrinkage Methods vs. Feature Selection
    1 min
    Step-by-Step Intuition: Ridge, Lasso, Elastic Net
    2 min
    Lasso Regression (L1)
    1 min
    Ridge Regression (L2)
    1 min
    ElasticNet (L1 + L2)
    1 min
    Determining the Degree of Regularization
    1 min
    Difference between Lasso & Ridge
    1 min
    Link to resources
    1 min
    When to use: Ridge, Lasso, Elastic Net
    2 min
  • 4. Logistic Regression
    14 Lessons 22 Min

    Like linear regression, logistic regression is one of the most powerful and straightforward models.

    Introduction: Logistic Regression
    1 min
    Example: Logistic Regression
    1 min
    Intuition: Logistic Regression
    2 min
    Real World Business Problems: Logistic Regression
    1 min
    What is Logit
    1 min
    Step-by-Step Prediction: Logistic Regression
    1 min
    Step-by-Step Training: Logistic Regression
    3 min
    Assumptions: Logistic Regression
    2 min
    Understanding Logistic Regression Output
    3 min
    Maximum Likelihood Explained
    2 min
    Log Loss
    1 min
    Predicting Multiple Classes using Multinomial Logistic Regression
    1 min
    ML Process: Logistic Regression
    2 min
    Pros & Cons, When to Use
    1 min
  • 5. Gradient Descent
    9 Lessons 10 Min

    Gradient descent is an optimization algorithm that powers many of our ML algorithms. Think of it as a helper algorithm that lets us find the best parameters for our ML model (a from-scratch gradient descent sketch follows the curriculum).

    Gradient Descent
    1 min
    Intuition: Gradient Descent
    1 min
    Plain Definition: Gradient Descent
    2 min
    Step-by-Step: Gradient Descent
    1 min
    Assumptions: Gradient Descent
    1 min
    Parameter Tuning (Step size, Alpha)
    1 min
    Gradient Descent Pros and Cons
    1 min
    Stochastic Gradient Descent
    1 min
    Pros and Cons: Gradient Descent
    1 min
  • 6. Decision Trees
    14 Lessons 18 Min

    Decision trees work for both classification and regression problems. While individual decision trees don’t typically produce the best predictive results in practice, they are highly interpretable (see the decision tree sketch after the curriculum).

    Decision Trees
    1 min
    Example: Decision Trees
    1 min
    Plain Explanation: Decision Trees
    1 min
    Different Components of Decision Trees Explained
    1 min
    Real World Business Example: Decision Trees
    1 min
    Assumptions: Decision Trees
    1 min
    Training Step-by-Step: Decision Trees
    4 min
    Prediction Step-by-Step: Decision Trees
    1 min
    Additional Metrics: Decision Trees
    1 min
    Tuning the Parameters: Decision Trees
    1 min
    ML Process: Decision Trees
    2 min
    Assumptions: Decision Trees
    1 min
    Pros and Cons: Decision Trees
    1 min
    When to Use Decision Trees
    1 min
  • 7. Random Forest
    14 Lessons 16 Min

    Random forest is one of the most popular models for classification and regression. It works exceedingly well on datasets with many categorical features or a mix of categorical and continuous variables (a random forest sketch follows the curriculum).

    Random Forest
    1 min
    Intuition: Random Forest
    1 min
    Example: Random Forest
    1 min
    Real World Business Problems: Random Forest
    1 min
    Plain Definition: Bagging
    1 min
    Where Bagging Fails
    2 min
    Plain Definition: Random Forest
    1 min
    Step-by-Step (Training): Random Forest
    1 min
    Step-by-Step (Prediction): Random Forest
    1 min
    How Random Forest gives us Feature Importance
    1 min
    Out of Bag Error
    1 min
    ML Process: Random Forest
    2 min
    When to use: Random Forest
    1 min
    Pros and Cons: Random Forest
    1 min
  • 8. Gradient Boosted Trees
    13 Lessons 17 Min

    Gradient-boosted trees are a way for our models to learn from their mistakes. Unlike random forest, where we grow trees in parallel and then aggregate the results, with gradient-boosted trees we grow the trees sequentially (see the boosting sketch after the curriculum).

    Gradient Boosted Trees
    1 min
    Example: Gradient Boosted Trees
    1 min
    Real World Business Problems: Gradient Boosted Trees
    1 min
    Plain Definition: Gradient Boosted Trees
    1 min
    Terminology: Gradient Boosted Trees
    1 min
    Assumptions: Gradient Boosted Trees
    1 min
    Training Step-by-Step (Regression): Gradient Boosted Trees
    1 min
    Training Step-by-Step (Classification): Gradient Boosted Trees
    2 min
    Prediction Step-by-Step: Gradient Boosted Trees
    1 min
    What does the “Gradient” mean?
    1 min
    How Gradient Boosted Trees give us Feature Importance
    1 min
    ML Process: Gradient Boosted Trees
    3 min
    When to use Gradient Boosted Trees
    2 min
  • 9. XGBoost
    8 Lessons 15 Min

    XGBoost is a popular open-source gradient-boosting algorithm for supervised learning tasks, including regression and classification problems. It combines many decision trees, reducing the remaining error at each iteration to produce better predictions (an XGBoost sketch follows the curriculum).

    Intuition: XGBoost
    3 min
    Real World Business Problems: XGBoost
    1 min
    Plain Definition: XGBoost
    1 min
    XGBoost Algorithm Improvements
    4 min
    System Improvements
    2 min
    ML Process: XGBoost
    2 min
    When to use XGBoost
    1 min
    Pros and Cons: XGBoost
    1 min
  • 10. K Nearest Neighbors
    9 Lessons 15 Min

    K-nearest neighbors is a popular machine learning algorithm for classification and prediction. It works by finding the k closest data points to a new, unknown data point and assigning it the most common class among those neighbors (a from-scratch KNN sketch follows the curriculum).

    Intuition: KNN
    1 min
    Example: KNN
    3 min
    Plain Definition: KNN
    1 min
    Assumptions: KNN
    1 min
    Training Step-by-Step: KNN
    1 min
    Prediction Step-by-Step: KNN
    1 min
    Tuning Parameters: KNN
    4 min
    ML Process: KNN
    1 min
    When to use KNN
    2 min
  • 11. K-Means Clustering
    12 Lessons 16 Min

    K-means clustering is the first unsupervised learning method we introduce. As a reminder, unsupervised learning means our data has no target labels to predict. Instead, we consider the data as a whole and discover algorithmically what groups can be formed from it (a K-means sketch follows the curriculum).

    Intuition: K-Means Clustering
    1 min
    Example: K-Means Clustering
    1 min
    Plain Definition: K-Means Clustering
    1 min
    Real World Business Problems: K-Means Clustering
    1 min
    Step-by-Step Training: K-Means Clustering
    1 min
    Selecting K
    2 min
    Silhouette Method
    1 min
    Hard Clustering vs. Soft Clustering
    1 min
    Derivatives of K-Means
    1 min
    Assumptions: K-Means Clustering
    1 min
    ML Process: K-Means Clustering
    3 min
    When do we use K-Means Clustering
    2 min
  • 12. Hierarchical Clustering
    9 Lessons 10 Min

    Hierarchical clustering is similar to how you organize files into folders on your computer: every time you nest folders within folders, you are building a hierarchy of groups (a hierarchical clustering sketch follows the curriculum).

    Intuition: Hierarchical Clustering
    1 min
    Real World Business Problems: Hierarchical Clustering
    1 min
    Definition: Hierarchical Clustering
    1 min
    Step-by-Step Agglomerative Clustering
    1 min
    Linkages
    1 min
    Distance
    1 min
    ML Process: Hierarchical Clustering
    2 min
    Pros and Cons: Hierarchical Clustering
    1 min
    When to Use: Hierarchical Clustering
    1 min
  • 13. Support Vector Machines
    14 Lessons 18 Min

    Support vector machines (SVMs) work by identifying a hyperplane that best separates the data into classes or, for regression, predicts the target variable. The goal is the hyperplane that maximizes the margin between the classes while minimizing the misclassification error (an SVM sketch follows the curriculum).

    Intuition: SVM
    1 min
    Real World Business Problems: SVM
    1 min
    Step-by-Step Training (Non-Technical): SVM
    3 min
    Loss Function
    1 min
    Nonlinear Data
    1 min
    Prediction (Step-by-Step): SVM
    1 min
    Terminology: SVM
    1 min
    Assumptions: SVM
    1 min
    Soft vs Hard Margins: SVM
    1 min
    How to use SVMs as a multi-class classifier
    2 min
    How does SVM Regression Work
    1 min
    ML Process: SVM
    2 min
    Pros & Cons (Classifier): SVM
    1 min
    When to use an SVM Classifier
    1 min
  • 14. Artificial Neural Nets
    12 Lessons 37 Min

    Artificial neural nets (ANNs) are machine learning algorithms inspired by the structure and function of biological neurons. Layers of interconnected nodes process and transform input data to produce predictions, and the network learns by adjusting the weights and biases of the connections between the nodes based on the error of its predictions (a small neural net sketch follows the curriculum).

    Intuition: Artificial Neural Nets
    2 min
    Real world Business Problems: Artificial Neural Nets
    1 min
    Example: Artificial Neural Nets
    5 min
    Multi-Layered Networks: Artificial Neural Nets
    2 min
    Classification - Activation Layers
    2 min
    Vanishing Gradient Problem
    4 min
    Activation Layers
    3 min
    Embeddings
    5 min
    Types of ANNs
    7 min
    Transfer Learning
    1 min
    ML Process: Artificial Neural Nets
    2 min
    Pros and Cons: Artificial Neural Nets
    3 min
  • 15. Collaborative Filtering - Non-Negative Matrix Factorization
    15 Lessons 18 Min

    Collaborative filtering uses the ratings of a group of users (the collaborative part) to infer the preferences of another user (the filtering part). A matrix factorization sketch follows the curriculum.

    Intuition: Collaborative Filtering
    1 min
    Plain Definition: Collaborative Filtering
    1 min
    Real world Business Problems: Collaborative Filtering
    1 min
    Assumptions: Collaborative Filtering
    1 min
    Different Approaches to Collaborative Filtering
    1 min
    Matrix Factorization Intuition
    1 min
    Matrix Factorization Definition
    1 min
    Assumptions: NMF
    1 min
    Step-by-Step (prediction): Collaborative Filtering
    1 min
    Step-by-Step (training): Collaborative Filtering
    2 min
    Determining the ideal number of latent variables
    1 min
    Addressing the Cold-Start Problem
    2 min
    ML Process: Collaborative Filtering
    2 min
    Pros and Cons: Collaborative Filtering
    1 min
    When to use NMF
    1 min
  • 16. Naïve Bayes
    10 Lessons 19 Min

    Discover the intricacies of the Naïve Bayes algorithm: Bayes’ Theorem, an intuitive definition, why the algorithm is called “naïve”, the ML process, its pros and cons, a real-life business example, and effective use cases (a Naïve Bayes sketch follows the curriculum).

    Bayes Theorem
    4 min
    Intuition and Plain Definition
    1 min
    Step-by-Step Explanation - First Part
    7 min
    Step-by-step Explanation - Second Part
    1 min
    Why is Naïve Bayes called Naïve?
    1 min
    The types of Naïve Bayes
    1 min
    ML Process: Naïve Bayes
    1 min
    Pros and Cons: Naïve Bayes
    1 min
    Real-Life Business Example
    1 min
    When to use Naïve Bayes
    1 min
  • 17. Practical projects
    2 Lessons 76 Min

    In this section you can find practical projects that will help you enhance your skills.

    Regression project
    39 min
    Classification project
    37 min
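
A minimal linear regression sketch (not part of the course materials), assuming scikit-learn and a small synthetic dataset, showing the two things highlighted in the linear regression section: predictions and the estimated relationships between variables.

    # Minimal linear regression sketch with scikit-learn on synthetic data.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))                      # two features
    y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=200)

    model = LinearRegression().fit(X, y)
    print(model.coef_, model.intercept_)               # estimated relationships between variables
    print(model.predict(X[:5]))                        # predictions for the first rows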
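
A short sketch of the three regularized linear models, again assuming scikit-learn and synthetic data; the alpha and l1_ratio values are arbitrary illustrative settings.

    # Regularized linear models: ridge (L2), lasso (L1), and elastic net (L1 + L2).
    import numpy as np
    from sklearn.linear_model import Ridge, Lasso, ElasticNet

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 10))
    y = X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.3, size=300)  # only 2 informative features

    for name, model in [("ridge", Ridge(alpha=1.0)),
                        ("lasso", Lasso(alpha=0.1)),
                        ("elastic net", ElasticNet(alpha=0.1, l1_ratio=0.5))]:
        model.fit(X, y)
        # Lasso and elastic net tend to shrink uninformative coefficients toward exactly zero.
        print(name, np.round(model.coef_, 2))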
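
A from-scratch gradient descent sketch with NumPy, assuming a mean-squared-error objective for a one-feature linear model; the step size and iteration count are illustrative choices, not course settings.

    # Plain gradient descent: fit y ≈ w*x + b by minimizing mean squared error.
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.uniform(-1, 1, size=100)
    y = 4.0 * x + 2.0 + rng.normal(scale=0.1, size=100)

    w, b = 0.0, 0.0
    alpha = 0.1                                  # step size (learning rate)
    for _ in range(500):
        error = (w * x + b) - y                  # current prediction error
        grad_w = 2 * np.mean(error * x)          # dMSE/dw
        grad_b = 2 * np.mean(error)              # dMSE/db
        w -= alpha * grad_w                      # step against the gradient
        b -= alpha * grad_b
    print(round(w, 2), round(b, 2))              # ≈ 4.0 and 2.0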
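
A small decision tree sketch using scikit-learn's built-in iris dataset, printing the learned rules to show the interpretability mentioned in the decision trees section; the depth limit is an arbitrary choice.

    # A small decision tree classifier, printed as rules to show its interpretability.
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_text

    X, y = load_iris(return_X_y=True)
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(export_text(tree, feature_names=load_iris().feature_names))  # human-readable rules
    print(tree.predict(X[:3]))                                         # predictions for a few rows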
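
A random forest sketch on a toy table with one continuous and one categorical column (hypothetical, synthetic data), one-hot encoding the categorical feature before fitting.

    # Random forest on a mix of continuous and one-hot encoded categorical features.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(3)
    df = pd.DataFrame({
        "income": rng.normal(50_000, 10_000, size=500),
        "channel": rng.choice(["web", "store", "app"], size=500),
    })
    y = (df["income"] > 55_000) & (df["channel"] == "app")   # toy target

    X = pd.get_dummies(df, columns=["channel"])              # encode the categorical column
    forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    print(dict(zip(X.columns, forest.feature_importances_.round(2))))  # feature importance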
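
A hand-rolled boosting loop that mirrors the "learn from mistakes" idea from the gradient boosted trees section: each shallow tree is fit to the residuals left by the previous trees. This is a simplified sketch with scikit-learn trees and synthetic data, not the course's own implementation.

    # Sequential boosting: every new small tree is fit to the previous trees' residuals.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(4)
    X = rng.uniform(-3, 3, size=(300, 1))
    y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=300)

    prediction = np.zeros_like(y)
    learning_rate = 0.1
    for _ in range(100):
        residuals = y - prediction                       # the current "mistakes"
        stump = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
        prediction += learning_rate * stump.predict(X)   # grow trees sequentially

    print(np.mean((y - prediction) ** 2))                # training MSE shrinks as trees are added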
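
An XGBoost sketch, assuming the xgboost package is installed; the hyperparameters shown are illustrative rather than tuned values.

    # XGBoost classifier sketch on synthetic data (requires the xgboost package).
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from xgboost import XGBClassifier

    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = XGBClassifier(n_estimators=300, learning_rate=0.1, max_depth=4)
    model.fit(X_train, y_train)
    print(model.score(X_test, y_test))     # accuracy on held-out data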
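
A from-scratch KNN sketch that follows the description literally: measure distances, take the k closest points, and vote. The toy data and the value of k are illustrative assumptions.

    # K-nearest neighbors: find the k closest points and take a majority vote.
    import numpy as np
    from collections import Counter

    def knn_predict(X_train, y_train, x_new, k=5):
        distances = np.linalg.norm(X_train - x_new, axis=1)   # distance to every training point
        nearest = np.argsort(distances)[:k]                   # indices of the k closest points
        return Counter(y_train[nearest]).most_common(1)[0][0] # most common class among neighbors

    rng = np.random.default_rng(5)
    X_train = rng.normal(size=(100, 2)) + np.repeat([[0, 0], [3, 3]], 50, axis=0)
    y_train = np.array([0] * 50 + [1] * 50)
    print(knn_predict(X_train, y_train, np.array([2.5, 2.5])))   # expected: class 1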
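
A K-means sketch on synthetic blob data with scikit-learn; choosing k = 3 here is an assumption that matches how the toy data was generated.

    # K-means clustering: group unlabeled points into k clusters.
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=300, centers=3, random_state=0)   # the labels are ignored
    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
    print(kmeans.cluster_centers_)          # one centroid per discovered group
    print(kmeans.labels_[:10])              # cluster assignment for the first points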
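
An agglomerative (bottom-up) hierarchical clustering sketch with scikit-learn; the linkage choice and number of clusters are illustrative settings.

    # Agglomerative hierarchical clustering on synthetic data.
    from sklearn.cluster import AgglomerativeClustering
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=150, centers=3, random_state=0)
    clusterer = AgglomerativeClustering(n_clusters=3, linkage="ward")
    labels = clusterer.fit_predict(X)       # each point's "folder" at the chosen cut
    print(labels[:20])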
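
An SVM classifier sketch with scikit-learn; scaling the features first and the C value shown are reasonable but illustrative choices.

    # Support vector machine classifier: a maximum-margin separating boundary.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # C controls the soft margin: larger C tolerates fewer misclassifications.
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    model.fit(X_train, y_train)
    print(model.score(X_test, y_test))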
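
A small feed-forward neural net sketch using scikit-learn's MLPClassifier as a stand-in for the frameworks discussed in the artificial neural nets section; the layer sizes are arbitrary.

    # A small multi-layer perceptron: weights and biases are adjusted by backpropagation.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    net = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0))
    net.fit(X_train, y_train)
    print(net.score(X_test, y_test))        # accuracy on held-out data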
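
A non-negative matrix factorization sketch on a tiny hypothetical user-item ratings matrix; treating 0 as "unrated" and using two latent factors are simplifying assumptions for illustration.

    # Non-negative matrix factorization of a small ratings matrix.
    import numpy as np
    from sklearn.decomposition import NMF

    # Rows are users, columns are items; 0 marks an unobserved rating (a simplification).
    ratings = np.array([[5, 4, 0, 1],
                        [4, 5, 1, 0],
                        [1, 0, 5, 4],
                        [0, 1, 4, 5]], dtype=float)

    model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
    user_factors = model.fit_transform(ratings)          # users in latent-taste space
    item_factors = model.components_                     # items in the same space
    reconstructed = user_factors @ item_factors          # filled-in score estimates
    print(np.round(reconstructed, 1))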
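
A Gaussian Naïve Bayes sketch on scikit-learn's iris dataset, showing the per-class posterior probabilities that Bayes' theorem produces under the "naïve" independence assumption.

    # Gaussian Naïve Bayes: class posteriors via Bayes' theorem with independent features.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    nb = GaussianNB().fit(X_train, y_train)
    print(nb.score(X_test, y_test))                # accuracy
    print(nb.predict_proba(X_test[:3]).round(2))   # per-class posterior probabilities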

Topics

Random Forest, XGBoost, K Nearest Neighbors, Hierarchical Clustering, Gradient Boosted Trees, Gradient Descent, K-Means Clustering, Support Vector Machines, Machine Learning, Naïve Bayes, Collaborative Filtering, Artificial Intelligence, Decision Trees

Tools & Technologies

Python

Course Requirements

  • You need to complete an introduction to Python before taking this course
  • Basic skills in statistics, probability, and linear algebra are required
  • It is highly recommended to take the Machine Learning in Python course first
  • You will need to install the Anaconda package, which includes Jupyter Notebook

Who Should Take This Course?

Level of difficulty: Advanced

  • Aspiring data scientists and ML engineers
  • Existing data scientists and ML engineers who want to boost their skills and learn from world-class experts

Exams and Certification

A 365 Data Science Course Certificate is an excellent addition to your LinkedIn profile—demonstrating your expertise and willingness to go the extra mile to accomplish your goals.

Meet Your Instructor

Jeff Li


Senior Data Scientist

3 Courses

1742 Reviews

24882 Students

Jeff is a senior data scientist at a large music streaming platform, where he focuses on forecasting problems around ads. He got into data science while trying to earn a living playing poker, and he previously spent two years at DoorDash on their core ML team, working on a wide variety of problems such as improving experimentation power, personalization, and supply/demand forecasting. In his courses with 365, Jeff shares valuable practical insights he learned on the job. Before starting his data science career, Jeff worked in technology consulting. He graduated from the University of Southern California.

What Our Learners Say

365 Data Science Is Featured at

Our top-rated courses are trusted by businesses worldwide.