Excellent! We need more courses like this, and we would welcome courses that help us know which ML algorithms are best suited for a particular problem.
Decision trees and random forests are tools that every data scientist or machine learning practitioner should be familiar with. Famous for producing good predictors, these methods are also indispensable when it comes to understanding the problem at hand, as well as visualizing and communicating your results. That’s why we have prepared this course for you. The first part features a thorough explanation of the workings of decision trees, how to code and visualize them with sklearn, and the pros and cons you should consider. Then we will build on the concept of a single decision tree to produce the random forest algorithm. Finally, we will cap it all off with a practical example implementing both decision trees and random forests in Python to predict a person’s income based on census data.
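To give a taste of what this looks like in practice, here is a minimal sketch of fitting and inspecting a decision tree with sklearn. The dataset and parameter choices are illustrative, not taken from the course material:

```python
# A minimal sketch of fitting and inspecting a decision tree with sklearn.
# Dataset and hyperparameters are illustrative, not from the course.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)

# Limit depth so the printed tree stays small and readable
tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(X, y)

# export_text gives a plain-text view of the learned splits,
# one of the easiest ways to "see" what a tree has learned
print(export_text(tree, feature_names=load_iris().feature_names))
```

This readability is exactly why decision trees are so useful for communicating results: the printed rules can be handed to a non-technical stakeholder as-is.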
This practical course introduces you to the inner workings of decision trees and random forests. You will learn how a tree is constructed and how the concept of decision trees extends to random forests, as well as how these methods can be applied in several different practical examples.
In this introductory section, you will get to know your instructor, go over the contents of the course, and discover why mastering ML with Decision Trees and Random Forests is essential for advancing your predictive analytics skill set.
This is the main section of the course, where we will use visual examples to make sense of the concept of decision trees. We will cover the advantages and disadvantages of this method and find out what goes into building decision tree models. You will also learn about a popular technique known as tree pruning. To apply your newfound skills, you will dive into a practical example of how to create decision trees with sklearn.
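As a preview of the pruning idea, the sketch below uses sklearn's cost-complexity pruning, one common pruning technique (the dataset and the specific alpha choice are illustrative assumptions, not the course's example):

```python
# Sketch of cost-complexity pruning with sklearn (illustrative dataset).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A fully grown tree tends to overfit the training data
full = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# cost_complexity_pruning_path returns the candidate alpha values;
# a larger ccp_alpha prunes the tree more aggressively
path = full.cost_complexity_pruning_path(X_train, y_train)

# Pick a large alpha (second-to-last) to get a heavily pruned tree
pruned = DecisionTreeClassifier(ccp_alpha=path.ccp_alphas[-2], random_state=0)
pruned.fit(X_train, y_train)

print("full tree leaves:  ", full.get_n_leaves())
print("pruned tree leaves:", pruned.get_n_leaves())
```

In practice, the alpha would be chosen by cross-validation rather than picked from the end of the path; the point here is simply that pruning trades training fit for a smaller, more general tree.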
The final section of this course is dedicated to the random forest algorithm. We will learn about bootstrapping and bagged decision trees, both steps toward the creation of a random forest. Since it is important to understand when to reach for a decision tree versus a random forest, we cover this distinction as well. Finally, we conclude the section and the course with a comprehensive case study. The first half of our practical example shows you how to implement random forests in sklearn. After that, we will model a person's salary based on various census features, creating both a decision tree and a random forest model for the dataset and comparing the performance of each.
“The ability to interpret a model’s results is indispensable in Machine Learning. That’s where decision trees come into play. The decision tree model is relatively simple and easy to understand – both characteristics that make it a great foundation for ML enthusiasts to grasp and visualize basic concepts.”
Machine Learning with Decision Trees and Random Forests
with Nikola Pulev