I liked this one, which is nice and easy to understand if you aren't great at maths and formulas.
This is a blog containing data-related news and information that I find interesting or relevant. Links are given to the original sites containing the source information, for which I can take no responsibility. Any opinion expressed is my own.
Showing posts with label RANDOM FOREST. Show all posts
Monday, 15 August 2022
Decision Trees vs Random Forests, Explained by Natassha Selvaraj via @kdnuggets
A simple, non-math-heavy explanation of two popular tree-based machine learning models.
Wednesday, 2 March 2022
Decision Tree Algorithm, Explained by Nagesh Singh Chauhan via @kdnuggets
All you need to know about decision trees and how to build and optimize decision tree classifiers.
A very clear, easy-to-understand guide that you might want to share with anyone who needs the detail it contains.
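To give a feel for what "building" a decision tree classifier actually involves, here is a minimal pure-Python sketch of the core step: choosing the split with the lowest Gini impurity. The data and function names are illustrative, not taken from the article.

```python
# Toy illustration (not from the article): a decision tree is grown by
# repeatedly choosing the split that minimises node impurity.

def gini(labels):
    """Gini impurity of a list of class labels: 1 - sum(p_c^2)."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(xs, ys):
    """For a single numeric feature, find the threshold that minimises
    the weighted Gini impurity of the two child nodes."""
    best_t, best_score = None, float("inf")
    n = len(ys)
    for t in sorted(set(xs))[:-1]:          # the max value cannot split
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = len(left) / n * gini(left) + len(right) / n * gini(right)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# The labels separate perfectly at x <= 2, so that split has impurity 0.
print(best_split([1, 2, 3, 4], ["a", "a", "b", "b"]))  # (2, 0.0)
```

A full tree builder would apply this recursively to each child node until the nodes are pure or a depth limit is hit, which is where the "optimize" part (pruning, maximum depth) comes in.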
Saturday, 10 June 2017
How the random forest algorithm works in machine learning by @saimadhup via @dataaspirant
This is a great article by Saimadhu Polamuri that clearly explains how Random Forest works.
Definitely worth reading. Contains some great diagrams.
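The two ideas at the heart of the algorithm the article describes, bootstrap sampling and majority voting, can be sketched in plain Python. This is a toy illustration under simplifying assumptions: the "trees" here are one-split stumps, and the per-split feature subsampling that real Random Forests also do is omitted for brevity.

```python
# Toy Random Forest: each "tree" sees a bootstrap resample of the data,
# and predictions are made by majority vote across the ensemble.
import random
from collections import Counter

def train_stump(xs, ys):
    """A one-split 'tree': pick the threshold with the fewest
    misclassifications; predict the majority label on each side."""
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        lmaj = Counter(left).most_common(1)[0][0]
        rmaj = Counter(right).most_common(1)[0][0]
        errors = sum(y != lmaj for y in left) + sum(y != rmaj for y in right)
        if best is None or errors < best[0]:
            best = (errors, t, lmaj, rmaj)
    if best is None:  # degenerate bootstrap sample: predict its majority label
        maj = Counter(ys).most_common(1)[0][0]
        return lambda x: maj
    _, t, lmaj, rmaj = best
    return lambda x: lmaj if x <= t else rmaj

def random_forest(xs, ys, n_trees=25, seed=0):
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        # bootstrap: sample len(xs) points with replacement
        idx = [rng.randrange(len(xs)) for _ in range(len(xs))]
        trees.append(train_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    def predict(x):
        votes = Counter(tree(x) for tree in trees)  # majority vote
        return votes.most_common(1)[0][0]
    return predict

xs = [1, 2, 3, 10, 11, 12]
ys = ["low", "low", "low", "high", "high", "high"]
predict = random_forest(xs, ys)
print(predict(1), predict(12))
```

Because every tree sees a slightly different resample, their individual mistakes tend to cancel out in the vote, which is why the forest is usually more robust than any single tree.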
Friday, 10 February 2017
WEBINAR: Improve Your Regression with CART and Gradient Boosting - 16 February 2017
Improve Your Regression with CART and Gradient Boosting
Join us for our upcoming webinar:
Date: Thursday, February 16, 2017
Time: 1 pm EST, 10 am PST
Can't make it at this time? Register to receive a recorded copy of the webcast and presentation slides, which we will email out a few days after the live event.
Duration: 55 minutes
Speaker: Charles Harrison, Marketing Statistician, Salford Systems
Cost: Free
Abstract: In this webinar we'll introduce you to a powerful tree-based machine learning algorithm called gradient boosting. Gradient boosting often outperforms linear regression, Random Forests, and CART. Boosted trees automatically handle variable selection, variable interactions, nonlinear relationships, outliers, and missing values.
We'll see that CART decision trees are the foundation of gradient boosting and discuss some of the advantages of boosting versus a Random Forest. We will explore the gradient boosting algorithm and discuss the most important modeling parameters like the learning rate, number of terminal nodes, number of trees, loss functions, and more. We will demonstrate using an implementation of gradient boosting (TreeNet® Software) to fit the model and compare the performance to a linear regression model, a CART tree, and a Random Forest.
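For readers who want a feel for the algorithm before the webinar, here is a minimal pure-Python sketch of gradient boosting for regression with squared-error loss, using the learning rate and number-of-trees parameters the abstract mentions. It is a toy illustration with one-split regression stumps, not the TreeNet implementation.

```python
# Toy gradient boosting: each stage fits a one-split regression "stump"
# to the current residuals, and its predictions are added with a small
# learning rate (shrinkage).

def fit_stump(xs, residuals):
    """One-split regression tree: pick the threshold minimising the
    squared error of predicting each side's mean residual."""
    best = None
    for t in sorted(set(xs))[:-1]:          # the max value cannot split
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x: lmean if x <= t else rmean

def gradient_boost(xs, ys, n_trees=100, learning_rate=0.1):
    base = sum(ys) / len(ys)                # stage 0: predict the mean
    pred = [base] * len(xs)
    stumps = []
    for _ in range(n_trees):
        # residuals are the negative gradient of squared-error loss
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        pred = [p + learning_rate * stump(x) for p, x in zip(pred, xs)]
    return lambda x: base + learning_rate * sum(s(x) for s in stumps)

xs = [1, 2, 3, 4, 5, 6]
ys = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]
model = gradient_boost(xs, ys)
# the boosted model converges to the true level on each side of the step
print(round(model(2), 2), round(model(5), 2))  # 1.0 5.0
```

The learning rate controls how much each stage corrects the last: smaller values need more trees but usually generalise better, which is exactly the trade-off the webinar's modeling-parameters discussion covers.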
Register here