Showing posts with label LINEAR REGRESSION.

Wednesday, 29 June 2022

Primary Supervised Learning Algorithms Used in Machine Learning by Kevin Vu via @kdnuggets

In this tutorial, the author lists some of the most common algorithms used in supervised learning, along with a practical tutorial on each.

This is really useful and worth a bookmark or printout.

Friday, 17 June 2022

Explaining negative R-squared by Tan Nian Wei via @TDataScience

Why and when does R-squared, the coefficient of determination, go below zero?

Interesting, and good to confirm that it works the way I thought it did.
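To see why R² can go negative, here is a minimal sketch in plain Python (my own illustration, not from the article): R² = 1 − SS_res/SS_tot, so any model whose squared error is larger than that of a baseline that always predicts the mean scores below zero.

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_y = sum(y_true) / len(y_true)
    ss_tot = sum((y - mean_y) ** 2 for y in y_true)
    ss_res = sum((y - p) ** 2 for y, p in zip(y_true, y_pred))
    return 1 - ss_res / ss_tot

y_true = [1.0, 2.0, 3.0, 4.0, 5.0]
baseline = [3.0] * 5    # always predict the mean -> R^2 = 0
bad_model = [10.0] * 5  # worse than the mean -> R^2 < 0

print(r_squared(y_true, baseline))   # 0.0
print(r_squared(y_true, bad_model))  # -24.5
```

So a negative R² simply means the model is doing worse than ignoring the inputs and predicting the average.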

Monday, 13 September 2021

Intro to Object-Oriented Programming For Data Scientists by @BexTuychiev via @towardsdev

Implement a simple Linear Regression with OOP basics on your own.

I love this and the clear examples of code. Object-oriented programming might appear to be extra work but if you have a long or complicated piece of code it is very useful to standardise exactly what you are doing and how. 
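As a rough sketch of the idea (my own, not the author's code): a one-feature linear regression wrapped in a class with `fit` and `predict` methods might look like this.

```python
class LinearRegression:
    """Simple one-feature linear regression fit by ordinary least squares."""

    def __init__(self):
        self.slope = None
        self.intercept = None

    def fit(self, x, y):
        n = len(x)
        mean_x = sum(x) / n
        mean_y = sum(y) / n
        # Closed-form OLS: slope = cov(x, y) / var(x)
        cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
        var = sum((xi - mean_x) ** 2 for xi in x)
        self.slope = cov / var
        self.intercept = mean_y - self.slope * mean_x
        return self

    def predict(self, x):
        return [self.intercept + self.slope * xi for xi in x]

# Data follows y = 2x + 1 exactly
model = LinearRegression().fit([1, 2, 3, 4], [3, 5, 7, 9])
print(model.predict([5]))  # [11.0]
```

Bundling the fitted state (slope, intercept) with the methods that use it is exactly the kind of standardisation the article argues for.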

Wednesday, 18 August 2021

3 Reasons Why You Should Use Linear Regression Models Instead of Neural Networks by Terence Shin via @kdnuggets

While there may always seem to be something new, cool, and shiny in the field of AI/ML, classic statistical methods that leverage machine learning techniques remain powerful and practical for solving many real-world business problems.

Some really good points in this article that make sense if you think about it a bit more. I particularly like point #2 as anything that makes it easier to communicate with others definitely gets my vote.

Wednesday, 21 March 2018

A Tour of The Top 10 Algorithms for Machine Learning Newbies by James Le via @kdnuggets

For machine learning newbies who are eager to understand the basics of machine learning, here is a quick tour of the top 10 machine learning algorithms used by data scientists.

This is great and definitely worth a bookmark. Please note it spans two pages.


Tuesday, 14 February 2017

Predictive Analytics 101 by @data36_com

If you have basic R or Python skills, you can build a simple predictive model. These two posts show you how:

Part one

Part two

I recommend you sign up for his newsletter here.

Friday, 10 February 2017

WEBINAR: Improve Your Regression with CART and Gradient Boosting - 16 February 2017

Salford Systems

Improve Your Regression with CART and Gradient Boosting

Join us for our upcoming webinar:

Date: Thursday, February 16, 2017
Time: 1 pm EST, 10 am PST
Can't make it at this time? Register to receive a recorded copy of the webcast and presentation slides, which we will email out a few days after the live event.
Duration: 55 minutes
Speaker: Charles Harrison, Marketing Statistician, Salford Systems
Cost: Free 

Abstract: In this webinar we'll introduce you to a powerful tree-based machine learning algorithm called gradient boosting. Gradient boosting often outperforms linear regression, Random Forests, and CART. Boosted trees automatically handle variable selection, variable interactions, nonlinear relationships, outliers, and missing values.

We'll see that CART decision trees are the foundation of gradient boosting and discuss some of the advantages of boosting versus a Random Forest. We will explore the gradient boosting algorithm and discuss the most important modeling parameters like the learning rate, number of terminal nodes, number of trees, loss functions, and more. We will demonstrate using an implementation of gradient boosting (TreeNet® Software) to fit the model and compare the performance to a linear regression model, a CART tree, and a Random Forest.
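The webinar demonstrates TreeNet®; purely as an illustration of the idea the abstract describes (not any particular library's implementation), here is a stripped-down boosting loop for squared loss using hand-rolled decision stumps: each new stump fits the residuals of the current model, scaled by a learning rate.

```python
def fit_stump(x, residuals):
    """Best single-split regression stump on 1-D data (squared loss)."""
    best = None
    for threshold in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= threshold]
        right = [r for xi, r in zip(x, residuals) if xi > threshold]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, threshold, lmean, rmean)
    return best[1:]  # (threshold, left_value, right_value)

def boost(x, y, n_trees=50, learning_rate=0.1):
    """Gradient boosting for squared loss: each stump fits the residuals."""
    pred = [sum(y) / len(y)] * len(y)  # start from the mean
    for _ in range(n_trees):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        t, lv, rv = fit_stump(x, residuals)
        pred = [pi + learning_rate * (lv if xi <= t else rv)
                for xi, pi in zip(x, pred)]
    return pred

x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.2, 0.9, 5.0, 5.2, 4.9]  # a step function, easy for trees
print(boost(x, y))
```

The learning rate and number of trees here are the same knobs the webinar lists as key modeling parameters: a smaller rate with more trees generally gives a smoother, better-generalising fit.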

Register here

Friday, 6 November 2015

Using Linear Regression to Predict Energy Output of a Power Plant via @datascienceplus

Teja Kodali describes how to use linear regression to predict the energy output of a power plant on Data Science +.

Contains R code and a worked example.
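The original post works in R; as a rough Python sketch of the same idea (ordinary least squares via the normal equations, with made-up numbers standing in for the real power-plant measurements), multi-feature fitting might look like this.

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations A w = b,
    where A = X^T X and b = X^T y (X gains a bias column)."""
    rows = [[1.0] + list(r) for r in X]  # prepend intercept term
    n = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(n)]
    # Solve A w = b by Gaussian elimination with partial pivoting
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n
    for i in reversed(range(n)):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, n))) / A[i][i]
    return w  # [intercept, coef_1, coef_2, ...]

# Illustrative made-up sample: (ambient temperature, exhaust vacuum) -> output
X = [(14.96, 41.76), (25.18, 62.96), (5.11, 39.40), (20.86, 57.32)]
y = [463.26, 444.37, 488.56, 446.48]
print(fit_linear(X, y))
```

In practice you would let a library (R's `lm`, or a numerical linear algebra routine) do this, but the closed-form normal equations are what sits underneath.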