Are you familiar with the basic principle behind K-Fold cross-validation? It significantly reduces selection bias and helps verify that a model is not over-fit. Here's a guide, with code in both R and Python, that ties in nicely with Kaggle and competition entries there. Very useful to read and follow.
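As a taste of what the guide covers, here is a minimal Python sketch of K-Fold cross-validation using scikit-learn; the dataset and model are illustrative assumptions, not taken from the linked guide.

```python
# Illustrative K-Fold cross-validation sketch (assumed example, not from the guide).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, cross_val_score

# A small built-in dataset, used here purely for demonstration.
X, y = load_iris(return_X_y=True)

# Shuffle before splitting so each of the 5 folds is representative.
kf = KFold(n_splits=5, shuffle=True, random_state=42)
model = LogisticRegression(max_iter=1000)

# Each element of `scores` is the accuracy on one held-out fold;
# averaging over folds gives a less biased estimate than a single split.
scores = cross_val_score(model, X, y, cv=kf)
print(scores.mean())
```

Because every observation serves as validation data exactly once, the averaged score is far less sensitive to how one particular train/test split happened to fall.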