K-Fold Cross validation: Random Forest vs GBM

This article was first published on R Video Blog!, and kindly contributed to R-bloggers.

K-Fold Cross validation: Random Forest vs GBM from Wallace Campbell on Vimeo.

In this video, I demonstrate how to use k-fold cross validation to obtain a reliable estimate of a model's out-of-sample predictive accuracy, and how to use it to compare two different types of models (a Random Forest and a GBM). I use data from Kaggle's Amazon competition as an example.
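The workflow from the video can be sketched in a few lines of R. This is a minimal illustration, not the exact code from the video: it assumes a data frame `dat` with a binary outcome column `y` standing in for the Kaggle Amazon data, and uses the `randomForest` and `gbm` packages.

```r
# K-fold cross validation comparing a Random Forest and a GBM (sketch).
# `dat` with binary outcome `y` is a placeholder for the Amazon competition data.
library(randomForest)
library(gbm)

set.seed(42)
k <- 5
# Randomly assign each row to one of k folds
folds <- sample(rep(1:k, length.out = nrow(dat)))

rf_acc  <- numeric(k)
gbm_acc <- numeric(k)

for (i in 1:k) {
  train <- dat[folds != i, ]
  test  <- dat[folds == i, ]

  # Random Forest: classification via a factor response
  rf <- randomForest(factor(y) ~ ., data = train, ntree = 500)
  rf_acc[i] <- mean(predict(rf, test) == test$y)

  # GBM: bernoulli distribution for a 0/1 outcome
  gb <- gbm(y ~ ., data = train, distribution = "bernoulli",
            n.trees = 500, interaction.depth = 3)
  p <- predict(gb, test, n.trees = 500, type = "response")
  gbm_acc[i] <- mean((p > 0.5) == test$y)
}

# Average held-out accuracy is the cross-validated estimate for each model
mean(rf_acc)
mean(gbm_acc)
```

Because every observation is held out exactly once, the averaged fold accuracies give a less optimistic estimate than training-set accuracy, and the two averages can be compared directly to choose between the models.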


