K-Fold Cross validation: Random Forest vs GBM

September 25, 2013

(This article was first published on R Video Blog! , and kindly contributed to R-bloggers)

K-Fold Cross validation: Random Forest vs GBM from Wallace Campbell on Vimeo.

In this video, I demonstrate how to use k-fold cross-validation to obtain a reliable estimate of a model's out-of-sample predictive accuracy, and how to use it to compare two different types of models (a Random Forest and a GBM). I use data from Kaggle's Amazon competition as an example.
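The video walks through this in R (with the randomForest and gbm packages) on the Amazon competition data. As a rough, language-agnostic sketch of the same workflow, here is what the comparison looks like in Python with scikit-learn; the synthetic dataset and all hyperparameters are stand-ins, not the settings used in the video.

```python
# Hedged sketch: k-fold cross-validation to compare a Random Forest against a
# gradient-boosted model. Synthetic data stands in for Kaggle's Amazon data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import KFold, cross_val_score

# Synthetic binary-classification data (placeholder for the real competition data).
X, y = make_classification(n_samples=500, n_features=10, random_state=42)

# The same 5-fold split is reused for both models, so the comparison is fair:
# each model is trained and scored on identical train/test partitions.
cv = KFold(n_splits=5, shuffle=True, random_state=42)

rf = RandomForestClassifier(n_estimators=100, random_state=42)
gbm = GradientBoostingClassifier(n_estimators=100, random_state=42)

# cross_val_score returns one held-out accuracy per fold (5 values each).
rf_scores = cross_val_score(rf, X, y, cv=cv, scoring="accuracy")
gbm_scores = cross_val_score(gbm, X, y, cv=cv, scoring="accuracy")

# Averaging across folds gives the out-of-sample estimate; the spread across
# folds indicates how stable that estimate is.
print(f"Random Forest: {rf_scores.mean():.3f} (+/- {rf_scores.std():.3f})")
print(f"GBM:           {gbm_scores.mean():.3f} (+/- {gbm_scores.std():.3f})")
```

The key point, in either language, is that both models must be scored on the same folds; comparing accuracies from different random splits confounds model differences with split differences.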

