Articles by Philipp Probst

Is catboost the best gradient boosting R package?

July 7, 2020 | Philipp Probst

Several R packages implement gradient boosting, each with its own method. The three most famous ones are currently xgboost, catboost and lightgbm. I want to compare these three to find out which one performs best in its default mode, without tuning. These algorithms are not ... [Read more...]
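The default-mode comparison described above can be sketched for one of the three packages, xgboost (the setup for catboost and lightgbm is analogous). This is a minimal illustration, not the post's actual benchmark: the mtcars data and the nrounds value are assumptions made here for the example.

```r
# Sketch: fitting xgboost with its default hyperparameters (no tuning).
# Data and nrounds are illustrative assumptions, not the post's benchmark.
library(xgboost)

X <- as.matrix(mtcars[, -1])   # predictors
y <- mtcars$mpg                # regression target

# All tree parameters (eta, max_depth, ...) are left at their defaults;
# only the number of boosting rounds must be supplied.
fit <- xgboost(data = X, label = y, nrounds = 50, verbose = 0)

head(predict(fit, X))
```

Comparing packages "in default mode" then just means repeating this kind of fit with each library's out-of-the-box settings and measuring predictive performance on held-out data.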

New xgboost defaults

February 25, 2020 | Philipp Probst

xgboost is the most famous R package for gradient boosting and has been on the market for a long time. In one of my publications, I created a framework for providing defaults (and tunability measures), and one of the packages that I used there was xgboost. The results provided a default ... [Read more...]

mlr vs. caret

November 8, 2018 | Philipp Probst

Let’s compare the two popular R packages for machine learning, mlr and caret. caret has been on the market longer: its first CRAN release seems to date from 2007, while mlr came to CRAN in 2013. As of now, caret seems to be more popular; according to cranlogs, caret was downloaded 178029 times ... [Read more...]

Tuning random forest

November 22, 2017 | Philipp Probst

Random forest is one of the standard approaches for supervised learning nowadays. One of its advantages is that it does not require tuning of the hyperparameters to perform well. But is that really true? Maybe we are not only interested in a good model but in the best model we ... [Read more...]

Update on Random Forest Package Downloads

April 9, 2017 | Philipp Probst

I just updated the code from a previous post where I analysed the download statistics of different random forest packages in R; see the code at the bottom of the article. I calculated the number of CRAN downloads in March 2016 and March 2017. Standard random forest The number of downloads of ... [Read more...]
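Download counts like these can be pulled with the cranlogs package. A minimal sketch follows; the package list and date range are assumptions for illustration, not the exact code from the post:

```r
# Sketch: monthly CRAN download counts via cranlogs.
# Package names and date range are illustrative assumptions.
library(cranlogs)

dl <- cran_downloads(packages = c("randomForest", "ranger"),
                     from = "2017-03-01", to = "2017-03-31")

# cran_downloads returns one row per package per day;
# sum them to get the monthly total per package.
aggregate(count ~ package, data = dl, sum)
```

Running the same query for two different months and comparing the totals gives the kind of year-over-year update described above.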
