
2020 recap, Gradient Boosting, Generalized Linear Models, AdaOpt with nnetsauce and mlsauce

[This article was first published on T. Moudiki's Webpage - R, and kindly contributed to R-bloggers.]

A few highlights from 2020 in this blog include:

What are AdaOpt, LSBoost and nnetsauce’s GLMs?

In general, and not only for GLMs, the best entry point to nnetsauce-related material is https://thierrymoudiki.github.io/blog/#QuasiRandomizedNN. Under #QuasiRandomizedNN, you'll find the nnetsauce posts you might have missed. For example, this one, in which nnetsauce's MultitaskClassifier perfectly classifies penguins (in R).

I can see that nnetsauce and mlsauce are downloaded thousands of times each month, but that's not what matters most to me. If you're using mlsauce, nnetsauce, or any other tool presented in this blog, do not hesitate to contribute, or to star the repository. That way, we can create and sustain a lively community around these tools, and that, ultimately, is the most important thing to me.

Best wishes for 2021!
