Articles by EnhanceDataScience

Machine Learning Explained: Dimensionality Reduction

July 31, 2017 | EnhanceDataScience

Dealing with a lot of dimensions can be painful for machine learning algorithms. High dimensionality increases the computational complexity, increases the risk of overfitting (as your algorithm has more degrees of freedom) and increases the sparsity of the data. Hence, dimensionality reduction will project the data in a ...
[Read more...]
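
The teaser above stops just before the projection itself. As a hedged illustration of the idea (not code from the article), here is a reduction of the four numeric iris measurements down to two principal components using base R's prcomp:

# Project the 4 numeric iris columns onto 2 principal components (PCA).
pca <- prcomp(iris[, 1:4], center = TRUE, scale. = TRUE)
iris_2d <- pca$x[, 1:2]   # the data expressed in the first two components
summary(pca)              # proportion of variance retained by each component
head(iris_2d)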

Machine Learning Explained: Regularization

July 4, 2017 | EnhanceDataScience

Welcome to this new post of Machine Learning Explained. After dealing with overfitting, today we will study a way to correct overfitting: regularization. Regularization adds a penalty on the different parameters of the model to reduce the freedom of the model. Hence, the model will be less likely to ...
[Read more...]
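
As a hedged sketch of the penalty idea (the article's own code may well differ), ridge (L2) and lasso (L1) regularization can be fitted with the glmnet package, assuming it is installed:

library(glmnet)                                   # assumed available; install.packages("glmnet") otherwise
x <- model.matrix(mpg ~ ., data = mtcars)[, -1]   # predictor matrix (drop the intercept column)
y <- mtcars$mpg
ridge <- glmnet(x, y, alpha = 0)                  # alpha = 0: ridge, L2 penalty on the coefficients
lasso <- glmnet(x, y, alpha = 1)                  # alpha = 1: lasso, L1 penalty on the coefficients
cv <- cv.glmnet(x, y, alpha = 1)                  # cross-validate the penalty strength lambda
coef(cv, s = "lambda.min")                        # coefficients at the selected lambda

Larger values of lambda shrink the coefficients more, which is the "reduced freedom" the post refers to.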

Machine Learning Explained: Overfitting

June 29, 2017 | EnhanceDataScience

Welcome to this new post of Machine Learning Explained. After dealing with bagging, today we will deal with overfitting. Overfitting is the devil of Machine Learning and Data Science and has to be avoided in all of your models. What is overfitting? A good model is able to learn the ...
[Read more...]
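
For a quick illustrative example (not taken from the article; the data-generating function and polynomial degrees are arbitrary choices), the snippet below shows the classic symptom of overfitting: the most flexible model has the lowest training error but does worse on held-out data.

set.seed(1)
make_data <- function(n) {
  x <- runif(n)
  data.frame(x = x, y = sin(2 * pi * x) + rnorm(n, sd = 0.3))
}
train <- make_data(30)
test  <- make_data(200)
rmse <- function(fit, d) sqrt(mean((predict(fit, d) - d$y)^2))
for (degree in c(1, 3, 12)) {
  fit <- lm(y ~ poly(x, degree), data = train)   # polynomial regression of increasing flexibility
  cat(sprintf("degree %2d: train RMSE %.3f, test RMSE %.3f\n",
              degree, rmse(fit, train), rmse(fit, test)))
}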

Machine Learning Explained: Bagging

June 28, 2017 | EnhanceDataScience

Bagging is a powerful method to improve the performance of simple models and reduce overfitting of more complex models. The principle is very easy to understand: instead of fitting the model on one sample of the population, several models are fitted on different samples (drawn with replacement) of the population. Then, ...
[Read more...]
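
As a minimal sketch of that principle (not the article's own implementation; the choice of rpart trees and the mtcars data are assumptions made here for illustration), each tree is fitted to a bootstrap sample and the predictions are averaged:

library(rpart)                                            # recommended package, ships with R
set.seed(42)
n_models <- 100
preds <- replicate(n_models, {
  boot <- mtcars[sample(nrow(mtcars), replace = TRUE), ]  # resample rows with replacement
  fit  <- rpart(mpg ~ ., data = boot)                     # one tree per bootstrap sample
  predict(fit, mtcars)
})
bagged <- rowMeans(preds)                                 # aggregate by averaging the predictions
head(bagged)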
