mlr 2.10 is now on CRAN. Please update your package if you haven’t done so in a while. Here is an overview of the changes:

A friend asked me whether I could create a loop to run multiple regression models. She wanted to evaluate the association between 100 dependent variables (outcomes) and 100 independent variables (exposures), which means 10,000 regression models. Regression models with multiple dependent (outcome) and independent (exposure) variables are common in genetics. …
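A loop over every outcome/exposure pair can be sketched in base R as follows. This is a hypothetical illustration, not the post's own code: the data are simulated, and three columns stand in for the 100 described above.

```r
# Sketch: fit one simple regression per outcome/exposure pair and
# collect the exposure coefficient and p-value from each model.
set.seed(1)
outcomes  <- as.data.frame(matrix(rnorm(50 * 3), ncol = 3))  # 100 columns in practice
exposures <- as.data.frame(matrix(rnorm(50 * 3), ncol = 3))

results <- do.call(rbind, lapply(seq_along(outcomes), function(i) {
  do.call(rbind, lapply(seq_along(exposures), function(j) {
    fit <- lm(outcomes[[i]] ~ exposures[[j]])
    data.frame(outcome  = names(outcomes)[i],
               exposure = names(exposures)[j],
               beta     = coef(fit)[2],
               p.value  = summary(fit)$coefficients[2, 4])
  }))
}))
nrow(results)  # one row per model: 3 * 3 = 9 here, 10,000 in the full setting
```

With 100 columns on each side, the same nested `lapply` yields the full 10,000-row results table.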

Over the year 2016, I manually gathered press clippings announcing Venture Capital (VC) deals from various online sources and newsletters, each time I bumped into something that caught my attention. In early January, I put the data together, cleaned it, and made it R-usable as a CSV dataset. …

As I wrote in the previous post, I will continue describing regression methods suitable for double seasonal (or multi-seasonal) time series. In the previous post about Multiple Linear Regression, I showed how to use the "simple" OLS regression method to model a double seasonal time series of electricity consumption and use it for accurate forecasting. …
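The double seasonal OLS idea can be sketched with seasonal dummy variables, one factor for the daily pattern and one for the weekly pattern. This is a minimal illustration on simulated hourly load data, not the post's own model; the variable names are made up.

```r
# Sketch: OLS with daily (24-level) and weekly (7-level) seasonal dummies,
# assuming hourly electricity load; the series is simulated for illustration.
set.seed(42)
n    <- 24 * 7 * 4                                           # four weeks of hourly data
hour <- factor(rep(1:24, times = n / 24))                    # daily seasonal dummies
day  <- factor(rep(rep(1:7, each = 24), times = n / (24 * 7)))  # weekly dummies
load <- 100 + as.numeric(hour) * 2 + as.numeric(day) * 5 + rnorm(n)

fit  <- lm(load ~ 0 + hour + day)                            # double seasonal OLS model
pred <- predict(fit, newdata = data.frame(hour = hour[1:24], day = day[1:24]))
length(pred)                                                 # one forecast per hour of a day
```

Interaction terms between the two seasonal factors would let the daily shape vary by day of week, at the cost of many more coefficients.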

The results of the study are interesting from an astrological point of view. Astrological signs are divided into groups by order. The first grouping is by alternating order, with the first sign (Aries) positive and the next sign negative, the third positive, and so on through the circle. The second grouping is in groups of …

In the first part I created the data for testing the Astronomical/Astrological Hypotheses. In this part, I started by fitting a simple linear regression model:

```r
mod.lm = lm(div.a.ts ~ vulcan.ts)
summary(mod.lm)

# Call:
# lm(formula = div.a.ts ~ vulcan.ts)
#
# Residuals:
#     Min      1Q  Median      3Q     Max
# -159.30  -53.88  -10.37   53.48  194.05
#
# Coefficients:
#              Estimate Std. Error t value Pr(>|t|)
# (Intercept) 621.23955 ...
```

The assumptions of simple linear regression include the assumption that the errors are independent with constant variance. Fitting a simple regression when the errors are auto-correlated requires techniques from the field of time series. If you are interested in fitting a model to an evenly spaced series where the terms are auto-correlated, I have given …
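One standard way to handle auto-correlated errors is regression with an ARMA error structure. A minimal sketch with base R's `arima()` and its `xreg` argument, on simulated data (not the post's own example):

```r
# Sketch: regression with AR(1) errors via arima()'s xreg argument.
set.seed(7)
n <- 200
x <- rnorm(n)
e <- arima.sim(model = list(ar = 0.6), n = n)   # auto-correlated errors
y <- 2 + 3 * x + e                              # true slope is 3

fit <- arima(y, order = c(1, 0, 0), xreg = cbind(x = x))  # AR(1) error structure
coef(fit)["x"]                                  # slope estimate, near the true value 3
```

Unlike plain `lm()`, this jointly estimates the regression slope and the AR(1) coefficient, so the standard errors account for the serial correlation.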

Similar to the two-group linear discriminant analysis for classification, LDA for classification into several groups seeks the group mean vector that a new observation is closest to, under a distance function, and assigns the observation accordingly. The several-group case also assumes equal covariance matrices among the groups. …
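Several-group LDA is available in R via `MASS::lda()`. A minimal sketch on the built-in three-class `iris` data, which is not the post's own example:

```r
# Sketch: three-group LDA; each observation is assigned to the group
# whose mean it is closest to under a shared covariance matrix.
library(MASS)

fit  <- lda(Species ~ ., data = iris)
pred <- predict(fit, iris)$class
mean(pred == iris$Species)   # in-sample classification accuracy
```

`predict()` also returns posterior probabilities and discriminant scores, useful when a hard class assignment is not enough.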

One of the very first learning algorithms that you'll encounter when studying data science and machine learning is least squares linear regression. Linear regression is one of the easiest learning algorithms to understand; it's suitable for a wide array of problems, and it is already implemented in many programming languages. Most users are familiar with the …
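In R, least squares linear regression is a one-liner with `lm()`. A minimal sketch on the built-in `cars` data set (stopping distance versus speed), for illustration only:

```r
# Sketch: ordinary least squares fit of stopping distance on speed.
fit <- lm(dist ~ speed, data = cars)

coef(fit)               # intercept and slope
summary(fit)$r.squared  # proportion of variance explained
```

The fitted coefficients minimize the sum of squared residuals, which is exactly the "least squares" criterion the name refers to.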
