R Tutorial Series: Regression With Categorical Variables

February 1, 2010
(This article was first published on R Tutorial Series, and kindly contributed to R-bloggers)

Categorical predictors can be incorporated into regression analysis, provided that they are properly prepared and interpreted. This tutorial will explore how categorical variables can be handled in R.

Tutorial Files

Before we begin, you may want to download the sample data (.csv) used in this tutorial. Be sure to right-click and save the file to your R working directory. Note that all code samples in this tutorial assume that the data has already been read into an R variable and attached. The dataset contains the following variables related to NFL quarterback and team salaries in 1991:
  • TEAM: Name of team
  • QB: Starting quarterback salary in thousands of dollars
  • TOTAL: Team salary in thousands of dollars
  • CONF: conference (NFC or AFC)
In this dataset, the CONF variable is categorical. It can take on one of two values, either NFC or AFC. Suppose for the purposes of this tutorial that our research question is "how well do quarterback salary and conference predict total team salary?" The model that we use to answer this question will need to incorporate the categorical predictor for conference.

Dummy Coding

Before a categorical variable can be used in regression, it must first be numerically coded. Here, I will use the as.numeric(VAR) function, where VAR is the categorical variable, to dummy code the CONF predictor. As a result, CONF will represent NFC as 1 and AFC as 0. The sample code below demonstrates this process.

> #represent a categorical variable numerically using as.numeric(VAR)
> #dummy code the CONF variable into NFC = 1 and AFC = 0
> dCONF <- as.numeric(CONF) - 1

Note that subtracting 1 from the as.numeric(CONF) result causes the values to be 1 and 0 rather than 2 and 1, which are the default factor codes.
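The dummy-coding step can be sketched in isolation. The snippet below uses a small hypothetical vector in place of the tutorial's CONF column; the ifelse() alternative is not part of the original tutorial but produces the same coding more explicitly.

```r
# hypothetical stand-in for the dataset's CONF column
CONF <- factor(c("NFC", "AFC", "NFC", "AFC"))

# as.numeric() on a factor returns the level codes (AFC = 1, NFC = 2,
# because factor levels default to alphabetical order);
# subtracting 1 yields the desired 0/1 coding
dCONF <- as.numeric(CONF) - 1
dCONF  # 1 0 1 0

# an equivalent, more explicit alternative
dCONF2 <- ifelse(CONF == "NFC", 1, 0)
```

The ifelse() form does not depend on the alphabetical ordering of the factor levels, which makes the intended coding harder to get wrong.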

Interpretation

Visual

One useful way to visualize the relationship between a categorical and a continuous variable is through a box plot. When its first argument is a factor, R's plot() function automatically creates such a graph (see Scatterplots). The CONF variable is graphically compared to TOTAL in the following sample code.
> #use the plot() function to create a box plot
> #what does the relationship between conference and team salary look like?
> plot(CONF, TOTAL, main="Team Salary by Conference", xlab="Conference", ylab="Salary ($1,000s)")
The resulting box plot is shown below.

From a box plot, we can derive many useful insights, such as the minimum, maximum, and median values. Our box plot of total team salary on conference suggests that, compared to AFC teams, NFC teams have slightly higher salaries on average and the range of these salaries is larger.
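The same per-conference comparison can be made numerically with tapply(), which applies a summary function to TOTAL within each level of CONF. The salary figures below are hypothetical stand-ins for the dataset, used only to make the sketch self-contained.

```r
# hypothetical salary figures in $1,000s, standing in for the dataset
TOTAL <- c(23500, 24100, 22800, 25300, 23900, 24600)
CONF  <- factor(c("AFC", "NFC", "AFC", "NFC", "AFC", "NFC"))

# min, quartiles, median, and mean for each conference
tapply(TOTAL, CONF, summary)

# or just the medians that the box plot's center lines display
tapply(TOTAL, CONF, median)
```

This is a quick way to confirm what the box plot suggests visually, such as which group has the higher median and the wider range.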

Routine Analysis

Once a categorical variable has been quantified, it can be used in routine analyses, such as descriptive statistics and correlations. The following code depicts a few examples.
> #what are the mean and standard deviation of conference?
> mean(dCONF)
[1] 0.5
> sd(dCONF)
[1] 0.5091751
> #this makes sense... there are an even number of teams in both conferences and they are coded as either 0 or 1!
> #what is the correlation between total team salary and conference?
> cor(dCONF, TOTAL)
[1] 0.007019319
The correlation between total team salary and conference indicates that there is little to no linear relationship between the variables.
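With a 0/1 dummy code, cor() yields the point-biserial correlation, which is driven by the difference between the two group means relative to the overall spread. The sketch below illustrates this with hypothetical values standing in for the dataset.

```r
# hypothetical stand-ins: dummy-coded conference and team salaries ($1,000s)
dCONF <- c(0, 0, 0, 1, 1, 1)
TOTAL <- c(23500, 22800, 23900, 24100, 25300, 24600)

# the point-biserial correlation between the dummy and the outcome
cor(dCONF, TOTAL)

# it is positive exactly because the dCONF = 1 group mean is higher
mean(TOTAL[dCONF == 1]) - mean(TOTAL[dCONF == 0])
```

In the actual dataset, the near-zero correlation reflects the fact that the two conferences' mean salaries are nearly identical.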

Linear Regression

Let's return to our original question of how well quarterback salary and conference predict team salary. With the categorical predictor quantified, we can create a regression model for this relationship, as demonstrated below.
> #create a linear model using lm(FORMULA, DATAVAR)
> #predict team salary using quarterback salary and conference
> linearModel <- lm(TOTAL ~ QB + dCONF, datavar)
> #generate model summary
> summary(linearModel)
The model summary is pictured below.

Considering both the counterintuitive and statistically insignificant results of this model, our analysis of the conference variable would likely end or change directions at this point. However, there is one more interpretation method that is worth mentioning for future reference.
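Incidentally, manual dummy coding is not strictly required for model fitting: when a factor appears in a formula, lm() dummy-codes it automatically using treatment contrasts, with the first level (here AFC) as the baseline. The sketch below uses a small hypothetical data frame in place of the tutorial's dataset.

```r
# hypothetical stand-in for the tutorial's data frame
datavar <- data.frame(
  QB    = c(1100, 1400, 900, 1250),
  CONF  = factor(c("AFC", "NFC", "AFC", "NFC")),
  TOTAL = c(23500, 24100, 22800, 25300)
)

# passing the factor directly; lm() creates the dummy internally
autoModel <- lm(TOTAL ~ QB + CONF, data = datavar)
coef(autoModel)  # the CONFNFC coefficient plays the role of dCONF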

Split Model

With a dummy coded predictor, a regression model can be split into two halves by substituting in the possible values of the categorical variable. For example, we can think of our model as a regression of total salary on quarterback salary for two states of the world: teams in the AFC and teams in the NFC. These derived models are worked through in the following sample code.
> #input the categorical values to split the linear model into two representations
> #the original model: TOTAL = 19099 + 2.5 * QB - 103 * dCONF
> #substitute 0 for dCONF to derive the AFC model: TOTAL = 19099 + 2.5 * QB
> #substitute 1 for dCONF to derive the NFC model: TOTAL = 18996 + 2.5 * QB
> #what is the predicted salary for a team with a quarterback salary of $2,000,000 in the AFC and NFC conferences?
> #AFC prediction
> 19099 + 2.5 * 2000
[1] 24099
> #NFC prediction
> 18996 + 2.5 * 2000
[1] 23996
Based only on what we have modeled, we can further infer that conference was not a significant predictor of total team salaries in the NFL in 1991. The difference between the team salaries based on conference is less than one-half of one percent on average! Of course, only using quarterback salary and conference to predict an NFL team's overall salary is neglecting quite a few potentially significant predictors. Nonetheless, split model interpretation is a useful way to break down the perspectives captured by a categorical regression model.
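The hand substitution above can also be wrapped in a small prediction function, using the coefficients reported in the model summary (intercept 19099, QB slope 2.5, dCONF coefficient -103). The function name is illustrative, not part of the tutorial.

```r
# prediction function built from the reported coefficients
predictTotal <- function(QB, dCONF) {
  19099 + 2.5 * QB - 103 * dCONF
}

# QB salary of $2,000,000 is 2000 in $1,000s
predictTotal(QB = 2000, dCONF = 0)  # AFC: 24099
predictTotal(QB = 2000, dCONF = 1)  # NFC: 23996
```

For a model fitted with lm(), the built-in predict(model, newdata = ...) accomplishes the same thing without retyping coefficients.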

More On Categorical Predictors

Certainly, much more can be done with categorical variables than the basic dummy coding that was demonstrated here. Individuals whose work requires a deeper inspection into the procedures of categorical regression are encouraged to seek additional resources (and to consider writing a guest tutorial for this series).

Complete Categorical Regression Example

To see a complete example of how a categorical regression model can be created in R, please download the categorical regression example (.txt) file.

References

The Associated Press. (1991). Q-back and team salaries [Data File]. Retrieved December 14, 2009 from http://lib.stat.cmu.edu/DASL/Datafiles/qbacksalarydat.html
