Estimating Standard Errors for a Logistic Regression Model optimised with Optimx in R


In my last post I estimated the point estimates for a logistic regression model using optimx() from the optimx package in R. In this post I would like to continue with this model and try to find the standard errors (SEs) for the derived estimates. Uncertainty is probably the most important quantity in statistics and therefore I think it is worthwhile to look a little bit more into this. However, before we can start with the estimation of the SEs, I would ask you to run the code for deriving the point estimates for the logistic regression using optimx(), which you can find here. This will be the starting point for our further calculations.

When I searched for an answer to the problem of estimating the SEs using the output of optimx() in R, I came across this quite old email from 2004 on the R-help mailing list and a discussion on stackexchange. Basically, they say that we can compute the covariance matrix as the inverse of the negative of the Hessian matrix. Given our estimated covariance matrix, we can then estimate the SEs as the square roots of the diagonal elements of the covariance matrix.
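To see the idea in isolation before applying it to our model, here is a small self-contained toy example (not part of the original analysis): we estimate the mean of a normal sample by minimising the negative log-likelihood with optim() and recover the SE from the returned Hessian.

# Toy example: SE of a normal mean via the Hessian of the negative log-likelihood
set.seed(1)
x <- rnorm(100, mean = 5, sd = 2)

# Negative log-likelihood for the mean (sd treated as known)
negll_toy <- function(mu) -sum(dnorm(x, mean = mu, sd = 2, log = TRUE))

fit <- optim(par = 0, fn = negll_toy, method = "BFGS", hessian = TRUE)

# Covariance matrix = inverse of the Hessian of the negative log-likelihood
sqrt(diag(solve(fit$hessian)))   # numerical SE estimate
2 / sqrt(100)                    # analytical SE: sd / sqrt(n) = 0.2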

So, let's try to implement this in R. First, we need to extract the Hessian matrix from our optimx() result object. Note that you need to set the option hessian = TRUE in your optimx() call. This asks optimx() to estimate the Hessian matrix for the different optimization algorithms and allows us to obtain this information after the optimization has finished. In the example below, I only obtain the Hessian matrix for the optimization algorithm Rcgmin, since it showed the best fit compared to the results from the glm() model.
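For completeness, the relevant part of the optimx() call from the previous post looks roughly like this (a sketch only; negll, the starting values, and the data are defined in that post):

library(optimx)

# Sketch: negll is the negative log-likelihood function from the previous post
opt <- optimx(par = c(0, 0, 0),
              fn = negll,
              control = list(all.methods = TRUE),
              hessian = TRUE)   # <- required to retrieve the Hessian afterwards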

# 7. Estimate the standard error ----------------------------------------------
# Extract the Hessian matrix for the Rcgmin optimisation
hessian_m <- attributes(opt)$details["Rcgmin", "nhatend"][[1]]

After we have extracted the Hessian matrix, we can follow the procedure described above. Note that I used the Hessian matrix itself instead of the negative Hessian matrix in my example. When I used the negative Hessian matrix, I got negative values for the diagonal elements of the inverse, so I was not able to take the square roots of these values. Nevertheless, I obtained the correct SEs with this approach. This makes sense if we recall that we minimised the negative log-likelihood: its Hessian is already the negative of the Hessian of the log-likelihood, i.e. the observed Fisher information.
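A quick optional sanity check: if we inverted the right matrix, all diagonal elements of the inverse must be positive, since they are variances.

# Sanity check: diagonal elements of the inverse Hessian are variances,
# so they should all be positive
all(diag(solve(hessian_m)) > 0)
## [1] TRUE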

# Estimate the SEs based on the Hessian matrix
fisher_info <- solve(hessian_m)      # inverse Hessian = estimated covariance matrix
prop_se <- sqrt(diag(fisher_info))   # SEs = square roots of the diagonal elements

Now that we have obtained our estimates of the SEs, it would be interesting to compare them with the results of a glm() call that fits the same model.
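The glm_model object below comes from the previous post; schematically, it was fitted along these lines (the outcome and covariate names here are placeholders, not the actual variable names):

# Placeholder sketch of the reference model fitted in the previous post
# glm_model <- glm(y ~ x1 + x2, data = df, family = binomial(link = "logit"))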

# Compare the estimated SEs from our model with those from the glm() fit
library(broom)   # provides tidy() (likely already loaded from the previous post)
library(dplyr)   # provides %>%

ses <- data.frame(se_Rcgmin = prop_se,
                  se_glm = tidy(glm_model)$std.error) %>%
  print()
## se_Rcgmin se_glm
## 1 0.69888433 0.69884208
## 2 0.01065624 0.01065569
## 3 0.37177192 0.37176526
all.equal(ses[, "se_Rcgmin"], ses[, "se_glm"])
## [1] "Mean relative difference: 4.57513e-05"

The differences between the SE estimates obtained via the Hessian matrix and those from the glm() model are very small. It seems our approach did a fairly good job. Hence, we can now use our SE estimates to compute the 95% CIs of our point estimates.
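As a small aside, the multiplier 1.96 used below is simply the 0.975 quantile of the standard normal distribution, rounded to two decimals:

qnorm(0.975)
## [1] 1.959964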

# 8. Estimate 95% CIs using the estimated SEs ----------------------------------
# Extract estimates from the Rcgmin optimisation
coef_test <- coef(opt)["Rcgmin", ]
# Compute Wald-type 95% CIs: estimate +/- 1.96 * SE
upper <- coef_test + 1.96 * prop_se
lower <- coef_test - 1.96 * prop_se
# Print the 95% CIs
data.frame(coef_test, lower = lower, upper = upper, se = prop_se)
## coef_test lower upper se
## p1 -3.05669070 -4.426503993 -1.68687741 0.69888433
## p2 0.02758409 0.006697859 0.04847032 0.01065624
## p3 -0.01131098 -0.739983952 0.71736199 0.37177192

Combining this post with my previous one on optimising a logistic regression using optimx(), we were able to more or less manually obtain the results of a logistic regression model that we would commonly get from the glm() function.
