In case you haven’t noticed, a new version of lme4 (version 1.0-4) was released recently (Sept. 21). For an end user like me, not much has changed, but a few things have:
- No more using the @ operator. After a very helpful email exchange with Ben Bolker, I came to realize that I shouldn’t have been using it in the first place; I just hadn’t discovered all the “accessor” methods that are available (you can get a list with methods(class = "merMod")). I had been using @ in two main contexts:
- To get the fixed-effect coefficients with their standard errors, etc., from the summary. A better way to do that is coef(summary(m)).
- To get model-predicted values. A better way to do that is to use fitted(m), with the added convenience that this returns proportions for logistic models, making the model fits easier (and, I think, more intuitive) to visualize. By the way, a predict() method has now been implemented, which provides an easy way to get model predictions for new data.
- There have been some changes to the optimization algorithms, and some of my models that used to run fine are now giving me convergence warnings. This seems to happen particularly for linear models with within-subject manipulations. Using the bobyqa optimizer instead of the default Nelder-Mead optimizer seems to fix the problem; this can be done by adding control = lmerControl(optimizer = "bobyqa") to the call to lmer. A minor related point: the release notes (https://github.com/lme4/lme4/blob/master/misc/notes/release_notes.md) state that the internal computational machinery has changed, so results will not be numerically identical to those from earlier versions, though they should be very close for reasonably well-defined fits. I have found this to be true for a reasonably large set of models that I’ve re-run.
- When fitting logistic models, if you use lmer(…, family = "binomial"), it will call glmer() as before, but it now also warns you that you should probably be using glmer() directly.
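To make the accessor-method point concrete, here is a quick sketch using the sleepstudy dataset that ships with lme4 (the model formula is just an illustration, not anything from my own work). Note that the re.form argument to predict() is the name used in current lme4 versions:

```r
library(lme4)

# Fit a linear mixed model to the sleepstudy data shipped with lme4
m <- lmer(Reaction ~ Days + (Days | Subject), data = sleepstudy)

# Fixed-effect estimates with standard errors and t-values,
# instead of reaching into the fitted object with @
coef(summary(m))

# Model-predicted values for the original data
head(fitted(m))

# predict() also handles new data; re.form = NA gives
# population-level predictions (random effects set to zero)
newdat <- data.frame(Days = 0:9)
predict(m, newdata = newdat, re.form = NA)
```

The nice thing about sticking to accessors like these is that your code keeps working even when the internal slot structure of merMod objects changes between releases.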
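And here is what switching optimizers looks like in practice. I’m reusing the sleepstudy example as a stand-in; in a real case you would only do this after the default fit produced convergence warnings:

```r
library(lme4)

# If the default Nelder-Mead fit gives convergence warnings,
# refit with the bobyqa optimizer via lmerControl()
m_bobyqa <- lmer(Reaction ~ Days + (Days | Subject), data = sleepstudy,
                 control = lmerControl(optimizer = "bobyqa"))

summary(m_bobyqa)
```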
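Finally, a sketch of the glmer() call the warning is nudging you toward, using the cbpp dataset that ships with lme4 (again, just an illustrative model):

```r
library(lme4)

# Call glmer() directly for logistic models rather than
# lmer(..., family = "binomial")
gm <- glmer(cbind(incidence, size - incidence) ~ period + (1 | herd),
            data = cbpp, family = binomial)

# As noted above, fitted() returns proportions for logistic models,
# which makes the fits easy to plot against the observed data
head(fitted(gm))
```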