A Quick Note on Weighting with nlme


I’ve been doing a lot of meta-analytic things lately. More on that anon. But one quick thing that came up was variance weighting with mixed models in R, and after a few web searches I wanted to post this, more as a note to myself and others than anything else. In a simple linear model, weighting by variance or sample size is straightforward.

# weight by inverse variance
lm(y ~ x, data = dat, weights = 1/v)

# weight by sample size
lm(y ~ x, data = dat, weights = n)

You can use the same sort of weights argument with lmer. But what if you’re using nlme? There are good reasons to do so. Things change a bit, as nlme uses a whole family of variance functions for its weights, which gives it some wonderful flexibility; indeed, that flexibility is a reason to use nlme in the first place! But for such a simple case, to get the equivalent of the above, here’s the tricky little difference. I’m using gls (generalized least squares), but this should work for lme as well; a sketch of the explicit variance-function form follows the example below.

library(nlme)

# inverse-variance weighting: ~v is shorthand for varFixed(~v),
# i.e. residual variance proportional to v
gls(y ~ x, data = dat, weights = ~v)

# sample-size weighting: residual variance proportional to 1/n
gls(y ~ x, data = dat, weights = ~1/n)
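
For clarity, here is a minimal sketch of the same two weighting schemes written with the explicit varFixed() variance function that the one-sided formulas above are shorthand for, plus the lme version mentioned above. The column names y, x, v, and n simply carry over from the toy calls above, and the grouping factor group in the random-effects formula is a hypothetical column, not something from the original example.

library(nlme)

# explicit form of weights = ~v: residual variance proportional to v,
# so each observation is effectively weighted by 1/v
gls(y ~ x, data = dat, weights = varFixed(~v))

# explicit form of weights = ~1/n: residual variance proportional to 1/n,
# so each observation is effectively weighted by its sample size n
gls(y ~ x, data = dat, weights = varFixed(~1/n))

# the same variance function works in lme; 'group' is a hypothetical
# grouping factor for the random intercept
lme(y ~ x, random = ~1 | group, data = dat, weights = varFixed(~v))

The nice thing about varFixed() is that it fixes the variance structure in advance rather than estimating any parameters, which is exactly what inverse-variance or sample-size weighting calls for.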

OK, end note to self. Thanks to John Griffin for prompting this.
