Memory Management in R: A Few Tips and Tricks


This post discusses a few strategies that I have used to manage memory in R.
Stack Overflow Tips
Stack Overflow has a thread on Memory Management Tricks. I tend to follow these suggestions:
  • .ls.objects(): There’s a handy function (.ls.objects()) that lists the objects in the workspace that are using the most memory. It’s good for flagging memory-hogging objects that can be deleted.
  • Use scripts: Hadley Wickham suggests recording all R actions in a script; restarting R and rerunning the script restores every object you need while discarding the temporary objects created while developing the script.
  • Import and Save: Josh Reich mentions the strategy of importing data once and then saving the imported objects to disk so that later sessions can reload them quickly (see the post for details; a minimal sketch follows this list).
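
As a rough illustration of the import-and-save pattern (the file names here are hypothetical, and saveRDS()/readRDS() is just one of several ways to cache an imported object):

# Import the raw data once (slow), then cache the parsed object to disk.
# "experiment.csv" and "experiment.rds" are hypothetical file names.
if (file.exists("experiment.rds")) {
  experiment <- readRDS("experiment.rds")    # fast reload in later sessions
} else {
  experiment <- read.csv("experiment.csv")   # slow one-off import
  saveRDS(experiment, "experiment.rds")      # cache as a compact binary file
}
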
Additional Tricks that I use


Develop code on a subset of the data: I’ve recently been processing logs of key presses from an experiment on skill acquisition. There are a million records. To speed up testing and developing my code, I extract a subset of the data and write the code against that. A lot of people use this approach in model fitting, where a model run on the full dataset would take hours: build and debug the model on a subset, then run it on the full dataset.
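
A minimal sketch of this workflow (the file and object names are hypothetical):

keystrokes <- read.csv("keypress_log.csv")   # full dataset: ~1 million rows
set.seed(123)                                # reproducible subset
dev_rows <- sample(nrow(keystrokes), 10000)
dev_data <- keystrokes[dev_rows, ]           # small subset for development
# ... write and debug the processing code against dev_data ...
# Once it works, rerun the same code on the full keystrokes data frame.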

A tweaked version of .ls.objects():
I slightly tweaked the .ls.objects() function. I find it useful to see the size of objects in megabytes. When I run into memory problems, I run this function to see whether any of the large objects should be removed from the workspace (optionally saving them to disk first).

.ls.objects <- function (pos = 1, pattern, order.by = "Size", decreasing=TRUE, head = TRUE, n = 10) {
  # based on postings by Petr Pikal and David Hinds to the r-help list in 2004
  # modified by: Dirk Eddelbuettel (http://stackoverflow.com/questions/1358003/tricks-to-manage-the-available-memory-in-an-r-session) 
  # I then gave it a few tweaks (show size as megabytes and use defaults that I like)
  # returns a data frame of the objects in the workspace and their storage needs
  # helper: apply fn to each object named in 'names' in the given environment
  napply <- function(names, fn) sapply(names, function(x)
          fn(get(x, pos = pos)))
  names <- ls(pos = pos, pattern = pattern)
  obj.class <- napply(names, function(x) as.character(class(x))[1])
  obj.mode <- napply(names, mode)
  obj.type <- ifelse(is.na(obj.class), obj.mode, obj.class)
  obj.size <- napply(names, object.size) / 10^6 # megabytes
  obj.dim <- t(napply(names, function(x)
            as.numeric(dim(x))[1:2]))
  # for dimensionless objects (other than functions), report length() as rows
  vec <- is.na(obj.dim)[, 1] & (obj.type != "function")
  obj.dim[vec, 1] <- napply(names, length)[vec]
  out <- data.frame(obj.type, obj.size, obj.dim)
  names(out) <- c("Type", "Size", "Rows", "Columns")
  out <- out[order(out[[order.by]], decreasing=decreasing), ]
  if (head)
    out <- head(out, n)
  out
}
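
Having defined the function, a typical clean-up session might look like this (big.object and big.RData are hypothetical names):

.ls.objects()                          # show the 10 largest objects, sizes in MB
save(big.object, file = "big.RData")   # optionally save a large object first
rm(big.object)                         # remove it from the workspace
gc()                                   # garbage-collect to reclaim the memory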

