A detailed guide to memory usage in R

November 12, 2013
(This article was first published on Revolutions, and kindly contributed to R-bloggers)

R is designed as an in-memory application: all of the data you work with must be hosted in the RAM of the machine you're running R on. This design optimizes for performance and flexibility, but it places constraints on the size of the data you can work with (since it must all fit in RAM). When working with large data sets in R, it's important to understand how R allocates, duplicates and consumes memory. The guide to R memory usage in Hadley Wickham's forthcoming book Advanced R Programming is a useful resource for R developers struggling with R's memory usage. It covers how R allocates memory for objects (and how much), the situations in which R makes copies of data (instead of just passing data by reference), and how to track and control the amount of memory that R uses.
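A few base R functions let you see these behaviors directly. The snippet below (a minimal sketch using only base R; the book covers these tools in much more depth) shows how to measure an object's size, watch R's copy-on-modify behavior with tracemem(), and inspect overall memory use with gc():

```r
# How much memory does an object use?
x <- runif(1e6)
object.size(x)   # roughly 8 MB: one 8-byte double per element, plus header

# tracemem() reports whenever R duplicates an object
y <- x           # no copy yet: x and y point at the same memory
tracemem(y)
y[1] <- 0        # modifying y triggers a copy-on-modify duplication,
                 # which tracemem() reports on the console
untracemem(y)

# gc() runs the garbage collector and reports memory currently in use
gc()
```

The key point is that `y <- x` is cheap (both names share one allocation), and the real cost is paid only when one of them is modified.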

By contrast, Revolution R Enterprise ScaleR works with data out of memory: either in a file on disk, or stored in a database or other data repository. By streaming data through memory rather than loading it all into memory at once, Revolution R Enterprise calculates descriptive statistics, machine learning models, and statistical models on large data sets without being limited by the available RAM, and without your having to worry about the details of R memory usage. Learn more about Revolution R Enterprise here.
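The streaming idea itself can be illustrated in base R. The sketch below is not the ScaleR API; it is a hypothetical helper (stream_mean, with an assumed chunk_size parameter) that computes the mean of one column of a large CSV file by reading it in fixed-size chunks, so peak memory use depends on the chunk size rather than the file size:

```r
# Compute the mean of one column of a CSV without loading the whole file.
# Only one chunk of rows is ever held in RAM at a time.
stream_mean <- function(file, column, chunk_size = 10000) {
  con <- file(file, open = "r")
  on.exit(close(con))
  # Read the header line once, stripping any quoting around column names
  header <- gsub('"', '', strsplit(readLines(con, n = 1), ",")[[1]])
  total <- 0
  n <- 0
  repeat {
    # Read the next chunk; an exhausted connection raises an error,
    # which we treat as an empty chunk
    chunk <- tryCatch(
      read.csv(con, nrows = chunk_size, header = FALSE, col.names = header),
      error = function(e) data.frame())
    if (nrow(chunk) == 0) break
    total <- total + sum(chunk[[column]])
    n <- n + nrow(chunk)
  }
  total / n
}
```

Running summaries like this (sums, counts, cross-products) are exactly the kind of statistic that can be accumulated chunk by chunk, which is why out-of-memory systems can fit models on data far larger than RAM.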

Advanced R Programming by Hadley Wickham: Memory
