R references for handling Big data

[This article was first published on Maximize Productivity with Industrial Engineer and Operations Research Tools, and kindly contributed to R-bloggers.]
The Dallas R User Group had a meeting over the weekend. One of the discussions was about R's memory limitations, a common subject among the R community and R user groups. There have been many recent strides in allowing R to stretch past those limitations, so I thought I would compile and share some of the best resources I have found for remedying the big data issue.

CRAN Packages
ff
This package stores large vectors and arrays in flat files on disk, keeping only small chunks in RAM at a time, so objects can be far bigger than available memory.
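A minimal sketch of how a disk-backed vector might be created with ff (assuming the package is installed; `ff()` with `vmode` and `length` arguments is its documented constructor):

```r
library(ff)

# Create a disk-backed double vector of one million elements;
# the data live in a flat file on disk, not in RAM.
x <- ff(vmode = "double", length = 1e6)

x[1:3] <- c(1.5, 2.5, 3.5)  # writes go through to the backing file
x[1:3]                      # reads pull chunks back into memory
```

Indexing looks just like a normal R vector, but only the pages being touched are held in memory.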

bigmemory
This package creates matrices backed by shared memory or by memory-mapped files on disk, so they can exceed R's usual in-memory limits and even be shared across R sessions.
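A minimal sketch using bigmemory's `big.matrix()` (assuming the package is installed; the `nrow`, `ncol`, `type`, and `init` arguments are part of its documented API):

```r
library(bigmemory)

# Allocate a 1000 x 10 matrix outside R's normal heap;
# elements are initialized to zero.
m <- big.matrix(nrow = 1000, ncol = 10, type = "double", init = 0)

m[1, 1] <- 3.14  # element access mirrors a regular matrix
m[1, 1]
```

For matrices larger than RAM, the package also offers file-backed variants that memory-map a file on disk.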

Blog Articles
Taking R to the Limit:  Parallelism and Big Data

Hitting the Big Data Ceiling in R
While this article does not offer a remedy for big data, it does show some of the issues R currently faces, namely the lack of a 64-bit integer ("int64" or long long) data type.
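That 32-bit integer ceiling is easy to demonstrate in base R, with no packages required:

```r
# R's native integer type is a 32-bit signed integer, so this
# is the largest value it can represent:
.Machine$integer.max        # 2147483647

# Adding 1L overflows; R returns NA with a warning rather than
# silently wrapping around.
.Machine$integer.max + 1L
```

Counts or IDs beyond roughly 2.1 billion must therefore be stored as doubles, which trade exactness for range.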

Enterprise Software
Revolution R Enterprise
Revolution Analytics is building enterprise software around R to tackle big data, parallelism, and threaded computing, with the aim of speeding up the processing and analysis of large datasets.
