Handling Large CSV Files in R

[This article was first published on Quantitative Finance Collector, and kindly contributed to R-bloggers.]

A follow-up to my previous post Excellent Free CSV Splitter. I asked a question on LinkedIn about how to handle large CSV files in R / Matlab. Specifically,
Quotation
suppose I have a large CSV file with over 30 million rows; both Matlab and R run out of memory when importing the data. Could you share your way of handling this issue? What I am thinking is:
a) split the file into several pieces (free, straightforward, but hard to maintain);
b) use MS SQL/MySQL (I'd have to learn it, MS SQL isn't free, and it's not straightforward).


A useful summary of the suggested solutions:
1, a three-step pipeline:
1) import the large file via scan() in R;
2) convert it to a data.frame –> to keep the data formats;
3) use cast() –> to group the data into as "square" a format as possible; this step involves the reshape package, a very good one.
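The three steps above can be sketched as follows. This is a minimal illustration, not the original poster's code: the file, column names, and layout are hypothetical, and the cast() call assumes the reshape package is installed.

```r
# Hypothetical example: a small long-format CSV stands in for the 30M-row file.
csv <- tempfile(fileext = ".csv")
writeLines(c("id,month,value",
             "1,Jan,10", "1,Feb,20", "2,Jan,30", "2,Feb,40"), csv)

# 1) scan() is leaner than read.table() because the column types are
#    declared up front instead of being guessed from the data.
raw <- scan(csv, what = list(id = 0L, month = "", value = 0),
            sep = ",", skip = 1, quiet = TRUE)

# 2) convert to a data.frame to keep the column formats
df <- as.data.frame(raw, stringsAsFactors = FALSE)

# 3) cast() the long data into a "square" wide format (reshape package)
library(reshape)                       # install.packages("reshape")
wide <- cast(df, id ~ month, value = "value")
```

For genuinely large files, scan() would typically be combined with the nlines/skip arguments to read the file in chunks rather than all at once.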

2, use the bigmemory package to load the data, so in my case, using read.big.matrix() instead of read.table(). There are several other interesting functions for memory-conscious work, such as mwhich() in bigmemory replacing which(), and the companion foreach package instead of for() loops, etc. How large a file can this package handle? I don't know, but the authors successfully loaded a CSV as large as 11GB.
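A hedged sketch of the bigmemory route, again on a tiny hypothetical file; it assumes install.packages("bigmemory") has been run. Note that big.matrix objects hold a single numeric type, so this suits all-numeric CSVs.

```r
library(bigmemory)

# Hypothetical stand-in for the large file
csv <- tempfile(fileext = ".csv")
writeLines(c("x,y", "1,10", "2,20", "3,30"), csv)

# read.big.matrix() parses the file into a big.matrix; with a backingfile
# argument it can be file-backed, so the data need not fit in RAM.
bm <- read.big.matrix(csv, header = TRUE, sep = ",", type = "double")

# mwhich() filters without materialising a full logical vector in memory,
# unlike which(bm[, "y"] >= 15) which would copy the column first.
rows <- mwhich(bm, cols = "y", vals = 15, comps = "ge")
sub <- bm[rows, ]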

3, switch to a 64-bit version of R with enough memory, preferably on Linux. I can't test this solution at my office due to administrative constraints, although it is doable, as mentioned in the R help documents,
Quotation
64-bit versions of Windows run 32-bit executables under the WOW (Windows on Windows) subsystem: they run in almost exactly the same way as on a 32-bit version of Windows, except that the address limit for the R process is 4GB (rather than 2GB or perhaps 3GB)….The disadvantages are that all the pointers are 8 rather than 4 bytes and so small objects are larger and more data has to be moved around, and that far less external software is available for 64-bit versions of the OS.
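Before migrating, it is easy to check which build you are running and roughly how much memory the data would need; the snippet below uses only base R (the 30-million-row figure is from the question above).

```r
# 8 on a 64-bit build of R, 4 on a 32-bit build
.Machine$sizeof.pointer

# Architecture string, e.g. "x86_64"
R.version$arch

# Rough sizing: one numeric column of 30 million rows is ~240 MB
# (8 bytes per double), before any copies R makes during import.
bytes_per_double <- 8
col_mb <- 30e6 * bytes_per_double / 2^20
col_mb
```

A data set that fits comfortably on a 64-bit build can still fail to import on 32-bit R, where the whole process is capped at 2-4 GB of address space as the quoted passage explains.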


More searching and trial to come.

Tags – r , csv