Erik Sigur, Information Technologist for the Department of Statistics and Probability at Michigan State University, writes at ReadWriteWeb about using Revolution R Enterprise to provide high-performance computation in R to the researchers in his department:
Our search for a more effective version of R ultimately brought us to a product called Revolution R Enterprise by Revolution Analytics, which provides commercial support and software for open source R. It takes advantage of multi-core hardware by using optimized assembly code and efficient multi-threaded algorithms that run on all of the processor cores simultaneously.
The department at MSU provides high-performance computing facilities via its Statistical Computing Cluster: a network of high-performance PCs running under the Microsoft HPC Server environment. Revolution R Enterprise 5 provides features to distribute R jobs among the nodes of a cluster, and to use the power of distributed computing to reduce the time required to process big-data statistical analyses. Says Erik:
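The general pattern of distributing independent R tasks across workers can be sketched with the open-source foreach and doParallel packages; this is only an illustration of the idea, not Revolution's proprietary HPC Server integration, and the two local workers here stand in for cluster nodes:

```r
# Minimal sketch: farm independent iterations out to parallel workers.
# (Assumes the CRAN packages foreach and doParallel are installed;
# a real cluster deployment would register an HPC backend instead.)
library(foreach)
library(doParallel)

cl <- makeCluster(2)     # two local worker processes stand in for nodes
registerDoParallel(cl)

# Each iteration runs independently on a worker, so a job of many
# independent simulations finishes roughly in time / (number of workers).
res <- foreach(i = 1:4, .combine = c) %dopar% i^2

stopCluster(cl)
res  # 1 4 9 16
```

Because the iterations share no state, the same loop body scales from a laptop to a cluster simply by registering a different parallel backend.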
Once the department could schedule R jobs in an HPC environment, demand increased drastically. The HPC cluster is now scheduling more than four times the number of jobs scheduled in previous semesters, from 200 jobs a year ago to over 800 jobs this past semester. Jobs that took over three months to complete on open source R were completed in a matter of days with Revolution R. Computational jobs can now be run multiple times, with significantly higher levels of accuracy than ever before.
Learn more about Erik's experiences with Revolution R Enterprise at the link below. You can also read the case study on the Revolution Analytics website, or learn more about the HPC features of Revolution R Enterprise in our archived webinar.
ReadWriteEnterprise: [Case Study] Lessons in High Performance Computing with Open Source