The O'Reilly Radar blog has a lengthy and very interesting interview with the lead and deputy CIOs of the Consumer Financial Protection Bureau, the new US government agency devoted to consumer protections in the financial markets. In that interview, they talk about the many open-source tools used in the agency (and the parent Treasury Department): Linux, WordPress, Splunk, Django, Git and yes — R for data analysis with Big Data. (The CFPB has also made public its policy on use of open-source software.) They reveal an interesting cultural shift related to the use of R in this question at the end of the interview:
How do you see open source contributing to your ability to get insights from large amounts of data? If you're recruiting developers, can they actually make a difference in helping their fellow citizens?
[…] R is an interesting example. What we're finding is that as more people are coming out of academia into the professional world, they're actually used to using R in school. And then they have to come out and learn a different tool and they're actually working in the marketplace.
It's similar with the Mac versus the PC. You get people using the Mac in college — and suddenly they have to go to a Windows interface. Why impose that on them? If they're going to be extremely productive with a tool like R, why not allow that to be used?
Here at Revolution Analytics, we're hearing much the same story from commercial organizations whose data analysis platforms are built around legacy systems like SAS. When the new recruits coming out of school are already trained in R, the choice is between spending a year training them up in SAS (incurring both the expense and the opportunity cost of a year's lost work) or setting up a platform based on a commercially supported R distribution. And more and more often, that second choice is the more compelling one.
O'Reilly Radar: Open source is interoperable with smarter government at the CFPB