(This article was first published on **Econometrics Beat: Dave Giles' Blog**, and kindly contributed to R-bloggers)

"Big Data" = data that come in amounts that are too large for current computer hardware and software to deal with. That sounds like fun!

Norman Nie developed the well-known SPSS statistical package in the 1960s, and is currently President and CEO of **Revolution Analytics**, a California company that promotes the use of the **R** computing environment for handling complex data analysis problems.

This **recent piece**, based on an interview with Nie, makes some interesting points:

- "….parallelized software running on inexpensive multiprocess computers is the wave of the future for all types of big data computing. But the transition will be slow." [*Yep – that's what a lot of us are doing now in our simulation work* – DG]
- "The US is currently experiencing an acute shortage of mathematicians and others trained in related fields such as statistics." [*And not just the U.S.* – DG]
- "Data analytics requires knowledge in multiple fields. For instance, a math major might need some familiarity with social sciences….. [*Such as economics* – DG]. And candidates with degrees in the social sciences often lack sufficient math training." [*Yep – math. and statistics* – DG]
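For readers wondering what "parallelized simulation work" looks like in practice in R, here is a minimal sketch (my own illustration, not from the interview or the post) of a Monte Carlo experiment spread across CPU cores with the base `parallel` package; the statistic simulated (the sample median of a standard normal sample) is just a placeholder.

```r
# Illustrative only: a small parallel Monte Carlo study using base R's
# 'parallel' package. Each replication computes the median of 100 N(0,1) draws.
library(parallel)

one_rep <- function(i) median(rnorm(100))       # one simulation replication

n_cores <- max(1L, detectCores() - 1L)          # leave one core free
cl <- makeCluster(n_cores)                      # portable PSOCK cluster
clusterSetRNGStream(cl, 123)                    # reproducible parallel RNG
reps <- parSapply(cl, 1:10000, one_rep)         # run replications in parallel
stopCluster(cl)

mean(reps)   # simulated expected value of the sample median (near 0 here)
```

A PSOCK cluster (rather than `mclapply`) is used so the sketch also runs on Windows, where forking is unavailable.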

Keep an eye on Revolution Analytics through their blog, **Revolutions**.

© 2011, David E. Giles
