The role of the data analyst is changing. The best analysts know that different tasks require different tools.
The End of Bundling
Once upon a time, data purchases came bundled with data consumption software. You couldn’t just buy a dataset; you had to install a proprietary data downloader. You couldn’t just view the data on your computer; you needed a custom visualizer, or even a full-fledged data terminal. And you couldn’t do whatever you wanted with the data; you were forced to work within the applications provided to you by the vendor.
Today, this seems absurd. In 2016, analysts don’t want to be restricted in the way they work. Different analysts have different toolkits, preferences, and workflows. A small team may use Matlab for initial exploration, R for analysis, Excel for prototyping, Python for production code, and Tableau for visualization – all in the course of a single project.
The Changing Role of the Data Analyst
Today’s analysts are multiskilled because their role has evolved. They need to be excellent data cleaners and organizers. They need to specify computational models and extract statistical insights. They need to optimize their code, building systems that can handle large volumes of data at high speeds. They need to visualize their results. They need to present their findings and connect them to business outcomes. And each of these tasks requires different tools.
Expecting analysts to manage their entire workflow within the confines of a single tool, no matter how powerful, is unrealistic. In fact, it’s counterproductive. Every tool has strengths and weaknesses. Some tools are right for certain jobs and wrong for others. A one-size-fits-all approach is a recipe for disaster.
The best analysts know this. And that’s why they are comfortable with a variety of tools and applications. Today’s best analysts know that sometimes you need R, and sometimes you need Python. Sometimes you crunch numbers in Matlab, and sometimes you use Mathematica. And knowing when to use each tool is a valuable skill unto itself.
The Forced Evolution of Financial Data Providers
Unfortunately, the financial data industry hasn’t always kept pace with these changing workflow patterns. By bundling data and delivery, traditional providers place crippling restraints on what analysts can do with the data they’ve purchased. It’s their way or the highway.
But a new wave of data providers is making it easier for analysts to work with data in the tool of their choice. For these providers, the concept of “Bring Your Own Tool” is not something to fear. It’s a way to deliver even more value to their users. By empowering their users with multiple tools, they increase the value their users can extract from the data – which means they increase the value of the data itself. It’s a win-win.
The Quandl Paradigm
Quandl embraces this shift. We make it easy for users to get the data they need, in the form they want – whatever that form is. Need data as a CSV, JSON, or XML download? That’s a click away. Want to pipe data into Python, R, Excel, Maple or Matlab? We have built-in integrations for these and 20 other tools. Want to view data in Tiingo or Money.Net, or pick stocks using WooTrader, or chart trends on TradingView or Plotly? Easy. Want direct access to the raw data via an API? We provide that as well.
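The point about multiple delivery formats can be sketched in a few lines of Python. The sample time series below is hypothetical (it is not actual Quandl output), but it illustrates why format choice is a convenience rather than a constraint: CSV and JSON deliveries of the same dataset parse to identical rows, so the analyst’s downstream tools never need to care which format was requested.

```python
import csv
import io
import json

# Hypothetical sample of a daily time series, as a vendor might
# deliver it in two different formats.
CSV_TEXT = """Date,Close
2016-01-04,105.35
2016-01-05,102.71
"""

JSON_TEXT = '{"data": [["2016-01-04", 105.35], ["2016-01-05", 102.71]]}'

def rows_from_csv(text):
    """Parse the CSV delivery into (date, close) tuples."""
    reader = csv.DictReader(io.StringIO(text))
    return [(row["Date"], float(row["Close"])) for row in reader]

def rows_from_json(text):
    """Parse the JSON delivery into the same (date, close) tuples."""
    payload = json.loads(text)
    return [(date, float(close)) for date, close in payload["data"]]

# Both formats yield identical data once parsed.
assert rows_from_csv(CSV_TEXT) == rows_from_json(JSON_TEXT)
```

In practice the text would come from a download or an API response rather than a string literal, but the principle is the same: the format is an interface detail, not a lock-in mechanism.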
Our goal is to empower the analyst: to take her from needing data, to having that data in the tool of her choice, in a matter of seconds. That’s the best way to unlock the true value of data, and that’s why we built Quandl.
What’s your go-to toolkit for data analysis? Share your recommendations below!
The post Bring Your Own Tool: the Latest Trend in Data Analysis appeared first on Quandl Blog.