Tutorial: Azure Data Lake analytics with R


The Azure Data Lake Store is an Apache Hadoop file system compatible with HDFS, hosted and managed in the Azure cloud. You can store and access the data within it directly via the API, by connecting the file system to Azure HDInsight services, or through HDFS-compatible open-source applications. And for data science applications, you can also access the data directly from R, as this tutorial explains.

To interface with Azure Data Lake, you'll use U-SQL, a SQL-like language extensible using C#. The R Extensions for U-SQL allow you to reference an R script from a U-SQL statement and pass data from Data Lake into that script. There's a 500 MB limit on the data passed to R, so the basic idea is that you perform the main data-munging tasks in U-SQL and then pass the prepared data to R for analysis. With this data you can use any function from base R or any R package. (Several common R packages are provided in the environment, or you can upload and install other packages directly, or use the checkpoint package to install everything you need.) The R engine used is R 3.2.2.
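To make that handoff concrete, here is a minimal sketch of a U-SQL job with inline R, in the spirit of the tutorial's first exercise. The input path, column names, and schema are hypothetical; the sketch assumes the R extensions assembly (ExtR) is registered in your ADLA account, that the rowset being reduced is exposed to the R script as the data frame inputFromUSQL, and that whatever is assigned to outputToUSQL is returned to U-SQL.

    REFERENCE ASSEMBLY [ExtR];

    // Inline R script (hypothetical): fit a linear model on each partition and
    // return the coefficient table. inputFromUSQL and outputToUSQL are the data
    // frames the R extension uses to exchange rows with U-SQL.
    DECLARE @myRScript string = @"
        fit <- lm(y ~ x, data = inputFromUSQL)
        outputToUSQL <- data.frame(summary(fit)$coefficients)
        colnames(outputToUSQL) <- c(""Estimate"", ""StdError"", ""tValue"", ""p"")
    ";

    @input =
        EXTRACT grp string, x double, y double
        FROM @"/samples/r/example.csv"   // hypothetical input path
        USING Extractors.Csv();

    // Each distinct value of grp becomes one partition, i.e. one R invocation.
    @result =
        REDUCE @input ON grp
        PRODUCE grp string, Estimate double, StdError double, tValue double, p double
        READONLY grp
        USING new Extension.R.Reducer(command : @myRScript, rReturnType : "dataframe");

    OUTPUT @result TO @"/samples/r/output.csv" USING Outputters.Csv();

Partitioning with REDUCE is also how you keep the input to each R invocation within the size limit: each partition is handed to R separately rather than the whole rowset at once.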

The tutorial walks you through the following stages:

  • Exercise 1 – Inline R code.
  • Exercise 2 – Deploy an R script as a resource (see the sketch after this list).
  • Exercise 3 – Check the R environment in ADLA and install the magrittr package.
  • Exercise 4 – Use checkpoint to create a zip file containing the required packages and all their dependencies, and demonstrate how to deploy and use the dplyr package.
  • Exercise 5 – Understand how partitioning works in ADLA, specifically the REDUCE operation.
  • Exercise 6 – Use rReturnType set to “charactermatrix”.
  • Exercise 7 – Save the trained model. Deploy trained model for scoring.
  • Exercise 8 – Use R scripts as combiners (using COMBINE and Extension.R.Combiner).
  • Exercise 9 – Use rxDTree() from RevoScaleR package.
  • Advanced – More complex examples.
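For the pattern in Exercise 2, the job looks much the same as the inline version above, except the R code lives in a separate file that is deployed with the job and referenced by name. Below is a minimal sketch under the same assumptions, with a hypothetical script score.R (written to the same inputFromUSQL/outputToUSQL convention and producing the columns named in PRODUCE) already uploaded to the store.

    REFERENCE ASSEMBLY [ExtR];

    // Hypothetical R script previously uploaded to the Data Lake Store;
    // DEPLOY RESOURCE makes it available to every vertex by file name.
    DEPLOY RESOURCE @"/samples/r/score.R";

    @input =
        EXTRACT grp string, x double, y double
        FROM @"/samples/r/example.csv"
        USING Extractors.Csv();

    @result =
        REDUCE @input ON grp
        PRODUCE grp string, Estimate double, StdError double, tValue double, p double
        READONLY grp
        USING new Extension.R.Reducer(scriptFile : "score.R", rReturnType : "dataframe");

    OUTPUT @result TO @"/samples/r/scored.csv" USING Outputters.Csv();

Keeping the script as a deployed resource rather than an inline string makes it easier to version the R code and reuse it across jobs.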

To get started with the tutorial, check the link below for the prerequisites and then hop on over to GitHub for the complete walkthrough. You can also follow along with this Jupyter notebook.

Cortana Intelligence Gallery: Getting Started with using Azure Data Lake Analytics with R – A Tutorial
