Tutorial: Azure Data Lake analytics with R

October 11, 2017

(This article was first published on Revolutions, and kindly contributed to R-bloggers)

The Azure Data Lake Store is an Apache Hadoop file system compatible with HDFS, hosted and managed in the Azure cloud. You can store and access the data in it directly via the API, by connecting the filesystem to Azure HDInsight services, or via HDFS-compatible open-source applications. And for data science applications, you can also access the data directly from R, as this tutorial explains.

To interface with Azure Data Lake, you'll use U-SQL, a SQL-like language extensible using C#. The R Extensions for U-SQL allow you to reference an R script from a U-SQL statement and pass data from Data Lake into the R script. There's a 500 MB limit on the data passed to R, but the basic idea is that you perform the main data-munging tasks in U-SQL and then pass the prepared data to R for analysis. With this data you can use any function from base R or from any R package. (Several common R packages are provided in the environment, or you can upload and install other packages directly, or use the checkpoint package to install everything you need.) The R engine used is R 3.2.2.
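To make that concrete, here's a minimal sketch of what such a job looks like, assuming the U-SQL R extensions are enabled in your Data Lake Analytics account. The file paths, column names, and the toy R script are placeholders rather than code from the tutorial; the key pieces are the ExtR assembly reference, the inline R script, and the Extension.R.Reducer that runs it once per partition.

```
// Minimal sketch (placeholder paths and columns): run an inline R script
// over each partition of a Data Lake file via the U-SQL R extensions.
REFERENCE ASSEMBLY [ExtR];

// Each partition's rows arrive in R as the data frame inputFromUSQL;
// whatever is assigned to outputToUSQL is returned to U-SQL.
// READONLY columns (Par below) are passed through by U-SQL itself,
// so the R script does not need to return them.
DECLARE @myRScript string = @"
outputToUSQL <- data.frame(MeanValue = mean(inputFromUSQL$Value))
";

@InputData =
    EXTRACT Par int,
            Value double
    FROM @"/mydata/input.csv"
    USING Extractors.Csv();

@RScriptOutput =
    REDUCE @InputData ON Par
    PRODUCE Par int,
            MeanValue double
    READONLY Par
    USING new Extension.R.Reducer(command : @myRScript, rReturnType : "dataframe");

OUTPUT @RScriptOutput
TO @"/mydata/summary.csv"
USING Outputters.Csv();
```

The exercises below are variations on this skeleton: deploying the R script as a separate file, installing extra packages, changing rReturnType, and so on.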

The tutorial walks you through the following stages:

  • Exercise 1 – Inline R code.
  • Exercise 2 – Deploy R script as resource (see the sketch after this list).
  • Exercise 3 – Check the R environment in ADLA. Install the magrittr package.
  • Exercise 4 – Using checkpoint to create a zip file containing required packages and all the dependencies. Demonstrate how to deploy and use dplyr package.
  • Exercise 5 – Understanding how partitioning works in ADLA. Specifically the REDUCE operation.
  • Exercise 6 – About rReturnType as “charactermatrix”.
  • Exercise 7 – Save the trained model. Deploy trained model for scoring.
  • Exercise 8 – Use R scripts as combiners (using COMBINE and Extension.R.Combiner).
  • Exercise 9 – Use rxDTree() from RevoScaleR package.
  • Advanced – More complex examples.
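As an example of the pattern behind Exercises 2 and 3, here is a rough sketch (the file names are placeholders, not the tutorial's): an R script and a package archive previously uploaded to the Data Lake Store are shipped to the compute vertices with DEPLOY RESOURCE, and the reducer references the script by file name via scriptFile instead of an inline command.

```
// Sketch of the deploy-as-resource pattern (placeholder file names).
REFERENCE ASSEMBLY [ExtR];

// Ship a previously uploaded R script and package archive to the vertices.
DEPLOY RESOURCE @"/mydata/R/myScript.R";
DEPLOY RESOURCE @"/mydata/R/magrittr_1.5.zip";

@InputData =
    EXTRACT Par int,
            Value double
    FROM @"/mydata/input.csv"
    USING Extractors.Csv();

@RScriptOutput =
    REDUCE @InputData ON Par
    PRODUCE Par int,
            Result double
    READONLY Par
    USING new Extension.R.Reducer(scriptFile : "myScript.R", rReturnType : "dataframe");

OUTPUT @RScriptOutput
TO @"/mydata/result.csv"
USING Outputters.Csv();
```

Inside the deployed script, the archive can then be installed locally with something along the lines of install.packages("magrittr_1.5.zip", repos = NULL) before library(magrittr) is called; Exercise 4 extends the same idea by using checkpoint to bundle a package and all of its dependencies into a single zip.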

To get started with the tutorial, check the link below for the prerequisites and then hop on over to GitHub for the complete walkthrough. You can also follow along with this Jupyter notebook.

Cortana Intelligence Gallery: Getting Started with using Azure Data Lake Analytics with R – A Tutorial
