Using the Google Prediction API from R


Google provides a “black box” prediction API for tasks like building recommender systems or filtering spam. They also provide an R package for interfacing with that API, but try as I might, I cannot get it to work under Windows. Here are the instructions for setting up the API to run in R under Linux. I haven’t tried this out yet, so let me know in the comments if it works, or if you can get it to run on Windows.

First, we have to set up the Google Prediction API, as well as some dependencies:
1. Go to the Google APIs Console. This is your home base for managing Google APIs.
2. In the upper left-hand corner of the website (under the Google APIs logo) is a dropdown menu. Use this to create a new project, called something informative like “R predictions.”
3. Activate the Google Storage API and turn it on. Activating it may open a new page.
4. Activate the Google Prediction API and turn it on. Activating it may open a new page.
5. Click on the “Billing” tab, and make sure billing is enabled. You may have to enter your billing information. Note that you get 5 GB of free storage through the end of 2011, and the Prediction API has a free quota of 5 MB of training data per day and 100 predictions per day, up to 20,000 total predictions.
6. Click the “Google Storage” tab, and make a note of the “x-goog-project-id.” You will need this when setting up gsutil.

Next, we have to install some software on our computer to enable communication between R and the Prediction API:
1. Install Python, if you do not already have it.
2. Make sure you can run Python from the command prompt. You may need to add Python to your “path” or “environment” variables to do this. On Windows, run the command prompt as administrator.
3. Install the R packages rjson and RCurl using install.packages() in R.
4. Make sure you can open .tar archives. This is no problem on Mac/Linux systems, but on Windows you need 7-Zip.
5. Download gsutil, and follow the directions to install it on your system. This is the tricky part.
6. When you run gsutil for the first time, make sure to use the command “python gsutil config -b” to allow gsutil to open a web page and authorize access to your Google Storage account.
7. When prompted, enter the project ID you recorded in part 1.
8. Download the googlepredictionapi package.
9. Open R, and setwd() to the folder containing the downloaded package.
10. Install the R package from source using this command:
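
A command along these lines should work. The archive file name below is only a guess, so substitute the name of the .tar.gz file you actually downloaded:

# Dependencies from step 3, in case they are not installed yet:
install.packages(c("rjson", "RCurl"))

# Install googlepredictionapi from the downloaded source archive
# (the file name is an example; use the file you downloaded):
install.packages("googlepredictionapi_1.0.tar.gz", repos = NULL, type = "source")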

Now we’re all set to start using the Prediction API:
1. First, we need to create a bucket to store our data. Do this from the Google Storage web console. Name your bucket something useful, like rdata. Don’t use capital letters or symbols.
2. Run the following script to test that everything works. Note that you have to save your data frame as a .csv file before gsutil can upload it to Google Storage for modeling:
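
The script below is a minimal sketch of that workflow, using the iris data and a bucket named rdata. The PredictionApiTrain() and predict() calls follow the googlepredictionapi package's documented usage as I understand it, so treat the exact arguments as assumptions and check the package documentation if they differ:

library(googlepredictionapi)

# Write the data frame to CSV in the format the Prediction API expects:
# the target variable in the first column and no header row.
data(iris)
iris.out <- iris[, c("Species", "Sepal.Length", "Sepal.Width",
                     "Petal.Length", "Petal.Width")]
write.table(iris.out, file = "iris.csv", sep = ",",
            row.names = FALSE, col.names = FALSE)

# Upload the CSV to the bucket created above (assumed to be named "rdata").
system("gsutil cp iris.csv gs://rdata/iris.csv")

# Train a model on the uploaded data and inspect it.
my.model <- PredictionApiTrain(data = "gs://rdata/iris.csv")
summary(my.model)

# Predict the species of a single new observation.
predict(my.model, c(5.1, 3.5, 1.4, 0.2))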

Good luck!  Here are some links for future reference:
1. Google’s directions for installing the googlepredictionapi package in R
2. Google’s directions for installing gsutil
3. The Google APIs Console, for managing APIs and billing
4. The Google Storage console, for managing buckets
5. Google APIs overview/introduction

