
Analyzing accelerometer data with R

[This article was first published on Revolutions, and kindly contributed to R-bloggers.]

Using your smartphone (any modern phone with a built-in accelerometer should work), visit the Cast Your Spell page created by Nick Strayer. (If you need to type the address into your phone's browser directly, here's a shortlink: bit.ly/castspell .) Scroll down, tap the "Press To Cast!" button, and then wave your phone like a wand in one of the shapes shown.

The app will attempt to detect which of the four "spells" you gestured. It was pretty confident in its detection when I cast "Incendio", but your mileage may vary depending on your wizarding ability and the underlying categorization model. 

Nick Strayer described how he built this application in a presentation at Data Day Texas last month. The app itself was built using Shiny, with the shinysense package (on GitHub) collecting movement data from the phone's accelerometer. Nick trained a convolutional neural network (on his own casting-gesture data) using the keras package to classify each gesture as one of the four "spells". (Interesting side note: because the CNN is not sensitive to the direction of time in the gesture, you can cast a spell in reverse and it will still be classified correctly.)
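To get a feel for the classification side, here is a minimal sketch of a 1D convolutional network in the keras R package for gesture data of this kind. The shapes and layer sizes are assumptions for illustration (100 timesteps of x/y/z acceleration, four spell classes), not Nick's actual architecture:

```r
library(keras)

# Assumed input: 100 timesteps x 3 accelerometer axes (x, y, z).
# Output: a probability over 4 "spell" classes.
model <- keras_model_sequential() %>%
  layer_conv_1d(filters = 16, kernel_size = 5, activation = "relu",
                input_shape = c(100, 3)) %>%
  layer_max_pooling_1d(pool_size = 2) %>%
  layer_conv_1d(filters = 32, kernel_size = 5, activation = "relu") %>%
  # Pooling over the time axis discards ordering information,
  # which is one reason a reversed gesture can still classify correctly.
  layer_global_average_pooling_1d() %>%
  layer_dense(units = 4, activation = "softmax")

model %>% compile(
  loss = "categorical_crossentropy",
  optimizer = "adam",
  metrics = "accuracy"
)
```

Training would then call `fit()` on arrays of recorded gestures and one-hot spell labels; for the app itself, the movement data feeding such a model comes from shinysense's motion-recording module inside Shiny.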

It's almost like magic! For the complete details on how the Cast Your Spell app was constructed, see Nick Strayer's presentation at the link below.

Data Day Texas 2018: Making Magic with Keras and Shiny (Nick Strayer)

