R Package to Download Fitbit Data

January 22, 2015
(This article was first published on Stats and things, and kindly contributed to R-bloggers)

Fitbit is a device that tracks your daily activity. It's basically a pedometer, but it does a little more. It has an altimeter, so it can count flights of stairs climbed. It can detect your sleep activity and give you a read on how often you toss and turn. Anyhow, I received a Fitbit One for Christmas and have been unexpectedly into this thing. I absolutely did not expect to be motivated by monitoring my fitness activity. It's been fun.

The problem is that Fitbit doesn't allow you to export your data under their “free” data tracking tier. It costs $50 a month to upgrade to “premium” and unlock data export. It's my data; I should be able to get it. So, I made an R package to do just that.

fitbitScraper is my new R package, hosted on GitHub for now, to download Fitbit data. It uses the internal API that Fitbit uses to populate your dashboard, which provides data as granular as 15-minute intervals as well as daily aggregates. Basically, the package logs into fitbit.com with your email and password and is returned a cookie. Using that cookie, the internal API is accessible. For the time being, I can only get that cookie if you log in with email and password. Facebook and Google login are available, but I haven't put any effort into figuring out how to get a cookie if one of those methods is used. One could copy and paste their cookie information into R and use the package that way…
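For example, if you've fished the cookie value out of your browser's developer tools, something like this should work (a rough sketch I haven't tested myself; the cookie string below is just a placeholder):

# Hypothetical workaround: skip login() and supply a session cookie
# copied from the browser's developer tools (placeholder value).
cookie <- "paste-your-fitbit-session-cookie-here"
df <- get_15_min_data(cookie, what="steps", date="2015-01-10")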

Here's a quick example of downloading and graphing step data for a given day. On this particular day, I played basketball from about 6am until 9am.

devtools::install_github("corynissen/fitbitScraper")
library("fitbitScraper")
# read the password from a file to keep it out of the .Rmd document
mypassword <- readLines("pw.txt")
cookie <- login(email="[email protected]", password=mypassword)
df <- get_15_min_data(cookie, what="steps", date="2015-01-10")

library("ggplot2")
ggplot(df) + geom_bar(aes(x=time, y=data, fill=data), stat="identity") +
  xlab("") + ylab("steps") +
  theme(axis.ticks.x=element_blank(),
        panel.grid.major.x=element_blank(),
        panel.grid.minor.x=element_blank(),
        panel.grid.minor.y=element_blank(),
        panel.background=element_blank(),
        panel.grid.major.y=element_line(colour="gray", size=.1),
        legend.position="none")

[Plot: steps per 15-minute interval on 2015-01-10]
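The data frame that comes back has a time column and a data column (the same columns the plot maps), so it's easy to poke at directly. For example:

# total steps for the day, and the busiest 15-minute interval
sum(df$data)
df[which.max(df$data), ]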

Here's an example of the daily data.

df <- get_daily_data(cookie, what="steps", start_date="2015-01-13", end_date="2015-01-20")
library("ggplot2")
ggplot(df) + geom_bar(aes(x=time, y=data), stat="identity") +
  xlab("") + ylab("steps") +
  theme(axis.ticks.x=element_blank(),
        panel.grid.major.x=element_blank(),
        panel.grid.minor.x=element_blank(),
        panel.grid.minor.y=element_blank(),
        panel.background=element_blank(),
        panel.grid.major.y=element_line(colour="gray", size=.1),
        legend.position="none")

[Plot: daily step totals, 2015-01-13 through 2015-01-20]
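The daily data frame follows the same shape, so a quick summary is a one-liner:

# average steps per day over the requested date range
mean(df$data)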

I have to give a shout out to this guy for inspiration. I don't know Perl, so I couldn't really use anything there, but it gave me confidence that I could figure it out.

I hope this is of some use to some of you R hackers out there…
