Entering and Exiting 2018

[This article was first published on Data Imaginist, and kindly contributed to R-bloggers.]

The year is nearly over and it is time for reflection and navel-gazing. I
don’t have incredibly profound things to say, but a lot of things happened in
2018 and this is as good a time as any to go through it all…

Picking Myself Up

The prospects in my “2017 in review”
post were not particularly rosy… I had hit something of a burnout in terms of
programming, but was nonetheless positive and had a great job and a lot of
positive feedback on patchwork.
Further, I had RStudio::conf to look forward to, which would be my first IRL
face-to-face with the R community at large. I had also promised to present a
fully-fledged tidy approach to network analysis and while both
ggraph and
tidygraph had already been released
there were things I wanted to develop prior to presenting it. All in all, there
was a great impetus to pick myself up and get on with developing (not arguing
that this is a fail-safe way to deal with burnout, by the way).


My trip to San Diego was amazing. If you ever get to go to an RStudio conference
I don’t think you will be disappointed (full disclosure and spoiler-alert: I now
work for RStudio). My suspicion that the R community is as amazing in real life
as on Twitter was confirmed and it was great to finally get to see all those
people I admire and look up to. My talk went fairly well, I think; I haven’t
watched the recordings as I don’t particularly enjoy watching myself talk,
but you can,
if you are so inclined. At the conference I got to chat a bit with Jenny Bryan
(one of the admire/look-up-to people referenced above) and we discussed what we
were going to talk about in our respective keynotes at useR in Brisbane in the
summer. I half-jokingly said that I might talk about gganimate
because that would give me the required push to actually begin developing it…

Talk-Driven Development

Around April, Dianne Cook was pressing me for at least a talk title for my
keynote, and at that point I had already imagined a couple of slides on
gganimate, so I thought “to heck with it” and responded with the daunting title of
The Grammar of Animation. At that point I had still not written a single line
of code for gganimate, and knew that tweenr
would need a serious update to support what I had in mind. In addition, I knew
I had to develop what ended up as transformr
before I could begin with gganimate proper. All in all, my talk title could not
have been more stress-inducing…

Thankfully I had a pretty clear vision in my head (which was also why I wanted
to talk about it) so the motivation was there to drag me along for the ride.
Another great benefit of developing tools for data visualisation in general,
and animation in particular, is that it sets Twitter on fire. After getting tweenr
and transformr into a shape sufficient to support gganimate, I began to create
the backbone of the package, and once I shared the first animation created with
it, it was clear that I was in the pursuit of something that resonated with a
lot of people.

To my great surprise I was able to get gganimate to a state where it actually
supported the main grammar I had in mind prior to useR, and I could begin to make
the presentation I had in mind:

useR was a great experience, not only because I was able to give the talk I had
hoped for, but also due to the greatness of the organisers and the attendees. I
got to meet many members of R Core for the first time, and they were very
supportive of my quest to improve the performance of the R graphics stack
(last slide of my talk), so I had high hopes that this might be achievable
within the next 5–10 years (it is no small task). I had been surprised by the
support for my ideas about animation and its relevance within the R community,
so in general the conference left me invigorated and with the stamina to
complete gganimate.


I managed to release a couple of packages that do not fit into the narrative I’m
trying to create for this year, but they deserve a mention nonetheless.

At the beginning of the year I was able to finish off
particles, a port and extension of the
d3-force algorithm developed by Mike Bostock. It can be used for both great fun
and work and did among other things result in this beautiful pixel-decomposition
of Hadley:

While making improvements to tweenr in anticipation of gganimate, it became
clear that colour conversion was a main bottleneck, and I ended up developing
farver to improve on this. Beyond very fast colour conversion, it also allows a
range of different colour distance calculations to be performed. Some of the
discussion that followed the development of this package led to Brodie Gaslam
improving the colour conversion performance in base R. While it is not as fast
as farver, it is pretty close, and future versions of R will definitely benefit
from his great contribution.
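As a small sketch of the sort of operations the package provides (assuming farver is installed from CRAN; `convert_colour()` and `compare_colour()` are its two core functions):

```r
# Assumes the farver package is installed from CRAN
library(farver)

# Convert an RGB colour (as a 1x3 matrix) to Lab space
red <- matrix(c(255, 0, 0), ncol = 3)
convert_colour(red, from = "rgb", to = "lab")

# Compute a perceptual distance between two colours,
# here using the CIE2000 colour-difference formula
blue <- matrix(c(0, 0, 255), ncol = 3)
compare_colour(red, blue, from_space = "rgb", method = "cie2000")
```

Both functions operate on matrices of colours at once, which is where the speed advantage over converting one colour at a time really shows.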

I haven’t had much time to make generative art this year, but I did manage to
find time for some infrastructure work that will support my endeavours in this
space in the future. The ambient package
is able to produce all sorts of multidimensional noise in a very performant way
due to the speed of the underlying C++ library. I’m planning to expand on this
package quite a bit when I get the time as I have lots of cool ideas for how to
manipulate noise in a tidy manner.
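In its current form, generating noise is a one-liner (a sketch assuming the ambient package is installed; `noise_perlin()` and `noise_simplex()` are among its generators):

```r
# Assumes the ambient package is installed from CRAN
library(ambient)

# Generate a 64x64 grid of Perlin noise values
noise <- noise_perlin(dim = c(64, 64))
dim(noise)   # a 64 x 64 matrix

# Other generators follow the same pattern, e.g. simplex noise
simplex <- noise_simplex(dim = c(64, 64))
```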

How you use colours in data visualisation is extremely important, which is also
why the data visualisation community has embraced the viridis colour scale to
the extent that it has. I’ve personally grown tired of the aesthetic though, so
when I saw the range of perceptually uniform palettes developed by Fabio
Crameri, I was quick to bring them to R with the scico
package. To my surprise, the development of a colour palette package became my
most contentious contribution this year (that I know of), so I welcome everyone
who is tired of colour palette packages to ignore it altogether.
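For those who haven’t ignored it, usage looks roughly like this (assuming scico is installed from CRAN; `scico()` returns hex colours, and ggplot2 scales are provided as well):

```r
# Assumes the scico package is installed from CRAN
library(scico)

# List the available Crameri palettes
scico_palette_names()

# Draw 5 colours from the "berlin" palette as hex strings
scico(5, palette = "berlin")

# ggplot2 scales are also provided, e.g.:
# scale_fill_scico(palette = "vik")
```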


Prior to useR I had begun to receive some cryptic questions from Hadley, and it
was clear that he was either trolling me or that something was brewing. During
the late summer it became clear that it was the latter (thankfully), as RStudio
wanted me to work full time on improving the R graphics stack. Working for
RStudio on something so aligned with my own interests is beyond what I had
hoped for, so despite my joy in working for the Danish tax authorities the
switch was a no-brainer. I wish my former office all the best (they are doing
incredible work) and look forward to seeing some of them at RStudio::conf in
Austin later in the month.

Being part of the tidyverse team has so far been a great experience. I’ve been
lucky enough to meet several of them already as part of the different
conferences I attended this year, so working remotely with them doesn’t feel
that strange. It can be intimidating to work with such a talented team, but if
that is the least of my concerns I’m pretty sure I can manage that.

I look forward to sharing the performance improvements I’m making with all of
you throughout the coming years, and hopefully I’ll have time to also improve
on some of my packages that have received less attention during the development
of gganimate.

Happy New Year!
