
I have been experimenting recently with using my relatively inexpensive camera to capture the feeling or sense of a place, with an eye towards collecting a record of field sites in my ecological work. The key is for the result to be more immersive than a simple photograph, so you can more easily recall the feeling of being there. There are several good reasons one might want to do this:

• To have something to look back on to refresh your memory of a field site when it starts to fade, usually well before you actually start writing up the results of your project.
• When you’ve been sitting at your computer so long that you begin to lose your connection with the biology, immersing yourself in images of your field sites can revive the feeling of being there, along with the passion and motivation the place inspires.
• Scientific outreach is greatly aided by helping other people understand what it was like where you worked, especially if you work somewhere few people get to experience.
• It is much better to show interested colleagues or collaborators where you worked (and what you did) than to try to describe it to them.

To really get a sense of a field site, a great thing to do is capture a 360 degree panoramic view. This was once only possible with a very time-consuming technique and fancy equipment (at the very least a tripod). But these days, you can do a pretty good job with just a cheap camera and little more technique than it takes to do a decent selfie. So if you don’t have $60,000 to spare for an Elphel Eyesis (the camera Google uses to take their Street View photos), though that would be amazing, you can still get a decent 360 degree photo.

The key is in the advances that have been made in software over the last few years. It is now possible to take a series of handheld pictures, standing in one place and shooting in all directions (including upwards and downwards), and have the software automatically detect where the photos overlap, then stitch them together. In the past, you had to be very careful with the centre of rotation when taking the photos, to reduce parallax, and the photos sometimes had to overlap by more than 50%. Now software can deal with slightly different viewpoints and much smaller overlap (I’d recommend aiming for about 15% overlap). It also helps to rotate around the axis of your lens when taking photos in all directions, rather than around the axis of your feet, which reduces parallax, but it doesn’t have to be perfect.

I have been using Autopano Giga for my stitching. It is excellent but not free (I have access to it through my work), so I plan on investigating the free, open-source alternative Hugin. I may post a tutorial on that at a later date.

## Ways to display 360 degree images

Once you have stitched together your images, there is the question of how to display all 360 degrees at once. In this post I will talk about using a special projection known as ‘Little Planet’ to make a striking image that lets you see all directions at once.
In Part 2 of this blog series I will show how to convert a 360 degree image into a Google Street View style interactive, immersive photosphere. This perhaps provides the greatest opportunity to feel like you are back in a place, but it takes a few more steps. In the meantime, you can produce a Little Planet projection.

Most good panoramic software should be able to produce a Little Planet image. It works by projecting and warping the panorama so that it wraps around and meets itself again. This creates a ball-shaped image with the photographer’s viewpoint at the centre, such that the 360 degree view falls away in all directions as your eye moves out from the centre. To get a sense of it, here are two Little Planets I made of local parks here in Perth, Western Australia.

If you try to make one of these in a software package, you will likely find that the full sphere is not entirely complete. You might be missing some of your sky, or you may be able to see your disembodied feet in the centre of the Little Planet. In most cases you will need to do a bit of touch-up to fix these issues. If you have Photoshop, content-aware delete is amazing for automatically filling in the background texture over some errant photographer boots. But if you can’t afford Photoshop, the free, open-source software GIMP also has an impressive inpainting tool that seems like magic.

I really wish I had come across this technique before I did my month-long field work in the Kimberley region of Australia (the banner image at the top of this post is from that trip). It is so remote and so rarely visited by outsiders that it would have been fantastic to have some more immersive images of the place (plus it is so remote that I may never make it back; I was lucky to go once). As it was, we took a lot of video, which is also great.
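To see what the projection is doing under the hood, here is a minimal sketch in base R of a polar (‘Little Planet’ style) remap of an equirectangular panorama, applied to a small synthetic grayscale matrix. The function `little_planet()` and all its parameters are my own names for illustration, not from any stitching package; real software uses a proper stereographic projection with interpolation, while this is crude nearest-neighbour sampling.

```r
## Minimal 'Little Planet' style remap: wrap an equirectangular panorama
## (rows run zenith at top to nadir at bottom, cols span 0..360 degrees)
## into a square image with the nadir at the centre.
## NOTE: little_planet() is an illustrative name, not a real package function.
little_planet <- function(pano, out_size = 100) {
  h <- nrow(pano); w <- ncol(pano)
  out <- matrix(NA_real_, out_size, out_size)
  centre <- (out_size + 1) / 2
  for (i in seq_len(out_size)) {
    for (j in seq_len(out_size)) {
      dx <- j - centre; dy <- i - centre
      r <- sqrt(dx^2 + dy^2) / (out_size / 2)   # 0 at centre, 1 at edge
      if (r > 1) next                           # outside the 'planet'
      theta <- atan2(dy, dx)                    # viewing direction, -pi..pi
      src_col <- floor((theta + pi) / (2 * pi) * (w - 1)) + 1
      src_row <- floor((1 - r) * (h - 1)) + 1   # centre maps to bottom row (nadir)
      out[i, j] <- pano[src_row, src_col]
    }
  }
  out
}

## synthetic panorama: dark 'sky' at the top, bright 'ground' at the bottom
pano <- matrix(rep(seq(0, 1, length.out = 50), each = 100),
               nrow = 50, ncol = 100, byrow = TRUE)

planet <- little_planet(pano, out_size = 120)
```

In the result, the bright ground ends up in the centre of the disc and the dark sky falls away towards the edges, which is exactly the Little Planet effect.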
The trouble with video is that it takes extensive editing to make it watchable, and even then, watching it is a time investment. The advantage of Little Planet images is that you can bring them up at a moment’s notice if you just need a quick ‘hit’ of a place’s essence. Plus you can print them out and put them on your wall.

Just for fun, we can even use these Little Planets as a cool backdrop for a circular phylogeny. Here’s how you can do it in R, after cropping and lightening my Little Planet pic:

```r
library(jpeg)
library(treebase)
library(ape)
library(wesanderson)

## load little planet image
backimg <- readJPEG("Little-Planets/HerdsmanLake_edit_small_centred4.jpg")

## grab a tree for waterfowl
tree <- search_treebase("3269", "id.tree", "tree", TRUE)
## http://purl.org/phylo/treebase/phylows/tree/find?query=tb.identifier.tree==3269&format=rss1&recordSchema=tree
## Query resolved, looking at each matching resource...
## 1 resources found matching query
## Attempting try 1
## Looking for nexus files...
## Tree read in successfully
## dropped 0 objects

par(bg = "gray")

## set up the tree plot without drawing it
tt <- plot(tree[[1]], type = "fan", plot = FALSE, no.margin = TRUE)

## plot the background image on the blank plot
rasterImage(backimg, tt$x.lim[1], tt$y.lim[1], tt$x.lim[2], tt$y.lim[2])

## draw tree with thick green branches
plot.phylo.add(tree[[1]], type = "fan", cex = 0.75,
               edge.color = wes.palette(4, "Rushmore")[3],
               edge.width = 6, label.offset = 0.1, node.depth = 2)

## redraw tree with thinner white branches, to give an outline effect
## (the exact arguments of this second call did not survive in the post;
## this is a plausible reconstruction mirroring the call above)
plot.phylo.add(tree[[1]], type = "fan", cex = 0.75,
               edge.color = "white", edge.width = 2,
               label.offset = 0.1, node.depth = 2)
```

The phylogeny I grabbed off TreeBase, and it is from the paper: Livezey, B. C. (1996). A phylogenetic analysis of geese and swans (Anseriformes: Anserinae), including selected fossil species. Systematic Biology, 45(4), 415-450.

This could probably use some tweaking in a graphics program. As it is, it looks a bit busy. But it was fun to try this in R. By the way, in case you are wondering, the function plot.phylo.add() is just the plot.phylo() function from the package ape, slightly modified so it will plot over something that has already been plotted. I simply commented out this line:

```r
plot.default(0, type = "n", xlim = x.lim, ylim = y.lim, xlab = "",
             ylab = "", axes = FALSE, asp = asp, ...)
```


That is the line where the function sets up the plot before it draws the tree. Since the plot had already been set up, this line isn’t needed; if it is left in, the plot gets redrawn and the background image is lost.
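If you would rather not modify package internals, a similar layering effect can often be achieved in base R with `par(new = TRUE)`, which tells the next high-level plot call not to clear the device first. Here is a minimal sketch of that general pattern (points drawn over a raster background; the background here is a synthetic gradient standing in for the Little Planet image, and the output file name is just a temporary file):

```r
## draw a raster background, then overlay a new plot without clearing it
png(tmp <- tempfile(fileext = ".png"), width = 400, height = 400)

## synthetic gradient standing in for the Little Planet image
bg <- matrix(seq(0, 1, length.out = 100), nrow = 100, ncol = 100)

plot.new()
plot.window(xlim = c(0, 1), ylim = c(0, 1), asp = 1)
rasterImage(bg, 0, 0, 1, 1)

## par(new = TRUE) keeps the existing device contents, so the next
## plot() call draws on top of the background instead of replacing it
par(new = TRUE)
plot(runif(20), runif(20), xlim = c(0, 1), ylim = c(0, 1),
     pch = 19, col = "white", axes = FALSE, xlab = "", ylab = "")

dev.off()
```

This avoids editing `plot.phylo()` at all, though for the fan phylogeny the commented-out-line approach above gives finer control over the coordinate system.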

What would make this plot even cooler would be little silhouette icons of the species at the tips. I will work on that.

In Part 2 I will show how to turn 360 degree panorama images into an interactive 3D environment using Google Views, like this one here:

Now, wouldn't you love to have one of these for your field sites?