Spatial data extraction around buffered points in R

November 8, 2014

(This article was first published on Ecology in silico, and kindly contributed to R-bloggers)

Quantifying spatial data (e.g. land cover) around points can be done in a variety of ways, some of which require considerable amounts of patience, clicking around, and/or cash for a license.
Here's a bit of code that I cobbled together to quickly extract land cover data from the National Land Cover Database (NLCD) for buffered regions around points (e.g. small ponds or point-count locations); in this case, the points are U.S. capitals.

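The point data are assumed to live in a plain CSV with a site id and WGS84 latitude/longitude, matching the column names (id, longitude, latitude) that the extraction function below reads. A hypothetical example, with rough illustrative coordinates:

# hypothetical sites.csv: one row per point, columns id, longitude, latitude
# (coordinates are rough illustrations, not exact capital locations)
sites <- data.frame(
  id        = c("Montgomery", "Juneau", "Phoenix"),
  longitude = c(-86.3, -134.4, -112.1),
  latitude  = c(32.4, 58.3, 33.4)
)
write.csv(sites, "sites.csv", row.names = FALSE)
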
Here, extract_cover() is a function to do the extraction (with the help of the raster package), and extraction.R makes a parallel call to the function using the doMC package:

extract_cover.R
# Function to extract the data in R instead of Arc
# inputs: year, buffer size (in meters), point data file, 
#   cover data directory, and output file directory
# output: csv file with site id, cover type, and % in buffer

extract_cover <- function(year, buffer,
                          point_d = "sites.csv",
                          data_dir="NLCD_data",
                          write_dir="extracted"){
  require(raster)
  require(rgdal)
  require(stringr)

  # load cover data 
  filename <- paste(data_dir, "/nlcd_",
                    year,
                    "_landcover_2011_edition_2014_03_31.img",
                    sep="")
  NLCD <- raster(filename)

  # load site data
  sites <- read.csv(point_d, header=T)
  coords <- sites[, c("longitude", "latitude")]

  #convert lat/lon to appropriate projection
  names(coords) <- c("x", "y")
  coordinates(coords) <- ~x + y
  proj4string(coords) <- CRS("+proj=longlat +ellps=WGS84 +datum=WGS84")
  crs_args <- NLCD@crs@projargs
  sites_transformed <- spTransform(coords, CRS(crs_args))

  #extract land cover data for each point, given buffer size
  Landcover <- extract(NLCD, sites_transformed, buffer=buffer)

  # summarize each site's data by proportion of each cover type
  summ <- lapply(Landcover, function(x){
    prop.table(table(x))
  }
  )

  # generate land cover number to name conversions
  num.codes <- unique(unlist(Landcover))
  cover.names <- NLCD@data@attributes[[1]]$Land.Cover.Class[num.codes + 1]
  levels(cover.names)[1] <- NA # first level is ""
  conversions <- data.frame(num.codes, cover.names)
  conversions <- na.omit(conversions)
  conversions <- conversions[order(conversions$num.codes),]

  # convert to data frame
  mydf <- data.frame(id = rep(sites$id, lapply(summ, length)),
                     cover = names(unlist(summ)),
                     percent = unlist(summ)
  )

  # create cover name column
  mydf$cover2 <- mydf$cover
  levels(mydf$cover2) <- conversions$cover.names

  # write output
  out_name <- paste(write_dir, "/",
                    year, "_cover_", buffer,
                    "_m_buffer.csv", sep="")
  write.csv(mydf, out_name)
}
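Sourcing extract_cover.R, the function can also be run on its own for a quick check before parallelizing. A minimal sketch, assuming sites.csv and the NLCD_data/ directory are set up as above; note that write.csv() needs the output directory to exist already:

source("extract_cover.R")

# one-off run: 2011 NLCD layer, 1 km buffer around each site
dir.create("extracted", showWarnings = FALSE)  # default write_dir; not created by the function
extract_cover(year = 2011, buffer = 1000)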

extraction.R
# Script to actually extract the data at various buffer distances,
# parallelized for speed (one core per year)
library(doMC)             # also attaches foreach, which provides %dopar%
source("extract_cover.R") # defines extract_cover()

years <- c(2001, 2006, 2011)
nyears <- length(years)
registerDoMC(nyears)

# input vector of distances (in meters)
buffer_distances <- c(1000)

foreach (i=1:nyears) %dopar% {
  for (j in buffer_distances){
    extract_cover(year = years[i], buffer=j)
  }
}
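Each call writes a long-format CSV (one row per site and cover type) into the write_dir folder. If a site-by-cover table is more convenient for analysis, the output can be reshaped with base R; a sketch, assuming the 2011 / 1000 m file produced above:

# read one output file and cast it to wide format:
# rows = site id, columns = cover class, cells = proportion of the buffer
# (the 'percent' column actually holds proportions from prop.table())
cover_long <- read.csv("extracted/2011_cover_1000_m_buffer.csv")
cover_wide <- xtabs(percent ~ id + cover2, data = cover_long)
round(cover_wide, 3)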
