Aggregating Results from Unreliable Functions in R

May 12, 2006


I posted this as a response to a question on R-help. I think the idea of a “collect” function could be useful in two situations: with unreliable functions that occasionally throw errors, and in filtering contexts where the usual workaround is to build a list of the good elements plus some sentinel, usually NULL, for the failures, and then strip the sentinels out in a separate subsetting step after the main loop. A sketch of that sentinel pattern follows.
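
Here is a minimal sketch of that sentinel approach, with my own illustrative names (xs, risky, and so on are not from the original post):

    # Conventional workaround: store a NULL sentinel for every failure,
    # then filter the sentinels out in a second pass.
    xs    <- runif(20, min = -2, max = 8)                        # test data
    risky <- function(x) if (x < 0) stop("bad x") else x         # stand-in unreliable function
    vals  <- lapply(xs, function(xi) tryCatch(risky(xi), error = function(e) NULL))
    good  <- unlist(vals[!sapply(vals, is.null)])                # second pass drops the NULLs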

Here’s an example:

    d <- runif(20, min=-2, max=8) # test data

    aFunc <- function(x) {  # gives error occasionally
        if (x < 0)
          stop("encountered bad x")
        x
    }

    collect <- function(x, FUN, skip_error=TRUE, args_list=NULL) {
        if (!is.vector(x))
          stop("arg x must be a vector")
        fname <- deparse(substitute(FUN))   # names used in the error message
        xvar <- deparse(substitute(x))
        i <- 1                              # index into x
        j <- 1                              # index into result (successes only)
        result <- vector(mode=mode(x), length=length(x))
        while (i <= length(x)) {
            tryCatch({
                args <- list(x[i])
                if (length(args_list))
                  args <- c(args, args_list)
                ans <-, args)
                result[j] <- ans
                j <- j + 1
            }, error=function(e) {
                if (!skip_error) {
                    msg <- paste("collect\n",
                                 "call to", fname, "failed at",
                                 paste(xvar, "[", i, "]\n", sep=""),
                                 "Message:\n", conditionMessage(e))
                    stop(msg, call.=FALSE)
                }
            }, finally={i <- i + 1})        # advance to the next element either way
        }
        if (j > 1)
          result[1:(j - 1)]                 # keep only the successful results
        else
          vector(mode=mode(x), length=0)
    }

## Example

    collect(d, aFunc, skip_error=FALSE)
    Error: collect
     call to aFunc failed at d[2]
     encountered bad x

    collect(d, aFunc, skip_error=TRUE)
     [1] 7.7380303 0.7554328 1.8352623 0.5136118 4.4231091 2.5368103 1.8656615
     [8] 2.9244200 2.1364120 7.6711189 0.2141325 7.8216620 5.8347576 5.3939892
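
The args_list argument is not exercised above; it passes fixed extra arguments through to FUN on every call. A small illustration of my own (safeLog is a made-up helper, assuming the collect definition given earlier):

    safeLog <- function(x, base) {              # errors instead of returning NaN
        if (x <= 0) stop("log of non-positive x")
        log(x, base = base)
    }
    collect(d, safeLog, skip_error = TRUE, args_list = list(base = 2))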
