RStudio:addins part 4 – Unit testing coverage investigation and improvement, made easy


Introduction

A developer always pays his technical debts! And we have a debt to pay to the gods of coding best practices, as we have not presented many unit tests for our functions yet. Today we will show how to efficiently investigate and improve unit test coverage for our R code, with a focus on the functions governing our RStudio addins, which have their own specifics.

As a practical example, we will do a simple restructuring of one of our functions to increase its test coverage from a mere 34% to over 90%.

[Figure: The pretty rewards for your tests]

Fly-through of unit testing in R

Much has been written on the importance of unit testing, so we will not spend more time convincing the reader, but rather very quickly provide a few references in case the reader is new to unit testing with R. In the later parts of the article we assume that these basics are known.

In a few words

  • devtools – Makes package development easier by providing R functions that simplify common tasks
  • testthat – Is the most popular unit testing package for R
  • covr – Helps track test coverage for R packages and view reports locally or (optionally) upload the results

For a starter guide to using testthat within a package, visit the Testing section of R Packages by Hadley Wickham. I would also recommend checking out the showcase of the 2.0.0 release of testthat itself.
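
To give a flavour of what this looks like in practice, here is a minimal, hypothetical test file as it could live under tests/testthat/ in a package; addOne is a made-up stand-in for a function the package would export:

library(testthat)

addOne <- function(x) x + 1  # stand-in for a package function under test

test_that("addOne increments its input by one", {
  expect_equal(addOne(1), 2)
  expect_error(addOne("a"))
})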

Investigating test coverage within a package

For the purpose of investigating the test coverage of a package we can use the covr package. Within an R project, we can call the package_coverage() function to get a nicely printed high-level overview, or we can provide a specific path to a package root directory and call it as follows:

# This looks much prettier in the R console ;)
covr::package_coverage(pkgPath)
## jhaddins Coverage: 59.05%
## R/viewSelection.R: 34.15%
## R/addRoxytag.R: 40.91%
## R/makeCmd.R: 92.86%

For a deeper investigation, converting the results to a data.frame can be very useful. The output below shows, for each group of code lines, the number of times the given expression was called while running our tests:

covResults <- covr::package_coverage(pkgPath)
as.data.frame(covResults)[, c(1:3, 5, 11)]
##             filename         functions first_line last_line value
## 1     R/addRoxytag.R            roxyfy         10        12     6
## 2     R/addRoxytag.R            roxyfy         11        11     2
## 3     R/addRoxytag.R            roxyfy         13        15     4
## 4     R/addRoxytag.R            roxyfy         14        14     2
## 5     R/addRoxytag.R            roxyfy         16        16     2
## 6     R/addRoxytag.R            roxyfy         17        17     2
## 7     R/addRoxytag.R            roxyfy         18        18     2
## 8     R/addRoxytag.R        addRoxytag         29        29     0
## 9     R/addRoxytag.R        addRoxytag         30        37     0
## 10    R/addRoxytag.R        addRoxytag         32        34     0
## 11    R/addRoxytag.R        addRoxytag         38        38     0
## 12    R/addRoxytag.R    addRoxytagCode         44        44     0
## 13    R/addRoxytag.R    addRoxytagLink         50        50     0
## 14    R/addRoxytag.R     addRoxytagEqn         56        56     0
## 15       R/makeCmd.R           makeCmd         20        24     5
## 16       R/makeCmd.R           makeCmd         21        21     0
## 17       R/makeCmd.R           makeCmd         23        23     5
## 18       R/makeCmd.R           makeCmd         25        27     5
## 19       R/makeCmd.R           makeCmd         26        26     4
## 20       R/makeCmd.R           makeCmd         28        32     5
## 21       R/makeCmd.R           makeCmd         33        35     5
## 22       R/makeCmd.R           makeCmd         34        34     2
## 23       R/makeCmd.R           makeCmd         36        38     5
## 24       R/makeCmd.R           makeCmd         37        37     1
## 25       R/makeCmd.R           makeCmd         39        39     5
## 26       R/makeCmd.R      replaceTilde         48        50     1
## 27       R/makeCmd.R      replaceTilde         49        49     1
## 28       R/makeCmd.R      replaceTilde         51        51     1
## 29       R/makeCmd.R        executeCmd         61        61     5
## 30       R/makeCmd.R        executeCmd         62        66     5
## 31       R/makeCmd.R        executeCmd         68        72     3
## 32       R/makeCmd.R        executeCmd         69        69     0
## 33       R/makeCmd.R        executeCmd         71        71     3
## 34       R/makeCmd.R runCurrentRscript         90        90     1
## 35       R/makeCmd.R runCurrentRscript         91        91     1
## 36       R/makeCmd.R runCurrentRscript         92        96     1
## 37       R/makeCmd.R runCurrentRscript         93        95     1
## 38       R/makeCmd.R runCurrentRscript         94        94     0
## 39 R/viewSelection.R     viewSelection          7         7     0
## 40 R/viewSelection.R     viewSelection          8        12     0
## 41 R/viewSelection.R     viewSelection         10        10     0
## 42 R/viewSelection.R     viewSelection         13        13     0
## 43 R/viewSelection.R  getFromSysframes         24        24     6
## 44 R/viewSelection.R  getFromSysframes         25        25     3
## 45 R/viewSelection.R  getFromSysframes         26        26     3
## 46 R/viewSelection.R  getFromSysframes         28        28     3
## 47 R/viewSelection.R  getFromSysframes         29        29     3
## 48 R/viewSelection.R  getFromSysframes         30        30     3
## 49 R/viewSelection.R  getFromSysframes         31        31    92
## 50 R/viewSelection.R  getFromSysframes         32        32    92
## 51 R/viewSelection.R  getFromSysframes         33        33    92
## 52 R/viewSelection.R  getFromSysframes         34        34     2
## 53 R/viewSelection.R  getFromSysframes         37        37     1
## 54 R/viewSelection.R        viewObject         56        56     3
## 55 R/viewSelection.R        viewObject         57        57     3
## 56 R/viewSelection.R        viewObject         58        58     3
## 57 R/viewSelection.R        viewObject         61        61     0
## 58 R/viewSelection.R        viewObject         64        64     0
## 59 R/viewSelection.R        viewObject         65        65     0
## 60 R/viewSelection.R        viewObject         66        66     0
## 61 R/viewSelection.R        viewObject         69        69     0
## 62 R/viewSelection.R        viewObject         70        70     0
## 63 R/viewSelection.R        viewObject         71        71     0
## 64 R/viewSelection.R        viewObject         74        74     0
## 65 R/viewSelection.R        viewObject         76        76     0
## 66 R/viewSelection.R        viewObject         77        77     0
## 67 R/viewSelection.R        viewObject         79        79     0
## 68 R/viewSelection.R        viewObject         81        81     0
## 69 R/viewSelection.R        viewObject         82        82     0
## 70 R/viewSelection.R        viewObject         83        83     0
## 71 R/viewSelection.R        viewObject         88        88     0
## 72 R/viewSelection.R        viewObject         89        89     0
## 73 R/viewSelection.R        viewObject         91        91     0
## 74 R/viewSelection.R        viewObject         92        92     0
## 75 R/viewSelection.R        viewObject         93        93     0
## 76 R/viewSelection.R        viewObject         96        96     0

Calling covr::zero_coverage with a coverage object returned by package_coverage will provide a data.frame with the locations that have zero test coverage. The nice thing about running it within RStudio is that the results are shown on the Markers tab, where we can easily investigate them:

zeroCov <- covr::zero_coverage(covResults)
[Figure: zero_coverage markers]

Test coverage for RStudio addin functions

Investigating our code, let us focus on the results for viewSelection.R, which has a very weak 34% test coverage. We can analyze exactly which lines have no test coverage in a specific file:

zeroCov[zeroCov$filename == "R/viewSelection.R", "line"]
##  [1]  7  8  9 10 11 12 13 61 64 65 66 69 70 71 74 76 77 79 81 82 83 88 89
## [24] 91 92 93 96

Looking at the code, we can see that the first chunk of lines, 7:13, represents the viewSelection function, which just calls lapply and invisibly returns NULL. The main weak spot, however, is the function viewObject, of which we only test the early return in case an invalid chr argument is provided. None of the other functionality is tested.

The reason behind this is that RStudio functionality is not available while the tests run, so we cannot test even the not-so-well designed return values: they are almost always preceded by a call to rstudioapi or other RStudio-related functionality such as the object viewer, because that is what these functions are designed to do. This means we must restructure the code so that the RStudio-dependent functionality is contained to the necessary minimum, keeping the big majority of the code testable, and only calling the side-effecting rstudioapi when actually executing the addin functionality itself.

Rewriting an addin function for better coverage

We will now show one potential way to solve this issue for the particular case of our viewObject function.

The idea behind the solution is to only return the arguments for the call to the RStudio API-related functionality, instead of executing the call in the function itself - hence the rename to getViewArgs.

This way we can test the function's return value against the expected arguments, and only execute the call with do.call in the addin execution wrapper itself. A picture may be worth a thousand words, so here is the diff with the relevant changes, followed by a simplified sketch of the resulting code:

[Figure: Refactoring for testability - diff of the relevant changes]
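
For readers who prefer code to diffs, here is a minimal sketch of the resulting structure. It is not the package's exact code: getFromSysframes is the package's existing helper, while the function bodies below are simplified assumptions for illustration only.

# Testable part: builds the arguments, never touches the RStudio API
getViewArgs <- function(chr) {
  if (!is.character(chr) || length(chr) != 1L || !nzchar(chr)) {
    return(NULL)  # early return for an invalid chr argument
  }
  obj <- getFromSysframes(chr)
  if (is.null(obj)) {
    return(NULL)
  }
  # Only describe the call to be made, do not make it here
  list(what = "View", args = list(x = obj, title = chr))
}

# Addin wrapper: the only place with RStudio-dependent side effects
viewSelection <- function() {
  context <- rstudioapi::getActiveDocumentContext()
  lapply(context[["selection"]], function(sel) {
    viewArgs <- getViewArgs(sel[["text"]])
    if (!is.null(viewArgs)) {
      do.call(viewArgs[["what"]], viewArgs[["args"]])
    }
  })
  invisible(NULL)
}

Because getViewArgs only builds a plain list, it can run, and be tested, entirely outside of RStudio.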

Testing the rewritten function and gained coverage

Now that the return values are testable across the entire getViewArgs function, we can easily write tests to cover the whole function. A couple of examples:

test_that("getViewArgs for function"
        , expect_equal(
            getViewArgs("reshape")
          , list(what = "View", args = list(x = reshape, title = "reshape"))
          )
        )
test_that("getViewArgs for data.frame"
        , expect_equal(
            getViewArgs("datasets::women")
          , list(what = "View",
                 args = list(x = data.frame(
                     height = c(58, 59, 60, 61, 62, 63, 64, 65,
                                66, 67, 68, 69, 70, 71, 72),
                     weight = c(115, 117, 120, 123, 126, 129, 132, 135,
                                139, 142, 146, 150, 154, 159, 164)
                     ),
                   title = "datasets::women"
                   )
            )
          )
        )

Looking at the test coverage after our changes, we can see that we are at more than 90% coverage for viewSelection.R:

# This looks much prettier in the R console ;)
covResults <- covr::package_coverage(pkgPath)
covResults
## jhaddins Coverage: 82.05%
## R/addRoxytag.R: 40.91%
## R/viewSelection.R: 90.57%
## R/makeCmd.R: 92.86%

And looking at the lines that are not covered for viewSelection.R, we can indeed see that the only uncovered lines left are those of the viewSelection function, which is responsible only for executing the addin itself:

covResults <- as.data.frame(covResults)
covResults[covResults$filename == "R/viewSelection.R" &
             covResults$value == 0, c(1:3, 5, 11)]
##             filename     functions first_line last_line value
## 39 R/viewSelection.R viewSelection         13        17     0
## 40 R/viewSelection.R viewSelection         15        15     0
## 41 R/viewSelection.R viewSelection         16        16     0

In an ideal world we would of course also want to automate the testing of the addin execution itself, by examining whether its effects in the RStudio IDE are as expected; however, that is far beyond the scope of this post. For some of our addin functionality we can nevertheless test the side-effects directly, for example when the addin should produce a file with certain content (a sketch of such a test follows below).
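
As an illustration of that last point, here is a sketch of such a side-effect test with testthat; writeSnippet is a hypothetical helper that writes a snippet to a file, used here only to show the pattern:

test_that("writeSnippet produces a file with the expected content", {
  tmpFile <- tempfile(fileext = ".R")
  on.exit(unlink(tmpFile), add = TRUE)
  writeSnippet("x <- 1", path = tmpFile)  # hypothetical addin helper
  expect_true(file.exists(tmpFile))
  expect_identical(readLines(tmpFile), "x <- 1")
})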

TL;DR - Just give me the package

References
