The growing demand for fast and automated development of predictive models has contributed to the popularity of machine learning frameworks. ML frameworks allow us to quickly build models that maximize a selected performance measure. However, the result is often a black-box model in which problems are difficult to detect early enough. Insufficiently tested models quickly lose their effectiveness, lead to unfair or discriminatory decisions, are distrusted by users, and offer no possibility of appeal.
In order to build models responsibly, we need tools for exploring, debugging, and explaining model predictions. A growing number of methods can be used for this purpose. The map below divides them into three groups: (1) tools for building models that are interpretable by design (although this is not always easy), (2) tools for exploring models with a specific structure, and (3) universal tools that explore models in a structure-agnostic fashion.
We have prepared an overview of the most popular R packages that can be used to build interpretable models or to explore complex ones. Example knitr notebooks for more than 30 packages are available at http://xai-tools.drwhy.ai/.
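To give a flavor of the structure-agnostic tools from group (3), here is a minimal sketch using the DALEX package (one of the packages covered in the overview). It assumes DALEX is installed and uses the titanic_imputed dataset shipped with the package; the choice of a logistic regression model is purely illustrative.

```r
library(DALEX)

# Fit any model -- here a simple logistic regression on the
# titanic_imputed data bundled with DALEX.
model <- glm(survived ~ ., data = titanic_imputed, family = "binomial")

# Wrap the model in a structure-agnostic explainer: DALEX only needs
# the model, the data, and the true labels.
explainer <- explain(model,
                     data  = titanic_imputed[, -8],
                     y     = titanic_imputed$survived,
                     label = "logistic regression")

# Model-level exploration: permutation-based variable importance.
model_parts(explainer)
```

The same `explainer` object can then be passed to other functions, such as `model_profile()` for partial-dependence profiles or `predict_parts()` for instance-level explanations, regardless of the underlying model structure.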
We hope that collecting these packages in one place will increase their visibility and thus lead to better, more transparent, and more reliable models. The examples show that the individual packages are easy to use while offering many different features.
If you see another package that should be included in the list, or points that should be taken into account, let us know (add an issue here). Contributions are more than welcome!
If you are interested in other posts about explainable, fair, and responsible ML, follow #ResponsibleML on Medium.
R packages for eXplainable Artificial Intelligence was originally published in ResponsibleML on Medium, where people are continuing the conversation by highlighting and responding to this story.