
Plotting Decision Trees in R with rpart and rpart.plot

[This article was first published on Steve's Data Tips and Tricks, and kindly contributed to R-bloggers].

Introduction

Decision trees are a popular machine learning method that can be used for both classification and regression tasks. They are easy to understand and interpret, and they can capture non-linear relationships with relatively little feature engineering.

Once you have trained a decision tree model, you can use it to make predictions on new data. However, it can also be helpful to plot the decision tree to better understand how it works and to identify any potential problems.
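For instance, once a tree has been fitted, predict() returns class labels for new observations. A minimal sketch using the built-in iris data (here we predict on the first few rows just to illustrate the call):

```r
library(rpart)

# Fit a classification tree on the full iris data set
fit <- rpart(Species ~ ., data = iris, method = "class")

# Class predictions for the first few rows; type = "class"
# returns labels rather than class probabilities
preds <- predict(fit, newdata = head(iris), type = "class")
print(preds)
```

With type = "prob" (the default for classification trees), predict() returns a matrix of class probabilities instead of labels.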

In this blog post, we will show you how to plot decision trees in R using the rpart and rpart.plot packages. We will also walk through an example using the iris data set and explain each code block in simple terms.


Example


Load the libraries

library(rpart)
library(rpart.plot)

Split the data into training and test sets

set.seed(123)
train_index <- sample(1:nrow(iris), size = 0.7 * nrow(iris))
train <- iris[train_index, ]
test <- iris[-train_index, ]

Train a decision tree model

tree <- rpart(Species ~ ., data = train, method = "class")

Plot the decision tree

rpart.plot(tree, main = "Decision Tree for the Iris Dataset")


Output

The output of the rpart.plot() function is a tree diagram that shows the decision rules of the model. The root node of the tree is at the top, and the leaf nodes are at the bottom. Each internal node is labeled with the feature and threshold used to split the data at that point, and each leaf node is labeled with the predicted class for the observations that reach it. For classification trees, rpart.plot also annotates each node with fitted class probabilities and the percentage of observations that fall into it.
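The display can be customized through rpart.plot's type and extra arguments, which control where split labels are drawn and what each node shows. A small sketch (the argument values here are just one reasonable choice, not the only one):

```r
library(rpart)
library(rpart.plot)

# Fit a classification tree on iris
fit <- rpart(Species ~ ., data = iris, method = "class")

# type = 2 draws the split labels below the node boxes;
# extra = 104 shows class probabilities plus the percentage
# of observations in each node (see ?rpart.plot for all codes)
rpart.plot(fit, type = 2, extra = 104,
           main = "Customized Decision Tree for the Iris Dataset")
```

Experimenting with these arguments is often the quickest way to find a layout that reads well for your audience.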


Interpreting the decision tree

To interpret the decision tree, start at the root node and follow the branches down, at each node taking the branch whose condition your observation satisfies, until you reach a leaf. The label of that leaf is the predicted class for the observation.

For example, suppose you have a new iris flower with a petal length of 2.5 cm and a petal width of 1.0 cm. Start at the root node, where the data is split on petal length: since 2.5 cm is greater than the 2.45 cm threshold, follow the right branch down to the next node. In a typical iris tree, that node splits on petal width: since 1.0 cm is less than the 1.75 cm threshold, follow the left branch down to a leaf labeled “versicolor”, so the predicted class for the new flower is versicolor. The exact features and thresholds depend on your training split, so read them off your own plot.
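This walk-through can be reproduced programmatically, since predict() follows exactly these branches. A sketch using a hypothetical new flower (the sepal measurements are made-up filler values, included only because rpart expects every predictor column to be present):

```r
library(rpart)

# Recreate the train/test split and model from above
set.seed(123)
train_index <- sample(1:nrow(iris), size = 0.7 * nrow(iris))
train <- iris[train_index, ]
tree <- rpart(Species ~ ., data = train, method = "class")

# A hypothetical new flower; column names must match the predictors
new_flower <- data.frame(Sepal.Length = 5.5, Sepal.Width = 2.6,
                         Petal.Length = 2.5, Petal.Width = 1.0)

# Follow the tree's branches for this observation
predict(tree, newdata = new_flower, type = "class")
```

For these petal measurements the tree should land in the versicolor leaf, matching the manual walk-through above.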


Trying it on your own

Now that you have learned how to plot decision trees in R, try it out on your own. You can use the iris data set or your own data set.

To get started, load the rpart and rpart.plot libraries and load your data set. Then, split the data into training and test sets. Train a decision tree model using the rpart() function. Finally, plot the decision tree using the rpart.plot() function.

Once you have plotted the decision tree, take some time to interpret it. Try to understand how the model makes predictions and to identify any potential problems. You can also try to improve the model by adding or removing features or by changing the hyperparameters of the rpart() function.
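One common way to tune an rpart model is to deliberately grow a large tree and then prune it back using the complexity parameter (cp) table that rpart records during fitting. A sketch of that workflow, reusing the train/test split from the example above:

```r
library(rpart)

# Recreate the train/test split from above
set.seed(123)
train_index <- sample(1:nrow(iris), size = 0.7 * nrow(iris))
train <- iris[train_index, ]
test  <- iris[-train_index, ]

# Grow a deliberately large tree by loosening the stopping rules
big_tree <- rpart(Species ~ ., data = train, method = "class",
                  control = rpart.control(cp = 0.001, minsplit = 5))

# Pick the cp value with the lowest cross-validated error,
# then prune the tree back to that complexity
best_cp <- big_tree$cptable[which.min(big_tree$cptable[, "xerror"]), "CP"]
pruned  <- prune(big_tree, cp = best_cp)

# Evaluate the pruned tree on the held-out test set
preds <- predict(pruned, newdata = test, type = "class")
mean(preds == test$Species)
```

Plotting the pruned tree with rpart.plot() alongside the original is a quick way to see how much complexity the pruning removed.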


Conclusion

Plotting decision trees is a great way to better understand how they work and to identify any potential problems. It is also a helpful way to communicate the results of a decision tree model to others.

In this blog post, we showed you how to plot decision trees in R using the rpart and rpart.plot packages. We also walked through an example using the iris data set and explained each code block in simple terms.

We encourage you to try plotting decision trees on your own data sets. It is a great way to learn more about decision trees and to improve your machine learning skills.
