R Code Example for Neural Networks


See also NEURAL NETWORKS.

In this past June’s issue of The R Journal, the ‘neuralnet’ package was introduced. I had previously been using neural networks via the ‘nnet’ package (see my post on Data Mining in A Nutshell), but I find the neuralnet package more useful because it allows you to actually plot the network nodes and connections (it may be possible to do this with nnet, but I’m not aware of how). The neuralnet package was written primarily for multilayer perceptron architectures, which may be a limitation if you are interested in other architectures.
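
If you want to follow along, the package is available on CRAN and can be installed and loaded in the usual way (skip the install step if you already have it):

install.packages("neuralnet")
library(neuralnet)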

The data set used is ‘infert’, a default data set found in the package ‘datasets’, consisting of 248 observations and 8 variables:

"education"      "age"            "parity"         "induced"        "case"           "spontaneous"    "stratum"        "pooled.stratum"
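
You can verify the dimensions and variable names yourself with a couple of quick commands (a minimal check; ‘infert’ ships with base R’s ‘datasets’ package):

library(datasets)
dim(infert)    # 248 observations, 8 variables
names(infert)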

The following code trains the network (with a single hidden layer containing 2 nodes), classifying ‘case’ (a binary variable) as a function of several independent variables. The neural network is estimated, and the results are stored in the object ‘nn’.

nn <- neuralnet(
 case~age+parity+induced+spontaneous,
 data=infert, hidden=2, err.fct="ce",
 linear.output=FALSE)

The weight estimates can be obtained with the following command:

nn$result.matrix

And, the network can be plotted or visualized with the simple command:

plot(nn)

As can be seen below, the output weights correspond directly with the visualized network. While this tool may be more difficult to interpret with more complex network architectures, I find this simplified version very useful for demonstration/instructional purposes.

Example: The weight for the path from the input ‘age’ to the first hidden node is -3.0689 (age.to.1layhid1), which can easily be found in the network diagram. After all inputs feed into the hidden nodes, the weight associated with the path from the first hidden node to the output node ‘case’ (1layhid.1.to.case), which, along with the rest of the network, gives us the classification of ‘case’, is -1001.15.
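
If you would rather pull a specific weight out of the result matrix than read it off the plot, you can index it by row name. This is just a sketch assuming your run produces the same row labels shown in the table below; the actual values will differ from run to run because the starting weights are random:

nn$result.matrix["age.to.1layhid1", 1]      # weight from input 'age' to the first hidden node
nn$result.matrix["1layhid.1.to.case", 1]    # weight from the first hidden node to the output 'case'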


Estimated Network Weights

error                       123.811
reached.threshold           0.009853
steps                       16822
Intercept.to.1layhid1       0.535237
age.to.1layhid1             -3.0689
parity.to.1layhid1          2.262792
induced.to.1layhid1         30.50824
spontaneous.to.1layhid1     0.000961
Intercept.to.1layhid2       -5.52491
age.to.1layhid2             0.11538
parity.to.1layhid2          -1.98372
induced.to.1layhid2         2.550932
spontaneous.to.1layhid2     3.829613
Intercept.to.case           -1.7714
1layhid.1.to.case           -1001.15
1layhid.2.to.case           4.570324
Neural Network Visualization (the plot produced by plot(nn))
There are many other options available with the neuralnet package; I encourage you to read the article referenced below for more details.
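
For instance, the hidden, rep, and lifesign arguments let you specify more elaborate architectures, train the network several times from different random starting weights, and print progress while it runs. The call below is only a sketch of how those arguments fit together; see ?neuralnet for the full argument list:

nn2 <- neuralnet(
 case~age+parity+induced+spontaneous,
 data=infert,
 hidden=c(4,2),        # two hidden layers, with 4 and 2 nodes
 rep=3,                # train three times from different random starts
 lifesign="minimal",   # print brief progress messages during training
 err.fct="ce", linear.output=FALSE)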

Reference: Frauke Günther and Stefan Fritsch, “neuralnet: Training of Neural Networks,” The R Journal, Vol. 2/1, June 2010.

R-Code:

#  ------------------------------------------------------------------
#  |PROGRAM NAME: NEURALNET_PKG_R
#  |DATE: 12/3/10
#  |CREATED BY: MATT BOGARD 
#  |PROJECT FILE:   P:\R  Code References\Data Mining_R           
#  |----------------------------------------------------------------
#  | PURPOSE: DEMO OF THE 'neuralnet' PACKAGE AND OUTPUT INTERPRETATION              
#  | 
#  |  ADAPTED FROM:  neuralnet: Training of Neural Networks
#  |    by Frauke Günther and Stefan Fritsch The R Journal Vol. 2/1, June 2010 
#  |    ISSN 2073-4859  (LOCATED: P:\TOOLS AND REFERENCES (Copy)\R References\Neural Networks
#  | 
#  | 
#  |------------------------------------------------------------------
#  |DATA USED: 'infert' FROM THE 'datasets' LIBRARY              
#  |------------------------------------------------------------------
#  |CONTENTS:               
#  |
#  |  PART 1: get the data 
#  |  PART 2: train the network
#  |  PART 3: output training results
#  |  PART 4: generalized weights
#  |  PART 5: visualization
#  |------------------------------------------------------------------
#  |COMMENTS:               
#  |
#  |-----------------------------------------------------------------
#  |UPDATES:               
#  |
#  |
#  |------------------------------------------------------------------
 
#  *------------------------------------------------------------------*
#  | get the data
#  *------------------------------------------------------------------* 
 
library(datasets)
 
names(infert)
 
#  *------------------------------------------------------------------*
#  | train the network
#  *------------------------------------------------------------------* 
 
library(neuralnet)
 
nn <- neuralnet(
 case~age+parity+induced+spontaneous,
 data=infert, hidden=2, err.fct="ce",
 linear.output=FALSE)
 
#  *------------------------------------------------------------------*
#  | output training results
#  *------------------------------------------------------------------*  
 
# basic
nn
 
# results options
names(nn)
 
 
# result matrix
 
nn$result.matrix
 
# The given data is saved in nn$covariate and
# nn$response, as well as in nn$data for the whole data
# set including unused variables. The output of the
# neural network, i.e. the fitted values o(x), is provided
# by nn$net.result:
 
out <- cbind(nn$covariate,nn$net.result[[1]])
 
dimnames(out) <- list(NULL, c("age", "parity","induced","spontaneous","nn-output"))
 
head(out)
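 
# quick check of the fitted classifications -- this assumes a 0.5 cutoff
# on the fitted probabilities (a convention, not something neuralnet imposes)
 
pred <- ifelse(nn$net.result[[1]] > 0.5, 1, 0)
 
table(pred, infert$case)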
 
# generalized weights
 
# The generalized weight expresses the effect of each
# covariate xi and thus has an analogous interpretation
# to the ith regression parameter in regression models.
# However, the generalized weight depends on all
# other covariates. Its distribution indicates whether
# the effect of the covariate is linear, since a small
# variance suggests a linear effect.
 
# The columns refer to the four covariates age (j =
# 1), parity (j = 2), induced (j = 3), and spontaneous (j=4)
 
head(nn$generalized.weights[[1]])
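 
# the generalized weights can also be plotted against each covariate with
# gwplot() -- shown here for all four covariates (first repetition by default)
 
par(mfrow=c(2,2))
gwplot(nn, selected.covariate="age")
gwplot(nn, selected.covariate="parity")
gwplot(nn, selected.covariate="induced")
gwplot(nn, selected.covariate="spontaneous")
par(mfrow=c(1,1))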
 
# visualization
 
plot(nn)
