
Machine Learning Ex2 – Linear Regression

[This article was first published on YGC » R, and kindly contributed to R-bloggers.]

Thanks to this post, I found OpenClassroom, and thanks to Andrew Ng and his lectures, I have taken my first course in machine learning. The videos are quite easy to follow. Exercise 2 requires implementing the gradient descent algorithm to fit a linear regression model to the data.
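For reference (my own summary, not part of the original exercise handout): the code below implements batch gradient descent. With hypothesis $h_\theta(x) = \theta^T x$ and $m$ training examples, each iteration updates all parameters simultaneously by

$$\theta_j := \theta_j - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}$$

where $\alpha$ is the learning rate (0.07 in this exercise).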

gradDescent <- function(x, y, alpha=0.07, niter=1500, eps=1e-9) {
	# add the intercept column of ones and start theta at zero
	x <- cbind(rep(1, length(x)), x)
	theta.old <- rep(0, ncol(x))
	m <- length(y)
	for (i in 1:niter) {
		theta <- gradDescent_internal(theta.old, x, y, m, alpha)
		# stop early once every component changes by less than eps
		if (all(abs(theta - theta.old) <= eps)) {
			break
		} else {
			theta.old <- theta
		}
	}
	return(theta)
}
 
gradDescent_internal <- function(theta, x, y, m, alpha) {
	# predictions h = x %*% theta for all m examples at once
	h <- as.vector(x %*% theta)
	# gradient of the squared-error cost: (1/m) * (h - y)' x
	grad <- (1/m) * ((h - y) %*% x)
	theta <- theta - alpha * as.vector(grad)
	return(theta)
}
 
require(ggplot2)
# each .dat file holds a single column: ages (x) and heights (y)
x <- read.table("ex2x.dat", header=F)[,1]
y <- read.table("ex2y.dat", header=F)[,1]
p <- ggplot(data.frame(x, y), aes(x, y)) + geom_point() +
	xlab("Age in years") + ylab("Height in meters")
 
theta <- gradDescent(x, y)
 
# predicted heights: intercept plus slope * age
yy <- as.vector(theta[1] + theta[-1] %*% t(x))
predicted <- data.frame(x=x, y=yy)
p + geom_line(data=predicted, aes(x=x, y=y))
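As a quick sanity check (my addition, not part of the original exercise), the coefficients from gradient descent should land very close to the closed-form least-squares fit that R's built-in lm() computes:

# compare gradient descent against the analytic least-squares solution
fit <- lm(y ~ x)
print(theta)      # intercept and slope from gradient descent
print(coef(fit))  # should agree to several decimal places

# predict height for a new age, e.g. 3.5 years (a hypothetical query point)
theta[1] + theta[2] * 3.5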
