Machine Learning Ex2 – Linear Regression

March 22, 2011

(This article was first published on YGC » R, and kindly contributed to R-bloggers)

Thanks to this post, I found OpenClassroom, and thanks to Andrew Ng and his lectures, I took my first course in machine learning. The videos are quite easy to follow. Exercise 2 requires implementing the gradient descent algorithm to model the data with linear regression.
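For a single feature, the model and the batch gradient descent update that the code below implements are the standard ones:

$$h_\theta(x) = \theta_0 + \theta_1 x, \qquad \theta_j := \theta_j - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}$$

where α is the learning rate and m is the number of training examples; the update repeats until the change in θ falls below a small tolerance.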

# Batch gradient descent for linear regression.
# x: predictor vector; y: response vector
# alpha: learning rate; niter: maximum iterations; eps: convergence tolerance
gradDescent <- function(x, y, alpha=0.07, niter=1500, eps=1e-9) {
	x <- cbind(rep(1, length(x)), x)  # prepend a column of 1s for the intercept
	theta.old <- rep(0, ncol(x))      # start from theta = 0
	m <- length(y)
	for (i in 1:niter) {
		theta <- gradDescent_internal(theta.old, x, y, m, alpha)
		# stop once no parameter moves by more than eps
		if (all(abs(theta - theta.old) <= eps)) {
			break
		} else {
			theta.old <- theta
		}
	}
	return(theta)
}
 
# One update step: theta <- theta - alpha * (1/m) * (h - y) %*% X
gradDescent_internal <- function(theta, x, y, m, alpha) {
	h <- as.vector(x %*% theta)            # predictions for all m examples
	grad <- as.vector((h - y) %*% x) / m   # gradient of the squared-error cost
	theta <- theta - alpha * grad
	return(theta)
}
 
library(ggplot2)
 
# Load the exercise data: ages in years (x) and heights in meters (y)
x <- read.table("ex2x.dat", header=FALSE)[, 1]
y <- read.table("ex2y.dat", header=FALSE)[, 1]
p <- ggplot() + aes(x, y) + geom_point() + xlab("Age in years") + ylab("Height in meters")
 
# Fit by gradient descent
theta <- gradDescent(x, y)
 
# Fitted line: intercept plus slope times each x
yy <- as.vector(theta[1] + theta[-1] %*% t(x))
predicted <- data.frame(x=x, y=yy)
p + geom_line(data=predicted, aes(x=x, y=y))
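As a quick sanity check (my addition, not part of the exercise), the coefficients from gradient descent can be compared with R's built-in closed-form least-squares fit; the two should agree up to the convergence tolerance:

# Compare gradient descent with the closed-form least-squares solution
fit <- lm(y ~ x)
coef(fit)  # should be close to theta[1] (intercept) and theta[2] (slope)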
