Simple and heuristic optimization

June 29, 2012


This week, at the Rmetrics conference, there was an interesting discussion about heuristic optimization. The starting point was simple: in complex optimization problems (here we mean problems with a lot of local maxima, for instance), we do not necessarily need extremely advanced algorithms that converge extremely fast if we cannot ensure that they reach the optimum. Converging extremely fast, with great numerical precision, to some point (that is not the point we're looking for) is useless. Some algorithms might be much slower, but at least much more likely to converge to the optimum, wherever we start from. We experienced that with Mathieu, while looking for the maximum likelihood of our MINAR process: genetic algorithms performed extremely well. The idea is extremely simple, and natural. Consider, as a starting point, the following algorithm:

  1. Start from some initial point $x_0$
  2. At step $n$, draw a point $\tilde{x}$ in a neighborhood of $x_n$,
  • either $f(\tilde{x}) > f(x_n)$, then set $x_{n+1} = \tilde{x}$
  • or $f(\tilde{x}) \le f(x_n)$, then set $x_{n+1} = x_n$
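As a minimal sketch of that rule in R (the function name hillclimb and the Gaussian neighborhood are my own choices, not from the original post):

> hillclimb=function(f,x,n=500,sd=1){
+  for(s in 1:n){
+   xnew=x+rnorm(length(x),sd=sd)  # candidate in a neighborhood of x
+   if(f(xnew)>f(x)){x=xnew}       # move only if it strictly improves f
+  }
+  x}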

This is simple (as long as you do not enter into the details of what such a neighborhood should be). But with that kind of algorithm, you might get trapped in some local optimum if the neighborhood is not large enough. An alternative is to relax the rule a bit: instead of moving only when the candidate is an improvement, we move when it is almost an improvement. Namely, at step $n$,

  • either $f(\tilde{x}) + \varepsilon > f(x_n)$, then set $x_{n+1} = \tilde{x}$
  • or $f(\tilde{x}) + \varepsilon \le f(x_n)$, then set $x_{n+1} = x_n$
for some $\varepsilon > 0$. To illustrate the idea, consider the following function

> f=function(x,y) { r <- sqrt(x^2+y^2);
+ 1.1^(x+y)*10 * sin(r)/r }
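To get a sense of the surface, one can plot it on the bounded support used below, [-15,15]×[-15,15]; this visualization is only a suggestion, and the grid avoids (0,0), where sin(r)/r evaluates to 0/0:

> x=y=seq(-15,15,length=100)  # even-length grid, so (0,0) is skipped
> z=outer(x,y,f)
> persp(x,y,z,theta=30,phi=30)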
(on some bounded support). Picking the noise and the value of $\varepsilon$ arbitrarily ($\varepsilon = 0.5$ and Gaussian noise scaled by 3 below), we obtain scenarios such as the following:
> x0=15
> MX=matrix(NA,501,2)      # trajectory of visited points
> MX[1,]=runif(2,-x0,x0)   # random starting point
> k=.5                     # the epsilon of the acceptance rule
> for(s in 2:501){
+  bruit=rnorm(2)          # "bruit" is French for noise
+  X=MX[s-1,]+bruit*3      # candidate drawn in a neighborhood
+  if(X[1]>x0){X[1]=x0}    # clamp the candidate to the support
+  if(X[1]<(-x0)){X[1]=-x0}
+  if(X[2]>x0){X[2]=x0}
+  if(X[2]<(-x0)){X[2]=-x0}
+  if(f(X[1],X[2])+k>f(MX[s-1,1],
+    MX[s-1,2])){MX[s,]=X}        # almost an improvement: move
+  if(f(X[1],X[2])+k<=f(MX[s-1,1],
+    MX[s-1,2])){MX[s,]=MX[s-1,]} # otherwise: stay
+ }
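One way to look at a single run (a suggestion, not code from the original post) is to draw the trajectory MX on top of the contour levels of f:

> x=y=seq(-15,15,length=100)
> contour(x,y,outer(x,y,f))
> lines(MX[,1],MX[,2],col="red")      # path of the heuristic
> points(MX[501,1],MX[501,2],pch=19)  # final point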


It does not always converge towards the optimum, and sometimes we just miss it after being extremely unlucky.
Note that if we run 10,000 scenarios (with different random noise and starting points), in 50% of the scenarios we reach the maximum, or at least end up next to it, near the top.
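That percentage can be checked by wrapping the loop in a function and replicating it. Below is a sketch (with fewer runs than the 10,000 above); onerun is a helper I introduce, and the tolerance zmax-1 for being "next to" the maximum is arbitrary, zmax itself being only a grid approximation of the true maximum:

> onerun=function(n=501,k=.5){
+  x=runif(2,-x0,x0)
+  for(s in 2:n){
+   X=pmin(pmax(x+rnorm(2)*3,-x0),x0)    # clamped candidate
+   if(f(X[1],X[2])+k>f(x[1],x[2])){x=X}
+  }
+  f(x[1],x[2])}
> zmax=max(outer(seq(-15,15,length=100),
+  seq(-15,15,length=100),f))
> mean(replicate(1000,onerun())>zmax-1)  # fraction ending near the top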
What if we compare this with a standard optimization routine, like Nelder-Mead or a quasi-Newton method? Since we look for the maximum on a restricted domain, we can use the following function,

> g=function(x) f(x[1],x[2])
> X0=runif(2,-x0,x0)        # random starting point
> optim(X0,g,method="L-BFGS-B",
+ lower=-c(x0,x0),upper=c(x0,x0),
+ control=list(fnscale=-1))$par  # fnscale=-1: maximize g
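The same replication exercise for the gradient-based routine, as a sketch (optrun is my own wrapper, and zmax is the grid approximation introduced above):

> optrun=function(){
+  X0=runif(2,-x0,x0)
+  optim(X0,g,method="L-BFGS-B",
+   lower=-c(x0,x0),upper=c(x0,x0),
+   control=list(fnscale=-1))$value}
> mean(replicate(1000,optrun())>zmax-1)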

In that case, if we run the algorithm from 10,000 random starting points, this is where we end up, below on the right (with the heuristic technique on the left),

In only 15% of the scenarios have we been able to reach the region where the maximum is. So here, it looks like a heuristic method works extremely well, as long as we do not need to reach the maximum with great precision, which is usually the case in practice.
