A heuristic enhancement of an optimisation algorithm


Many of the world’s problems deal, directly or indirectly, with some kind of optimisation, and instances of optimising resources or a utility function can be seen in our daily life. Here I am talking about a standard optimisation problem in statistics: maximum likelihood estimation. This blog post is a small episode of my recent work, and I used R for the optimisation exercise.

There are several functions available for optimisation in R. The most commonly used are optim() and optimize() in the stats package, and a dedicated function mle() is available in the stats4 package; this is very useful for most mean-part modelling (e.g. glm). What we need for this optimisation is to prepare functions for the (log-)likelihood and the gradient (or Hessian). We can also specify the algorithm (method) as any of the following: “Nelder-Mead”, “BFGS”, “CG”, “L-BFGS-B”, “SANN”, and “Brent”. The optimisation output gives the necessary information for the model estimation.
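
As a minimal sketch of this workflow (my own illustration, not code from the original post), here is how one might obtain maximum likelihood estimates for a normal sample with optim(); the simulated data, starting values and the log-sigma parameterisation are illustrative assumptions.

# Maximum likelihood for a normal sample with optim().
set.seed(1)
x <- rnorm(500, mean = 2, sd = 1.5)

# Negative log-likelihood; optim() minimises, so the log-likelihood is negated.
# The scale parameter is optimised on the log scale so it stays positive.
negloglik <- function(par, x) {
  mu    <- par[1]
  sigma <- exp(par[2])
  -sum(dnorm(x, mean = mu, sd = sigma, log = TRUE))
}

fit <- optim(par = c(mu = 0, logsigma = 0), fn = negloglik, x = x,
             method = "BFGS", hessian = TRUE)

fit$par                    # estimates of mu and log(sigma)
exp(fit$par["logsigma"])   # back-transform to sigma
# solve(fit$hessian) gives an approximate covariance matrix of the estimates.

The mle() function in stats4 performs a similar minimisation via optim() internally and returns an object with coef(), vcov() and summary() methods.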

The case I was working on is an extension of a bivariate GARCH model, which includes more than 20 parameters. Let me focus on a pure GARCH model, GARCH(1,1), as a prototype to explain the algorithm. Here the model equation is in the variance part, and the parameters are constrained to ensure a positive variance. The model equation is given below.
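
The equation itself does not survive in this copy of the post; as a hedged reconstruction, the standard Gaussian GARCH(1,1) specification described above is

r_t = sqrt(h_t) * z_t,   h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1},   with omega > 0, alpha >= 0, beta >= 0,

where z_t is standard normal noise. The sketch below (again my own illustration, with simulated data and starting values of my choosing, not the post’s code) shows the corresponding negative log-likelihood and a constrained fit with optim(), using “L-BFGS-B” box bounds to keep the conditional variance positive.

# Simulate a GARCH(1,1) series to have something to fit.
set.seed(1)
n <- 1000
omega0 <- 0.1; alpha0 <- 0.05; beta0 <- 0.90
h <- r <- numeric(n)
h[1] <- omega0 / (1 - alpha0 - beta0)    # unconditional variance
r[1] <- sqrt(h[1]) * rnorm(1)
for (t in 2:n) {
  h[t] <- omega0 + alpha0 * r[t - 1]^2 + beta0 * h[t - 1]
  r[t] <- sqrt(h[t]) * rnorm(1)
}

# Gaussian GARCH(1,1) negative log-likelihood: rebuild the conditional
# variance recursion and negate the sum of normal log-densities.
garch11_negloglik <- function(par, r) {
  omega <- par[1]; alpha <- par[2]; beta <- par[3]
  n <- length(r)
  h <- numeric(n)
  h[1] <- var(r)                         # simple initialisation
  for (t in 2:n) {
    h[t] <- omega + alpha * r[t - 1]^2 + beta * h[t - 1]
  }
  nll <- -sum(dnorm(r, mean = 0, sd = sqrt(h), log = TRUE))
  if (!is.finite(nll)) nll <- 1e10       # guard against explosive parameter values
  nll
}

# Box constraints enforce the positivity restrictions on the parameters.
fit <- optim(par = c(omega = 0.05, alpha = 0.10, beta = 0.80),
             fn = garch11_negloglik, r = r,
             method = "L-BFGS-B", lower = c(1e-6, 0, 0), hessian = TRUE)
fit$par   # estimated omega, alpha, beta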

