Posts Tagged ‘MCMC algorithms’

Course at Monash (#2)

July 19, 2012

Here are the slides for the second day of my course at Monash University, Melbourne, in the Special Lectures in Econometrics, with a strong similarity with the slides of my course in Roma this Spring. (Ah, sunny Roma…) The first day lecture was very well attended and I hope this remains true for the

Course at Monash (#1)

July 18, 2012

Here are the slides for the first day of my course at Monash University, Melbourne, in the Special Lectures in Econometrics, with a strong similarity with the slides of my course in Wharton, two years ago. (Be sure to check slide 67! If the update on slideshare works from my flat in Melbourne…)

Andrew gone NUTS!

November 23, 2011

Matthew Hoffman and Andrew Gelman have posted a paper on arXiv, entitled “The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo”, developing an improvement on the Hamiltonian Monte Carlo algorithm called NUTS (!). Here is the abstract: Hamiltonian Monte Carlo (HMC) is a Markov chain Monte Carlo (MCMC) algorithm that avoids the
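To see what NUTS improves upon: HMC augments the target with a Gaussian momentum and simulates the resulting Hamiltonian dynamics with leapfrog steps, then applies a Metropolis correction. Here is a minimal sketch for a one-dimensional standard normal target; the step size `eps` and trajectory length `n_steps` are illustrative fixed choices (precisely the tuning parameters that NUTS sets adaptively), not values from the paper:

```python
import math
import random

def leapfrog(q, p, grad_logp, eps, n_steps):
    """Simulate Hamiltonian dynamics with n_steps leapfrog updates."""
    p = p + 0.5 * eps * grad_logp(q)          # half step on momentum
    for _ in range(n_steps - 1):
        q = q + eps * p                        # full step on position
        p = p + eps * grad_logp(q)             # full step on momentum
    q = q + eps * p
    p = p + 0.5 * eps * grad_logp(q)           # final half step
    return q, p

def hmc_sample(logp, grad_logp, q0, n_iter, eps=0.1, n_steps=20, seed=0):
    rng = random.Random(seed)
    q, samples = q0, []
    for _ in range(n_iter):
        p = rng.gauss(0.0, 1.0)                # fresh momentum each iteration
        q_new, p_new = leapfrog(q, p, grad_logp, eps, n_steps)
        # Metropolis correction on the joint Hamiltonian H(q,p) = -log p(q) + p^2/2
        h_old = -logp(q) + 0.5 * p * p
        h_new = -logp(q_new) + 0.5 * p_new * p_new
        if math.log(rng.random()) < h_old - h_new:
            q = q_new
        samples.append(q)
    return samples

# Standard normal target: log p(q) = -q^2/2 up to a constant
samples = hmc_sample(lambda q: -0.5 * q * q, lambda q: -q, q0=0.0, n_iter=2000)
```

A badly chosen `eps` or `n_steps` degrades this sampler markedly, which is the practical motivation for the no-U-turn criterion.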

Bayesian modeling using WinBUGS

November 6, 2011

Yes, yet another Bayesian textbook: Ioannis Ntzoufras’ Bayesian modeling using WinBUGS was published in 2009 and it got an honourable mention at the 2009 PROSE Award. (Nice acronym for a book award! All the mathematics books awarded that year were actually statistics books.) Bayesian modeling using WinBUGS is rather similar to the more recent Bayesian

the Wang-Landau algorithm reaches the flat histogram in finite time

October 19, 2011

Pierre Jacob and Robin Ryder (from Paris-Dauphine, CREST, and Statisfaction) have just arXived (and submitted to the Annals of Applied Probability) a neat result on the Wang-Landau algorithm. (This algorithm, which modifies the target in a sort of reweighted partitioned sampling to achieve faster convergence, has always been perplexing to me.) They show that some
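In outline, Wang-Landau penalises the regions the chain has already visited until the occupation histogram over a partition is flat, then shrinks the penalty increment and repeats. A toy discrete-state sketch follows, with states playing the role of partition bins; the four-state target, the flatness tolerance, and the halving schedule are illustrative assumptions of mine, not taken from the paper:

```python
import math
import random

def wang_landau(log_pi, n_states, flat_tol=0.25, log_f_min=1e-4, seed=0):
    """Toy Wang-Landau on states 0..n_states-1: bias the target by
    exp(-log_g[x]) until every state is visited about equally often
    (the flat histogram), then halve the increment log_f."""
    rng = random.Random(seed)
    log_g = [0.0] * n_states   # running log-penalties; converge to log_pi + const
    hist = [0] * n_states      # occupation counts since the last flat stage
    log_f = 1.0                # modification factor, shrunk at each flat stage
    x = 0
    while log_f > log_f_min:
        y = rng.randrange(n_states)            # symmetric uniform proposal
        log_ratio = (log_pi(y) - log_g[y]) - (log_pi(x) - log_g[x])
        if math.log(rng.random()) < log_ratio:
            x = y
        log_g[x] += log_f                      # penalise the visited state
        hist[x] += 1
        mean = sum(hist) / n_states
        if min(hist) > 0 and all(abs(h - mean) <= flat_tol * mean for h in hist):
            log_f /= 2.0                       # histogram flat: refine
            hist = [0] * n_states
    return log_g

# Unnormalised target with weights 1, 2, 3, 4 on states 0..3
log_g = wang_landau(lambda s: math.log(s + 1.0), n_states=4)
```

Since the biased chain is stationary for π(x)exp(-log_g[x]), a flat histogram forces log_g[x] ≈ log π(x) + const, so the differences in `log_g` recover the log target weights. The Jacob-Ryder result concerns how long reaching such a flat histogram takes.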

Parallel computation [permutations]

February 19, 2011

François Perron is visiting me for two months from Montréal and, following a discussion about the parallel implementation of MCMC algorithms—to which he also contributed with Yves Atchadé in 2005—he remarked that a deterministic choice of permutations with the maximal contrast should do better than random or even half-random permutations. Assuming p processors or

Model weights for model choice

February 9, 2011

An ‘Og reader, Emmanuel Charpentier, sent me the following email about model choice: I read with great interest your critique of Peter Congdon’s 2006 paper (CSDA, 50(2):346-357) proposing a method of estimation of posterior model probabilities based on improper distributions for parameters not present in the model under examination, as well as a more general

Questions on the parallel Rao-Blackwellisation

December 21, 2010

Pierre Jacob and I got this email from a student about our parallel Rao-Blackwellisation paper. Here are some parts of the questions and our answer: Although I understand how the strategy proposed in the paper helps in variance reduction, I do not understand why you set b=1 (mentioned in Section 3.2) and why it plays
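The variance-reduction device at stake can be illustrated on a plain random-walk Metropolis chain: instead of recording h at the realised next state, record its conditional expectation given the current state and the proposal, integrating out the accept/reject coin. This sketch shows only that basic Rao-Blackwellisation step, not the parallel block scheme (or the b parameter) of our paper; the normal target, step size, and test function are illustrative choices:

```python
import math
import random

def mh_with_rb(logpi, h, x0, n_iter, step=1.0, seed=1):
    """Random-walk Metropolis returning the plain estimator terms h(X_t)
    and Rao-Blackwellised terms (1-a)h(x) + a h(y), where a is the
    acceptance probability of proposal y from x."""
    rng = random.Random(seed)
    x = x0
    plain, rb = [], []
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, step)               # symmetric Gaussian proposal
        a = min(1.0, math.exp(logpi(y) - logpi(x)))
        rb.append((1 - a) * h(x) + a * h(y))       # acceptance coin integrated out
        if rng.random() < a:
            x = y
        plain.append(h(x))                         # usual ergodic-average term
    return plain, rb

# Standard normal target, h(x) = x, so both estimators target E[X] = 0
plain, rb = mh_with_rb(lambda x: -0.5 * x * x, lambda x: x, x0=0.0, n_iter=4000)
```

Both running averages are consistent for the same expectation; the Rao-Blackwellised one typically has lower variance because the extra randomness of the acceptance step has been averaged out analytically.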

Bayesian model selection

December 7, 2010

Last week, I received a box of books from the International Statistical Review to review. I thus grabbed the one whose title was most appealing to me, namely Bayesian Model Selection and Statistical Modeling by Tomohiro Ando. I am indeed interested in both the nature of testing hypotheses, or more accurately of assessing models,