# Vanilla Rao-Blackwellisation [re]revised

May 31, 2010
(This article was first published on Xi'an's Og » R, and kindly contributed to R-bloggers)

Although the revision is quite minor, it took us two months to complete from the time I received the news in the Atlanta airport lounge… The vanilla Rao-Blackwellisation paper with Randal Douc has thus been resubmitted to the Annals of Statistics. And rearXived. The only significant change is the inclusion of two tables detailing computing time, like the one below

$\left| \begin{matrix} \tau &\text{median} &\text{mean} &q_{.8} &q_{.9} &\text{time}\\ 0.25 &0.0 &8.85 &4.9 &13 &4.2\\ 0.50 &0.0 &6.76 &4 &11 &2.25\\ 1.00 &0.25 &6.15 &4 &10 &2.5\\ 2.00 &0.20 &5.90 &3.5 &8.5 &4.5 \end{matrix} \right|$

which provides several evaluations of the additional computing effort due to the use of the Rao–Blackwellisation: median and mean numbers of additional iterations, $80\%$ and $90\%$ quantiles of the additional iterations, and the ratio of the average R computing times, obtained over $10^5$ simulations. (Turning the above table into a formula acceptable by WordPress took me forever, as any additional white space between the terms of the matrix is misinterpreted!) Now, the mean-time column does not look very supportive of the Rao-Blackwellisation technique, but this is due to a few outlying runs that required many iterations before hitting an acceptance probability of one. Excessive computing time can be curbed by using a pre-set number of iterations, as described in the paper…
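To make the source of those extra iterations concrete, here is a minimal sketch (in Python rather than the R used for the paper's experiments; the standard-normal target, the random-walk proposal, its scale, and the cap `max_iter` are all illustrative assumptions, not the paper's setup). Each accepted value receives a weight $1+\sum_{j\ge1}\prod_{k\le j}(1-\alpha_k)$ built from extra simulated proposals; the accumulation stops as soon as some proposal has acceptance probability one, since all later products vanish, and otherwise it is truncated at a pre-set number of iterations:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_pi(x):
    # Toy target: standard normal density, up to a constant (an assumption).
    return -0.5 * x * x

def rb_weight(z, scale=1.0, max_iter=100):
    """Rao-Blackwellised weight of an accepted value z: accumulate
    products of (1 - alpha) over extra proposals, stopping when alpha
    hits one (product becomes zero) or at the pre-set cap max_iter."""
    w, prod, it = 1.0, 1.0, 0
    while prod > 0.0 and it < max_iter:
        y = z + scale * rng.standard_normal()
        alpha = min(1.0, np.exp(log_pi(y) - log_pi(z)))
        prod *= (1.0 - alpha)
        w += prod
        it += 1
    return w, it  # it = number of additional iterations, as in the table

def rb_mh(h, n_accept=500, scale=1.0):
    """Weighted estimate of E[h(X)]: each accepted state contributes
    with its Rao-Blackwellised weight instead of its raw repeat count."""
    z, num, den = 0.0, 0.0, 0.0
    for _ in range(n_accept):
        w, _ = rb_weight(z, scale)
        num += w * h(z)
        den += w
        # Simulate the chain forward to the next accepted value.
        while True:
            y = z + scale * rng.standard_normal()
            if rng.random() < min(1.0, np.exp(log_pi(y) - log_pi(z))):
                z = y
                break
    return num / den

est = rb_mh(lambda x: x)  # rough estimate of E[X] under the toy target
```

The cap `max_iter` plays the role of the pre-set number of iterations mentioned above: without it, a state in a low-density region can require many proposals before one is accepted with probability one, which is exactly what produces the outlying runs inflating the mean column.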

