recycling accept-reject rejections (#2)

July 1, 2014

(This article was first published on Xi'an's Og » R, and kindly contributed to R-bloggers)

Following yesterday’s post on Rao’s, Lin’s, and Dunson’s paper on a new approach to intractable normalising constants, and taking advantage of being in Warwick, I tested the method on a toy model, namely the posterior associated with n Student’s t observations with unknown location parameter μ and a flat prior,

x_1,\ldots,x_n \sim p(x|\mu) \propto \left[ 1+(x-\mu)^2/\nu \right]^{-(\nu+1)/2}

which is “naturally” bounded by a Cauchy density with scale √ν. The bounding constant M is then easily derived, and running the new algorithm amounts to a normal random walk proposal targeting the augmented likelihood (R code below).
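For the record, here is a quick check of the bound and of the augmented target (my notation, with g the Cauchy proposal density with location μ and scale √ν, and f̃ the t kernel above), consistent with the value M=pi*sqrt(nu) used in the code:

\dfrac{\tilde{f}(x|\mu)}{g(x|\mu)} = \pi\sqrt{\nu}\,\left[ 1+(x-\mu)^2/\nu \right]^{-(\nu-1)/2} \le \pi\sqrt{\nu} = M \qquad (\nu>1)

and, if y_1,\ldots,y_k denote the Cauchy proposals rejected before an observation x gets accepted, the joint density of x and of its rejections is proportional (in μ) to

\tilde{f}(x|\mu)\,\prod_{j=1}^{k} \left[ M\,g(y_j|\mu) - \tilde{f}(y_j|\mu) \right]

which involves no intractable normalising constant and is the augmented likelihood targeted by the sampler below.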

As shown by the above graph, the completion-by-rejection scheme produces an outcome (tomato) similar to the one based on the observations alone (steelblue), with a comparable acceptance rate. However, the computing time is considerably degraded, by a factor close to fifty:

> system.time(g8())
   user  system elapsed
 53.751   0.056  54.103
> system.time(g9())
   user  system elapsed
  1.156   0.000   1.161

when compared with the no-completion version. Here is the entire R code that produced both MCMC samples:

library(compiler) #for cmpfun() byte-compilation of the samplers

#Student t observations and flat prior
nu=4
n=25
M=pi*sqrt(nu)
sqrtnu=sqrt(nu)
obs=rt(n,df=4)
sdobs=sd(obs)

#unnormalised t density (scaled to one at the mode)
mydt=function(x,mu){
  return(dt(x-mu,df=nu)/dt(0,df=nu))}
mydtc=cmpfun(mydt)

mydcauchy=function(x,mu){
  y=(x-mu)/sqrtnu
  return(dcauchy(y)/sqrtnu)}
mydcaucchy=cmpfun(mydcauchy)

#augmented data
augmen=function(mu){
y=NULL
for (i in 1:n){
  prop=mu+rcauchy(1)*sqrtnu
  reject=(runif(1)<mydtc(prop,mu)/(M*mydcaucchy(prop,mu)))
  #keep simulating, storing the rejected proposals, until one draw is accepted
  while (!reject){
    y=c(y,prop)
    prop=mu+rcauchy(1)*sqrtnu
    reject=(runif(1)<mydtc(prop,mu)/(M*mydcaucchy(prop,mu)))}
  }
return(y)}

#random walk Metropolis-Hastings targeting the augmented likelihood
gibbsda=function(T=10^4){
theta=rep(0,T)
for (t in 2:T){
   rejecs=augmen(theta[t-1])
   theta[t]=prop=theta[t-1]+rnorm(1,sd=sdobs)
   #log-density of observations and rejections: t terms plus log(M*g-f) per rejection
   propdens=sum(dt(obs-prop,df=nu,log=TRUE))+
     sum(log(M*mydcaucchy(rejecs,prop)-mydtc(rejecs,prop)))
   refdens=sum(dt(obs-theta[t-1],df=nu,log=TRUE))+
     sum(log(M*mydcaucchy(rejecs,theta[t-1])-mydtc(rejecs,theta[t-1])))
   if (log(runif(1))>propdens-refdens) theta[t]=theta[t-1]
   }
return(theta)}
g8=cmpfun(gibbsda)

gibbs2=function(T=10^4){
eta=rep(0,T)
for (t in 2:T){
   eta[t]=prop=eta[t-1]+rnorm(1,sd=sdobs)
   propdens=sum(dt(obs-prop,df=nu,log=TRUE))
   refdens=sum(dt(obs-eta[t-1],df=nu,log=TRUE))
   if (log(runif(1))>propdens-refdens) eta[t]=eta[t-1]
   }
return(eta)}
g9=cmpfun(gibbs2)
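
To reproduce the comparison in the graph, here is a minimal plotting sketch (the colour names match the graph, while the burn-in and bin choices are arbitrary):

#overlay of both posterior samples after a 10^3 burn-in
theta=g8() #with completion by the rejected proposals
eta=g9()   #without completion
hist(theta[-(1:10^3)],prob=TRUE,nclass=123,col=adjustcolor("tomato",.5),
  main="",xlab=expression(mu))
hist(eta[-(1:10^3)],prob=TRUE,nclass=123,col=adjustcolor("steelblue",.5),add=TRUE)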

Filed under: R, Statistics, University life Tagged: accept-reject algorithm, compiler, Data augmentation, Gibbs sampling, MCMC, Monte Carlo Statistical Methods, Student’s t distribution
