# Posts Tagged ‘mixtures’

April 14, 2011

For my second lecture today, I need to plot a likelihood surface for a basic two-component mixture with only the means unknown; here is the R code to speed things up:

```r
llsurf=function(trumyn=2.,wayt=.3,var2=1.,ssiz=500){
# draws the log-likelihood surface and a random sample
sd2=sqrt(var2)
parti=(runif(ssiz)>wayt)
sampl=(1-parti)*rnorm(ssiz)+parti*(trumyn+sd2*rnorm(ssiz))
mu2=mu1=seq(min(sampl),max(sampl),.1)
mo1=mu1%*%t(rep(1,length(mu2)))
mo2=(rep(1,length(mu2)))%*%t(mu2)
ca1=-0.5*mo1*mo1
ca2=-0.5*mo2*mo2
like=.1*(ca1+ca2) # log prior N(0,10) for
```
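The excerpt cuts the function short. Under the sampling scheme it sets up (weight `wayt` on the N(0,1) component, weight `1-wayt` on the N(`trumyn`, `var2`) component, and what looks like a N(0,10) log prior on each mean), a hedged guess at a runnable completion could read as follows; the loop over the sample and the plotting calls are my own reconstruction, not the original post's code:

```r
# Sketch of a completed llsurf (assumed completion, not the original):
# log-posterior surface for the two means of the mixture
#   wayt*N(mu1,1) + (1-wayt)*N(mu2,var2),  N(0,10) prior on each mean
llsurf = function(trumyn=2., wayt=.3, var2=1., ssiz=500){
  # draw a random sample from the mixture
  sd2 = sqrt(var2)
  parti = (runif(ssiz) > wayt)
  sampl = (1-parti)*rnorm(ssiz) + parti*(trumyn + sd2*rnorm(ssiz))
  # grid of candidate values for (mu1, mu2)
  mu2 = mu1 = seq(min(sampl), max(sampl), .1)
  mo1 = mu1 %*% t(rep(1, length(mu2)))      # mu1 varies along rows
  mo2 = rep(1, length(mu1)) %*% t(mu2)      # mu2 varies along columns
  # log prior N(0,10) on each mean: -mu^2/(2*10) = .1*(-0.5*mu^2)
  like = .1*(-0.5*mo1*mo1 - 0.5*mo2*mo2)
  # add the log-likelihood of the sample over the whole grid
  for (x in sampl)
    like = like + log(wayt*dnorm(x, mean=mo1) +
                      (1-wayt)*dnorm(x, mean=mo2, sd=sd2))
  image(mu1, mu2, like, xlab=expression(mu[1]), ylab=expression(mu[2]))
  contour(mu1, mu2, like, add=TRUE)
  invisible(like)
}
```

The surface typically shows the two symmetric modes characteristic of label switching in mixtures, slightly unbalanced here since `wayt` differs from `1-wayt`.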

April 10, 2011

As I already did two years ago, in connection with the double degree between UAM and Dauphine, I will give a short graduate course at the Universidad Autónoma de Madrid (UAM). It will be part of the regular fourth-year statistics course and will focus on mixtures, as covered in Bayesian Core. It will…

## Stochastic approximation in mixtures

February 22, 2011

On Friday, a 2008 paper on Stochastic Approximation and Newton’s Estimate of a Mixing Distribution by Ryan Martin and J.K. Ghosh was posted on arXiv. (I do not really see why it took so long to post a 2008 Statistical Science paper on arXiv, but given that it is not available on Project Euclid, it…
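For readers unfamiliar with the estimator the paper studies, Newton’s recursive scheme updates a current estimate of the mixing density towards the “posterior” it induces for each new observation, with stochastic-approximation weights. The following grid-based R sketch with a N(θ,1) kernel is my own illustration of that recursion, not code from the paper:

```r
# Newton's recursive estimate of a mixing density on a grid
# (illustrative sketch; kernel is N(theta, 1))
newton_mix = function(x, grid, f0 = NULL){
  # uniform initial guess on the grid (weights summing to one)
  f = if (is.null(f0)) rep(1/length(grid), length(grid)) else f0
  for (i in seq_along(x)){
    w = 1/(i + 1)                    # stochastic-approximation weight
    lik = dnorm(x[i], mean = grid)   # kernel evaluated over the grid
    post = lik*f / sum(lik*f)        # "posterior" under current estimate
    f = (1 - w)*f + w*post           # Newton's recursive update
  }
  f
}
```

On data from, say, an equal mixture of N(0,1) and N(3,1), plotting `newton_mix(x, seq(-4, 7, .05))` against the grid recovers a bimodal mixing-density estimate, though the output famously depends on the ordering of the observations.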

## Particle learning [rejoinder]

November 9, 2010

Following the posting on arXiv of the Statistical Science paper of Carvalho et al., and the publication by the same authors in Bayesian Analysis of Particle Learning for general mixtures, I noticed on Hedibert Lopes’ website that his rejoinder to the discussion of his Valencia 9 paper has been posted. Since the discussion involved several points…

## JSM 2010 [day 1]

August 2, 2010

The first day at JSM is always a bit sluggish, as people slowly drip in and get their bearings. As in Washington D.C. last year, the meeting takes place in a huge conference centre, and thus there is no feeling of overcrowding. It may also be that the peripheral and foreign location…

## A summer of books

July 27, 2010

The summer started with a research-in-pairs session at CIRM on the R edition of Bayesian Core, but I am also involved in two other book projects. The first one was mentioned in a previous post, namely the translation of Introducing Monte Carlo Methods with R into French. I have now recovered all translated chapters…

## Typo in Bayesian Core [again]

May 15, 2010

Reza Seirafi from Virginia Tech sent me the following email about Bayesian Core, which alas points out a real typo in the reversible jump acceptance probability for the mixture model: “With respect to the expression provided on page 178 for the acceptance probability of the split move, I was wondering if the omission of…”

## Posterior likelihood

March 6, 2010

At the Edinburgh mixture estimation workshop, Murray Aitkin presented his proposal to compare models via the posterior distribution of the likelihood ratio. As already commented in a post last July, the positive aspect of looking at this quantity rather than at the Bayes factor is that the priors are then allowed to be improper if…
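To make the proposal concrete: rather than integrating the likelihood against the prior as in a Bayes factor, one simulates parameters from their posterior and looks at the resulting distribution of the likelihood ratio between models. The toy normal-mean example below is my own minimal sketch of this idea (not Aitkin’s code), comparing M1: x ~ N(0,1) against M2: x ~ N(μ,1) under a flat (improper) prior on μ:

```r
# Posterior distribution of the (log) likelihood ratio, M2 vs M1
# (illustrative sketch with a flat prior on mu, which a Bayes factor
# would not tolerate)
set.seed(42)
x  = rnorm(50, .5)                 # simulated data, true mean .5
n  = length(x)
ll = function(m) sum(dnorm(x, mean = m, log = TRUE))
# posterior of mu under M2 and a flat prior: N(mean(x), 1/n)
mu = rnorm(1e4, mean(x), 1/sqrt(n))
lr = sapply(mu, ll) - ll(0)        # posterior draws of the log ratio
mean(lr > 0)                       # posterior prob. the ratio favours M2
```

The whole distribution of `lr` (not just this summary probability) is what Aitkin proposes to examine; a histogram of `lr` shows how decisively, if at all, it sits above zero.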