# Posts Tagged ‘ Bayesian statistics ’

## Core not in CiRM

July 27, 2011
By

Despite not enjoying the optimal environment of CiRM this year, we are still making good progress on the revision (or the R vision) of Bayesian Core. In the past two days, we went over Chapters 1 (Introduction), 2 (Normal Models), 5 (Capture-Recapture Experiments), and 6 (Mixture Models), with Chapters 3 (Regression) and 4 (Generalised Linear Models)...

## Super Sam Fuld Needs Your Help (with Foul Ball stats)

July 13, 2011
By

I was pleasantly surprised to have my recreational reading about baseball in the New Yorker interrupted by a digression on statistics. Sam Fuld of the Tampa Bay Rays was the subject of a Ben McGrath profile in the 4 July 2011 issue of the New Yorker, in an article titled "Super Sam." After quoting a minor-league...

## The virtues of incoherence?

July 8, 2011
By

Kent Osband writes:

## Early stopping and penalized likelihood

July 6, 2011
By

Maximum likelihood gives the best fit to the training data but in general overfits, yielding overly noisy parameter estimates that don't perform so well when predicting new data. A popular solution to this overfitting problem takes advantage of the iterative nature of most maximum likelihood algorithms by stopping early. In general, an iterative optimization algorithm goes from a...
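The idea in the excerpt can be sketched in a few lines: run an iterative maximum-likelihood fit and halt as soon as held-out fit stops improving. This is a minimal illustration, not the method from the post; the data, step size, and train/validation split are all made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy logistic-regression data; sizes and coefficients are illustrative only.
n, p = 200, 20
beta_true = np.zeros(p)
beta_true[:3] = 1.0
X = rng.normal(size=(n, p))
y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))
X_val = rng.normal(size=(n, p))
y_val = rng.binomial(1, 1 / (1 + np.exp(-X_val @ beta_true)))

def loglik(beta, X, y):
    """Bernoulli log-likelihood under a logit link."""
    eta = X @ beta
    return float(np.sum(y * eta - np.log1p(np.exp(eta))))

beta = np.zeros(p)
best_beta, best_val = beta.copy(), loglik(beta, X_val, y_val)
for step in range(1000):
    pr = 1 / (1 + np.exp(-(X @ beta)))
    beta = beta + 0.01 * X.T @ (y - pr)  # one gradient-ascent step on the training log-likelihood
    val = loglik(beta, X_val, y_val)
    if val < best_val:                   # held-out fit got worse: stop early
        break
    best_val, best_beta = val, beta.copy()
```

Left to run to convergence, the loop would keep sharpening the training fit; cutting it off at the validation peak acts like an implicit penalty on the coefficients, which is the connection to penalized likelihood that the post's title draws.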

July 4, 2011
By

I read this article by Rivka Galchen on quantum computing. Much of the article was about an eccentric scientist in his fifties named David Deutsch. I’m sure the guy is brilliant but I wasn’t particularly interested in his not particularly interesting life story (apparently he’s thin and lives in Oxford). There was a brief description

## Weighting and prediction in sample surveys

July 1, 2011
By

A couple of years ago Rod Little was invited to write an article for the diamond jubilee of the Calcutta Statistical Association Bulletin. His article was published with discussions from Danny Pfeffermann, J. N. K. Rao, Don Rubin, and myself. Here it all is. I'll paste my discussion below, but it's worth reading the others' perspectives too. Especially...

## Putting together multinomial discrete regressions by combining simple logits

June 29, 2011
By

When predicting 0/1 data we can use logit (or probit or robit or some other robust model such as invlogit(0.01 + 0.98*X*beta)). Logit is simple enough and we can use bayesglm to regularize and avoid the problem of separation. What if there are more than 2 categories? If they’re ordered (1, 2, 3, etc.),
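For ordered categories, one way to combine simple logits is to chain nested binary splits: one logit for Pr(y ≥ 2) and another for Pr(y ≥ 3 | y ≥ 2). The post works in R (bayesglm); the sketch below is Python with numpy, and the coefficient values are made-up stand-ins for fitted estimates.

```python
import numpy as np

def invlogit(x):
    return 1 / (1 + np.exp(-x))

# Hypothetical coefficients, standing in for two separately fitted binary logits.
beta_ge2 = np.array([0.5, -1.0])  # logit for Pr(y >= 2)
beta_ge3 = np.array([0.2, 0.8])   # logit for Pr(y >= 3 | y >= 2)

def ordered_probs(x):
    """Combine two nested binary logits into 3-category probabilities."""
    p_ge2 = invlogit(x @ beta_ge2)        # Pr(y >= 2)
    p_ge3_cond = invlogit(x @ beta_ge3)   # Pr(y >= 3 | y >= 2)
    p1 = 1 - p_ge2
    p2 = p_ge2 * (1 - p_ge3_cond)
    p3 = p_ge2 * p_ge3_cond
    return np.array([p1, p2, p3])

probs = ordered_probs(np.array([1.0, 0.5]))
# The three probabilities sum to 1 by construction.
```

Because each split is an ordinary binary regression, each piece can be regularized separately (e.g. with bayesglm in R) to avoid separation, which is the appeal of building the multinomial model out of simple logits.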

## Bayesian Fall school in La Rochelle

June 26, 2011
By

The French agronomy research institute INRA is organising a Fall school in La Rochelle, Nov. 28 – Dec. 02, on Bayesian methods, oriented towards applications in food sciences, environmental sciences, and biology. The provisional program (in French) includes: ■ Introduction to the R and WinBUGS software tools (hands-on sessions and computer projects) ■

## Bayesian Confidence Intervals: Obama’s ‘That’-Addition and Informality

May 1, 2011
By

No “That” Left Behind? I came across a post on Language Log last week giving some evidence that Obama tends to add “that” to the prepared version of his speeches. For example, in a recent speech at George Washington University, ... Continue reading →
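A Bayesian interval for a rate like the "that"-addition rate can be computed from a Beta posterior. The counts below are hypothetical placeholders, not figures from the Language Log post; the prior choice (uniform Beta(1, 1)) is also an assumption.

```python
import numpy as np

# Hypothetical counts (NOT from the post): out of 100 optional-"that"
# sites in the prepared text, suppose "that" was added in 63.
added, total = 63, 100

rng = np.random.default_rng(1)
# Posterior for the addition rate under a uniform Beta(1, 1) prior
# is Beta(1 + added, 1 + total - added); summarize it by simulation.
draws = rng.beta(1 + added, 1 + (total - added), size=100_000)
lo, hi = np.quantile(draws, [0.025, 0.975])
# (lo, hi) is a 95% posterior interval for the addition rate.
```

With a sample this size the interval is fairly tight around the observed proportion; with only a handful of speeches it would be much wider, which is the kind of uncertainty statement the post's title points at.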