Blog Archives

Bayesian First Aid: Two Sample t-test

February 24, 2014

As spring follows winter once more here down in southern Sweden, the two sample t-test follows the one sample t-test. This is a continuation of the Bayesian First Aid alternative to the one sample t-test, where I’ll introduce the two sample alternative. It will be quite a short post, as the two sample alternative is just more of...
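As a rough sketch of what the two sample alternative might look like in practice (the package and function names below, BayesianFirstAid and bayes.t.test(), are assumptions based on the naming scheme these posts describe, not a documented API):

```r
# Sketch only: assumes a bayes.t.test() that mirrors the classical t.test() interface.
library(BayesianFirstAid)

group_a <- c(5.1, 4.8, 6.2, 5.5, 5.9, 4.7)  # made-up measurements
group_b <- c(4.2, 4.9, 3.8, 4.4, 5.0, 4.1)

t.test(group_a, group_b)               # the classical two sample t-test
fit <- bayes.t.test(group_a, group_b)  # the Bayesian First Aid alternative
fit                                    # posterior summaries of the group means
plot(fit)                              # and their posterior distributions
```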

Read more »

A Significantly Improved Significance Test. Not!

February 12, 2014

It is my great pleasure to share with you a breakthrough in statistical computing. There are many statistical tests: the t-test, the chi-squared test, the ANOVA, etc. I here present a new test, a test that answers the question researchers are most anxious to figure out, a test of significance, the significance test. While a test like the two...

Read more »

Bayesian First Aid: One Sample and Paired Samples t-test

February 4, 2014

Student’s t-test is a staple of statistical analysis. A quick search on Google Scholar for “t-test” results in 170,000 hits in 2013 alone. In comparison, “Bayesian” gives 130,000 hits while “box plot” results in only 12,500 hits. To be honest, if I had to choose, I would most of the time prefer a notched boxplot to...
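For reference, the notched box plot alluded to here is one line of base R; the notches give a rough interval around the median that can be eyeballed next to the classical test of the mean:

```r
# A notched box plot next to the classical one sample t-test (made-up data)
set.seed(42)
x <- rnorm(30, mean = 0.5, sd = 1)

boxplot(x, notch = TRUE, main = "Notched box plot of x")
t.test(x, mu = 0)  # the one sample t-test of the same data
```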

Read more »

Announcing pingr: The R Package that Sounds as it is Called

January 26, 2014

pingr is an R package that contains one function, ping(), with one purpose: to go ping on whatever platform you are on (thanks to the audio package). It is intended to be useful, for example, if you are running a long analysis in the background and want to know when it is ready. It’s also useful if...
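A minimal usage sketch, assuming the package described in the post is installed and that ping() is called with no arguments, as the description suggests (the long-running “analysis” below is just a stand-in):

```r
library(pingr)

result <- replicate(1000, mean(rnorm(1e4)))  # some slow computation...
ping()                                       # ...and a ping when it is done
```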

Read more »

Bayesian First Aid: Binomial Test

January 20, 2014

The binomial test is arguably the conceptually simplest of all statistical tests: It has only one parameter and an easy-to-understand distribution for the data. When null hypothesis significance testing is introduced, it is puzzling that the binomial test is not the first example of a test but sometimes appears long after the t-test and the ANOVA...
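To make the comparison concrete, here is the classical test next to a minimal Bayesian treatment of the same data. The uniform Beta(1, 1) prior and the resulting Beta posterior are an illustration of the idea, not necessarily the exact model the Bayesian First Aid alternative uses:

```r
# Classical binomial test: 14 successes out of 20 trials, null value 0.5
binom.test(x = 14, n = 20, p = 0.5)

# Minimal Bayesian counterpart: with a uniform Beta(1, 1) prior the posterior
# of the success probability is Beta(x + 1, n - x + 1).
x <- 14; n <- 20
qbeta(c(0.025, 0.5, 0.975), x + 1, n - x + 1)  # 95% credible interval and median
curve(dbeta(p, x + 1, n - x + 1), from = 0, to = 1, xname = "p",
      xlab = "probability of success", ylab = "posterior density")
```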

Read more »

Bayesian First Aid

January 10, 2014

So I have a secret project. Come closer. I’m developing an R package that implements Bayesian alternatives to the most commonly used statistical tests. Yes, you heard me, soon your t.testing days might be over! The package aims to be as easy as possible to pick up and use, especially if you are already used to the classical .test...
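The “easy to pick up” idea can be sketched like this, assuming the convention is to take a classical *.test call and prepend bayes. to it (the function name below is that assumption, not a documented API):

```r
# Hedged sketch of the intended workflow: same call, bayes. prefix.
library(BayesianFirstAid)

x <- c(2.1, 1.8, 2.4, 2.0, 2.6, 1.9)

t.test(x, mu = 2)        # the classical one sample t-test
bayes.t.test(x, mu = 2)  # the assumed Bayesian First Aid counterpart
```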

Read more »

An Animation of the Construction of a Confidence Interval

December 30, 2013

I’m playing blog ping-pong with John Kruschke’s Doing Bayesian Data Analysis blog, as he was partly inspired by my silly post on Bayesian mascots when writing a nice piece on Icons for the essence of Bayesian and frequentist data analysis. That piece, in turn, inspired me, resulting in the following wobbly backhand. The confidence interval...
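A static sketch of the idea behind such an animation (not the post’s actual animation code): draw many samples, compute a 95% confidence interval for each, and count how often the intervals cover the true mean.

```r
set.seed(1)
true_mean <- 10
covers <- replicate(1000, {
  x  <- rnorm(20, mean = true_mean, sd = 3)
  ci <- t.test(x)$conf.int
  ci[1] < true_mean && true_mean < ci[2]
})
mean(covers)  # should come out close to 0.95
```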

Read more »

The Mascots of Bayesian Statistics

December 25, 2013

Why would Bayesian statistics need a mascot/symbol/logo? Well, why not? I don’t know of any other branch of statistics that has a mascot, but many programming languages do. R has an “R”, Python has a python snake, and Go has an adorable little gopher. While Bayesian statistics isn’t a programming language, it could be characterized as...

Read more »

An Animation of the t Distribution as a Mixture of Normals

December 7, 2013

You’ve probably heard about the t distribution. One good use for this distribution is as an alternative to the normal distribution that is more robust against outliers. But where does the t distribution come from? One intuitive characterization of the t is as a mixture of normal distributions. More specifically, as a mixture of an infinite number of...
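The characterization is easy to check numerically: draw a variance from a scaled inverse chi-squared distribution, then draw a normal value with that variance, and the result follows a t distribution. A static (non-animated) check:

```r
set.seed(1)
nu <- 3      # degrees of freedom
n  <- 1e5

sigma2 <- nu / rchisq(n, df = nu)             # scaled inverse chi-squared variances
x      <- rnorm(n, mean = 0, sd = sqrt(sigma2))

hist(x, breaks = 200, freq = FALSE, xlim = c(-6, 6),
     main = "Normal mixture vs. t distribution")
curve(dt(x, df = nu), add = TRUE, col = "red", lwd = 2)  # the t density on top
```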

Read more »

Shaping up Laplace Approximation using Importance Sampling

December 2, 2013

In the last post I showed how to use Laplace approximation to quickly (but dirtily) approximate the posterior distribution of a Bayesian model coded in R. This is just a short follow-up where I show how to use importance sampling as an easy method to shape up the Laplace approximation in order to approximate the true...
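The recipe can be illustrated in one dimension with a toy Poisson model of my own choosing (not the post’s example): find the posterior mode and curvature, use the implied normal as an importance sampling proposal, and reweight the draws by the ratio of the unnormalized posterior to the proposal density.

```r
set.seed(1)
y <- rpois(20, lambda = 3)                    # made-up count data

# Unnormalized log posterior: Poisson likelihood + Exponential(1) prior on lambda
log_post <- function(lambda) {
  if (lambda <= 0) return(-Inf)
  sum(dpois(y, lambda, log = TRUE)) + dexp(lambda, rate = 1, log = TRUE)
}

# Laplace approximation: mode and curvature of the log posterior
fit <- optim(par = 1, fn = function(l) -log_post(l),
             method = "L-BFGS-B", lower = 1e-6, hessian = TRUE)
post_mode  <- fit$par
sd_laplace <- sqrt(1 / fit$hessian[1, 1])

# Importance sampling with the Laplace normal as the proposal
n_draws <- 1e4
draws   <- rnorm(n_draws, post_mode, sd_laplace)
log_w   <- sapply(draws, log_post) - dnorm(draws, post_mode, sd_laplace, log = TRUE)
w       <- exp(log_w - max(log_w))            # stabilize before normalizing
w       <- w / sum(w)

c(laplace_mean = post_mode, importance_mean = sum(w * draws))
```

Here the two estimates end up close because the posterior is nearly symmetric; the reweighting matters more when the true posterior is skewed relative to the Laplace normal.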

Read more »