One of the practical challenges of Bayesian statistics is dealing with the complex probability distributions involved. You begin with the likelihood function of interest, but once you combine it with the prior distributions of all the parameters, you end up with a complex posterior distribution that you need to characterize. Since you usually can't calculate that distribution analytically, the best you can do is simulate from it, generally using Markov chain Monte Carlo (MCMC) techniques. Packages like RStan handle the simulation for you, but it's fairly easy to use the Metropolis-Hastings algorithm to code it yourself, at least for simple problems.
PhD student Maxwell Joseph did just that, using the R language. Beginning with a data set of 50 points, he set out to estimate the joint posterior distribution of the mean and variance, given simple priors (Normal for the mean; Uniform for the variance). He ran three chains of the M-H algorithm simultaneously, and created the animation below. You can see each of the chains (purple, red and blue) as they progress through the joint distribution of the mean (horizontal axis) and variance (vertical axis), and see how the posterior distribution evolves over time in the 3-D image to the right.
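To give a flavor of what such a sampler looks like, here's a minimal sketch of the setup described above. This is not Maxwell Joseph's actual code: the prior hyperparameters, proposal scales, and the simulated stand-in data are illustrative choices of mine, but the structure (normal likelihood, Normal prior on the mean, Uniform prior on the variance, random-walk Metropolis-Hastings, three chains) matches the description.

```r
set.seed(1)
y <- rnorm(50, mean = 5, sd = 2)   # simulated stand-in for the 50 data points

# log posterior (up to a constant); -Inf outside the prior's support
log_post <- function(mu, sigma2, y) {
  if (sigma2 <= 0 || sigma2 > 30) return(-Inf)
  sum(dnorm(y, mu, sqrt(sigma2), log = TRUE)) +
    dnorm(mu, 0, 10, log = TRUE) +
    dunif(sigma2, 0, 30, log = TRUE)
}

mh_chain <- function(y, n_iter = 5000, prop_sd = c(0.5, 0.5)) {
  draws <- matrix(NA_real_, n_iter, 2,
                  dimnames = list(NULL, c("mu", "sigma2")))
  cur <- c(mean(y), var(y))           # start at the sample estimates
  cur_lp <- log_post(cur[1], cur[2], y)
  for (i in seq_len(n_iter)) {
    prop <- cur + rnorm(2, 0, prop_sd)  # symmetric random-walk proposal
    prop_lp <- log_post(prop[1], prop[2], y)
    # accept with probability min(1, posterior ratio); the symmetric
    # proposal means the Hastings correction cancels
    if (log(runif(1)) < prop_lp - cur_lp) {
      cur <- prop
      cur_lp <- prop_lp
    }
    draws[i, ] <- cur
  }
  draws
}

# three independent chains, as in the animation
chains <- lapply(1:3, function(i) mh_chain(y))
colMeans(chains[[1]][-(1:1000), ])   # posterior means after burn-in
```

After discarding burn-in, the posterior means land close to the sample mean and variance of the data, which is what the mode of the amoeba in the animation is converging toward.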
I love the amoeba-like effect as the posterior converges to something close to a 2-D Gaussian distribution, and, as you'd expect, the mode of that posterior gives excellent estimates of the true mean and variance.
Maxwell shares all of the R code for setting up the likelihood and priors, running the Metropolis-Hastings chains, and animating the results at his blog, Ecology in silico. Note the use of R's system command to call ImageMagick's convert to stitch individual PNG frames into the animated GIF you see above. (Another alternative is to use Yihui Xie's animation package, but the direct method works just as well.)
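The frame-stitching step can be sketched as below. This is not the blog's exact code: the file names, the stand-in trace being plotted, and the convert options are illustrative, and ImageMagick must be installed and on the PATH for the final step to work.

```r
# stand-in for a chain's trace; in the real post this would be the M-H draws
set.seed(1)
draws <- cbind(mu = cumsum(rnorm(200)), sigma2 = cumsum(rnorm(200)))

# %03d makes R number successive frames frame001.png, frame002.png, ...
png("frame%03d.png")
for (i in seq(10, 200, by = 10)) {
  plot(draws[1:i, ], type = "l", xlab = "mean", ylab = "variance")
}
dev.off()

# -delay is in hundredths of a second; -loop 0 loops the GIF forever
system("convert -delay 10 -loop 0 frame*.png mh-animation.gif")
```

The same loop dropped into `animation::saveGIF()` would produce the GIF without the explicit system call.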