(This article was first published on **Brian Callander**, and kindly contributed to R-bloggers.)

# BDA3 Chapter 14 Exercise 3

Here’s my solution to exercise 3, chapter 14, of Gelman’s *Bayesian Data Analysis* (BDA), 3rd edition. There are solutions to some of the exercises on the book’s webpage.

We need to re-express \((y - X\beta)^T (y - X\beta)\) as \((\mu - \beta)^T \Sigma^{-1} (\mu - \beta)\) for some \(\mu\) and \(\Sigma\). Using the (thin) QR decomposition \(X = QR\), where \(Q\) has orthonormal columns (\(Q^TQ = I\)) and \(R\) is an invertible upper triangular matrix, we see that

\[
\begin{align}
(y - X\beta)^T(y - X\beta)
&= (Q^T(y - X\beta))^T Q^T(y - X\beta) + c \\
&= (Q^Ty - Q^TX\beta)^T (Q^Ty - Q^TX\beta) + c \\
&= (Q^Ty - R\beta)^T (Q^Ty - R\beta) + c,
\end{align}
\]

where \(c = y^T(I - QQ^T)y\) is the squared norm of the component of \(y\) orthogonal to the column space of \(X\), and so does not depend on \(\beta\). We can therefore read off the minimizer of this quadratic form as

\[
\hat\beta = R^{-1}Q^Ty,
\]

which shows that \(\mu = \hat\beta = R^{-1}Q^Ty\). Note that
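The post contains no code, but this identity is easy to check numerically. Here is a minimal sketch in Python/NumPy (the design matrix and response below are made up purely for illustration): the thin-QR estimate \(R^{-1}Q^Ty\) should coincide with the ordinary least-squares solution.

```python
import numpy as np

# Hypothetical design matrix and response, for illustration only.
rng = np.random.default_rng(0)
n, p = 20, 3
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

# Thin (reduced) QR decomposition: Q is n x p with orthonormal columns,
# R is p x p upper triangular and invertible when X has full column rank.
Q, R = np.linalg.qr(X)

# beta_hat = R^{-1} Q^T y, computed by solving the triangular system
# rather than forming an explicit inverse.
beta_hat = np.linalg.solve(R, Q.T @ y)

# It matches the ordinary least-squares solution.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(beta_hat, beta_ols))  # True
```

Solving the triangular system `R beta = Q^T y` by back-substitution is also how least-squares solvers typically compute \(\hat\beta\) in practice, since it avoids the numerically less stable normal equations.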

\[
\begin{align}
(X^TX)^{-1}X^T
&= (R^TQ^TQR)^{-1}R^TQ^T \\
&= (R^TR)^{-1}R^TQ^T \\
&= R^{-1}R^{-T}R^TQ^T \\
&= R^{-1}Q^T,
\end{align}
\]

using \(Q^TQ = I\) in the second step,

so that \(\hat\beta = R^{-1}Q^Ty = (X^TX)^{-1}X^Ty\) is the familiar least-squares estimator.

Expanding the brackets of both quadratic form expressions and comparing the quadratic coefficients, we see that

\[
\Sigma^{-1} = R^TR = X^TX,
\]

which shows that \(V_\beta = (X^T X)^{-1}\), in the notation of page 355.
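The final identity \(\Sigma^{-1} = R^TR = X^TX\) can likewise be verified numerically; a small sketch with a made-up full-column-rank design matrix (assumptions: NumPy, thin QR as above):

```python
import numpy as np

# Hypothetical full-column-rank design matrix, for illustration only.
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))

# Thin QR: Q^T Q = I and R is invertible upper triangular.
Q, R = np.linalg.qr(X)

# Sigma^{-1} = R^T R = X^T X, so V_beta = (X^T X)^{-1} can be formed
# from R alone as R^{-1} R^{-T}.
XtX = X.T @ X
print(np.allclose(R.T @ R, XtX))  # True

Rinv = np.linalg.inv(R)
V_beta = Rinv @ Rinv.T
print(np.allclose(V_beta, np.linalg.inv(XtX)))  # True
```

Computing \(V_\beta\) from \(R\) rather than inverting \(X^TX\) directly is the standard numerical recipe, since \(X^TX\) squares the condition number of \(X\).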
