Ex. 3.7

Assume \(y_i\sim N(\beta_0+x_i^T\beta, \sigma^2)\), \(i=1, 2,..., N\), and the parameters \(\beta_j\) are each distributed as \(N(0, \tau^2)\), independently of one another. Assuming \(\sigma^2\) and \(\tau^2\) are known, show that the (minus) log-posterior density of \(\beta\) is proportional to \(\sum_{i=1}^N(y_i-\beta_0-\sum_{j}x_{ij}\beta_j)^2 + \lambda\sum_{j=1}^p\beta_j^2\) where \(\lambda = \sigma^2/\tau^2\).

Warning

As shown below, the claim above is not exact: the proportionality holds only up to an additive constant.

Soln. 3.7

By Bayes' theorem we have

\[\begin{eqnarray} P(\beta|\textbf{y}) &=& \frac{P(\textbf{y}|\beta)P(\beta)}{P(\textbf{y})}.\label{eq:3-7a} \end{eqnarray}\]

By the assumptions here (with a flat prior on the intercept \(\beta_0\), which the penalty term leaves unpenalized), we have

\[\begin{eqnarray} P(\textbf{y}|\beta) &=&\frac{1}{(2\pi)^{N/2} \sigma^N} \exp\left(-\frac{1}{2\sigma^2}\sum_{i=1}^N(y_i-\beta_0-\sum_{j}x_{ij}\beta_j)^2\right),\nonumber\\ P(\beta) &=& \frac{1}{(2\pi)^{p/2} \tau^p}\exp\left(-\frac{1}{2\tau^2}\sum_{j=1}^p\beta_j^2\right).\nonumber \end{eqnarray}\]

Therefore, taking the negative logarithm in \(\eqref{eq:3-7a}\), noting that \(P(\textbf{y})\) does not depend on \(\beta\), and writing \(\lambda = \sigma^2/\tau^2\), we have

\[\begin{eqnarray} -\ln(P(\beta|\textbf{y})) &=&\frac{1}{2\sigma^2}\left(\sum_{i=1}^N(y_i-\beta_0-\sum_{j}x_{ij}\beta_j)^2 + \lambda\sum_{j=1}^p\beta_j^2\right) + C,\nonumber \end{eqnarray}\]

where \(C\) is a constant independent of \(\beta\).
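Spelling this constant out from the densities above (a step the derivation otherwise skips), we get

\[\begin{eqnarray} C &=& \frac{N}{2}\ln(2\pi\sigma^2) + \frac{p}{2}\ln(2\pi\tau^2) + \ln P(\textbf{y}),\nonumber \end{eqnarray}\]

which is nonzero in general.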

Strict proportionality, \(-\ln P(\beta|\textbf{y}) \propto \sum_{i=1}^N(y_i-\beta_0-\sum_{j}x_{ij}\beta_j)^2 + \lambda\sum_{j=1}^p\beta_j^2\), therefore holds if and only if \(C = 0\), which is not the case here; the two quantities agree only up to the additive constant \(C\). Note, however, that \(C\) does not affect the minimizer, so the posterior mode is still exactly the ridge estimate (with an unpenalized intercept) at tuning parameter \(\lambda = \sigma^2/\tau^2\).
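A quick numerical sketch of this last point, in Python (the simulated data, the seed, and the values of \(\sigma^2\) and \(\tau^2\) below are invented for illustration): dropping \(C\) and minimizing the remaining expression should reproduce the closed-form ridge solution with an unpenalized intercept.

```python
# Illustrative check: the MAP estimate (with C dropped, since C does not
# depend on beta) matches closed-form ridge with lambda = sigma^2 / tau^2.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N, p = 200, 5
sigma2, tau2 = 1.0, 4.0          # assumed known, as in the exercise
lam = sigma2 / tau2              # lambda = sigma^2 / tau^2

X = rng.normal(size=(N, p))
beta_true = rng.normal(size=p)
y = 2.0 + X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=N)

def neg_log_posterior(theta):
    """(RSS + lam * sum(beta_j^2)) / (2 sigma^2); the constant C is omitted."""
    b0, b = theta[0], theta[1:]
    rss = np.sum((y - b0 - X @ b) ** 2)
    return (rss + lam * np.sum(b ** 2)) / (2.0 * sigma2)

theta_map = minimize(neg_log_posterior, np.zeros(p + 1)).x

# Closed-form ridge: augment X with an intercept column and penalize
# every coefficient except the intercept.
Xa = np.column_stack([np.ones(N), X])
penalty = np.diag([0.0] + [lam] * p)
theta_ridge = np.linalg.solve(Xa.T @ Xa + penalty, Xa.T @ y)

print(np.allclose(theta_map, theta_ridge, atol=1e-4))  # True
```

The check passes because the minimized objective differs from the true negative log-posterior only by \(C\) and the positive factor \(1/(2\sigma^2)\), neither of which moves the argmin.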