Ex. 17.10

Using a simple binary graphical model with just two variables, show why it is essential to include a constant node \(X_0\equiv 1\) in the model.

Soln. 17.10

Suppose we have only two variables \(X_1\) and \(X_2\). When \(X_1\) and \(X_2\) are not connected, there are no edges and hence no parameters, so the exponent in (17.28) is identically zero and the joint probability reduces to a constant: each of the four configurations gets probability \(1/4\), regardless of the data, and the model cannot move either marginal away from \(1/2\). When a constant node \(X_0\equiv 1\) is included and is always connected to \(X_1\) and \(X_2\), then the joint probability becomes

\[\begin{eqnarray} p(\bb{X},\bm{\Theta}) &=& \exp\left(\theta_{01}X_1+\theta_{02}X_2 - \log\left(1+\exp(\theta_{01}) + \exp(\theta_{02}) + \exp(\theta_{01} + \theta_{02})\right)\right)\non\\ &=&\exp\left(\theta_{01}X_1 + \theta_{02}X_2\right)/[1+\exp(\theta_{01})+\exp(\theta_{02}) + \exp(\theta_{01} + \theta_{02})].\non \end{eqnarray}\]
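
As a quick numerical check of this normalization, here is a minimal sketch in Python (the parameter values for \(\theta_{01}\) and \(\theta_{02}\) are arbitrary choices for illustration, assuming binary values \(X_1, X_2 \in \{0,1\}\)). It verifies that summing the unnormalized terms over the four configurations reproduces the denominator above, and that the constant node lets the marginal \(P(X_1=1)\) move away from \(1/2\), which the edgeless model without \(X_0\) cannot do.

```python
import itertools
import math

# Hypothetical parameter values, chosen only for illustration.
theta_01, theta_02 = 0.7, -1.2

def unnormalized(x1, x2):
    """Numerator exp(theta_01*X1 + theta_02*X2) of the joint probability."""
    return math.exp(theta_01 * x1 + theta_02 * x2)

# Partition function: sum over the four binary configurations.
Z = sum(unnormalized(x1, x2) for x1, x2 in itertools.product((0, 1), repeat=2))

# It matches the closed form 1 + exp(theta_01) + exp(theta_02) + exp(theta_01 + theta_02).
closed_form = 1 + math.exp(theta_01) + math.exp(theta_02) + math.exp(theta_01 + theta_02)
assert math.isclose(Z, closed_form)

# With the constant node, the marginal P(X1 = 1) depends on theta_01
# and can differ from 1/2 ...
p_x1 = sum(unnormalized(1, x2) for x2 in (0, 1)) / Z
print(f"P(X1 = 1) = {p_x1:.3f}")

# ... whereas without X0 (and with no edge between X1 and X2) the exponent
# is identically zero, so every configuration has probability 1/4.
p_no_constant_node = {xs: 0.25 for xs in itertools.product((0, 1), repeat=2)}
print(p_no_constant_node)
```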