Ex. 9.3

Backfitting equations. Consider a backfitting procedure with orthogonal projections, and let \(\bb{D}\) be the overall regression matrix whose columns span \(V=\mathcal{L}_{\text{col}}(\bb{S}_1)\oplus \mathcal{L}_{\text{col}}(\bb{S}_2)\oplus\cdots\oplus \mathcal{L}_{\text{col}}(\bb{S}_p)\), where \(\mathcal{L}_{\text{col}}(\bb{S})\) denotes the column space of a matrix \(\bb{S}\). Show that the estimating equations

\[\begin{equation} \begin{pmatrix} \bb{I} & \bb{S}_1 & \bb{S}_1 & \cdots & \bb{S}_1 \\ \bb{S}_2 & \bb{I} & \bb{S}_2 & \cdots & \bb{S}_2 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \bb{S}_p & \bb{S}_p & \bb{S}_p & \cdots & \bb{I} \end{pmatrix} \begin{pmatrix} \bb{f}_1\\ \bb{f}_2\\ \vdots \\ \bb{f}_p \end{pmatrix} = \begin{pmatrix} \bb{S}_1\by\\ \bb{S}_2\by\\ \vdots\\ \bb{S}_p\by \end{pmatrix}\non \end{equation}\]

are equivalent to the least squares normal equations \(\bb{D}^T\bb{D}\beta = \bb{D}^T\by\) where \(\beta\) is the vector of coefficients.

Soln. 9.3

Recall that \(\bb{S}_j\) has the form \(\bb{N}_j(\bb{N}_j^T\bb{N}_j)^{-1}\bb{N}_j^T\), where the design matrix \(\bb{N}_j\) is generated by basis spline functions; thus each \(\bb{S}_j\) is the orthogonal projection onto \(\mathcal{L}_{\text{col}}(\bb{N}_j)\). Write \(\bb{D}=(\bb{N}_1, \bb{N}_2, \ldots, \bb{N}_p)\). The \(j\)th block row of the estimating equations above can be rewritten as

\[\begin{equation} \bb{f}_j + \bb{S}_j\left(\sum_{k\neq j}\bb{f}_k\right) = \bb{S}_j\by.\non \end{equation}\]

Since \(\bb{S}_j\) projects onto \(\mathcal{L}_{\text{col}}(\bb{N}_j)\), this equation forces \(\bb{f}_j\in\mathcal{L}_{\text{col}}(\bb{N}_j)\), so we may write \(\bb{f}_j = \bb{N}_j\beta_j\). Substituting this together with \(\bb{S}_j=\bb{N}_j(\bb{N}_j^T\bb{N}_j)^{-1}\bb{N}_j^T\) and left-multiplying by \(\bb{N}_j^T\) (noting that \(\bb{N}_j^T\bb{S}_j=\bb{N}_j^T\), since \(\bb{S}_j\) is a symmetric projection fixing the columns of \(\bb{N}_j\)), we obtain

\[\begin{equation} (\bb{N}_j^T\bb{N}_j)\beta_j + \bb{N}_j^T\sum_{k\neq j}\bb{N}_k\beta_k = \bb{N}_j^T\by,\non \end{equation}\]

which simplifies to

\[\begin{equation} \bb{N}_j^T\left(\sum_{k}\bb{N}_k\beta_k\right)=\bb{N}_j^T\by.\non \end{equation}\]

Stacking these equations over \(j=1,\ldots,p\) gives exactly the least squares normal equations \(\bb{D}^T\bb{D}\beta = \bb{D}^T\by\) with \(\beta=(\beta_1^T,\ldots,\beta_p^T)^T\); since each step above is reversible, the two systems are equivalent.
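As a sanity check (not part of the original solution), the equivalence can be verified numerically, with small random matrices standing in for the basis matrices \(\bb{N}_j\) and the sizes below chosen arbitrarily:

```python
import numpy as np

# Numerical sketch: check that the block backfitting system and the
# least squares normal equations produce the same fitted functions.
rng = np.random.default_rng(0)
n, p, q = 50, 3, 4  # hypothetical sizes: n obs, p terms, q basis columns each
Ns = [rng.standard_normal((n, q)) for _ in range(p)]  # stand-ins for the N_j
D = np.hstack(Ns)  # overall regression matrix D = (N_1, ..., N_p)

def proj(N):
    """Orthogonal projection S = N (N^T N)^{-1} N^T onto col(N)."""
    return N @ np.linalg.solve(N.T @ N, N.T)

S = [proj(N) for N in Ns]
y = rng.standard_normal(n)

# Least squares route: solve D^T D beta = D^T y, then f_j = N_j beta_j.
beta = np.linalg.solve(D.T @ D, D.T @ y)
f_ls = [N @ b for N, b in zip(Ns, np.split(beta, p))]

# Backfitting route: block system with I on the diagonal and S_j filling
# the off-diagonal entries of block row j; right-hand side is S_j y.
I = np.eye(n)
A = np.block([[I if k == j else S[j] for k in range(p)] for j in range(p)])
rhs = np.concatenate([S[j] @ y for j in range(p)])
f_bf = np.split(np.linalg.solve(A, rhs), p)

# The two sets of fitted functions agree up to numerical error.
print(max(np.linalg.norm(a - b) for a, b in zip(f_ls, f_bf)))
```

With random \(\bb{N}_j\) and total basis dimension \(pq < n\), the column spaces are independent almost surely, so the block system is nonsingular and both routes recover the same \(\bb{f}_j\).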