Ex. 9.3
Backfitting equations. Consider a backfitting procedure with orthogonal projections, and let \(\bb{D}\) be the overall regression matrix whose columns span \(V=\mathcal{L}_{\text{col}}(\bb{S}_1)\oplus \mathcal{L}_{\text{col}}(\bb{S}_2)\oplus\cdots\oplus \mathcal{L}_{\text{col}}(\bb{S}_p)\), where \(\mathcal{L}_{\text{col}}(\bb{S})\) denotes the column space of a matrix \(\bb{S}\). Show that the estimating equations
\[
\begin{pmatrix}
\bb{I} & \bb{S}_1 & \cdots & \bb{S}_1\\
\bb{S}_2 & \bb{I} & \cdots & \bb{S}_2\\
\vdots & \vdots & \ddots & \vdots\\
\bb{S}_p & \bb{S}_p & \cdots & \bb{I}
\end{pmatrix}
\begin{pmatrix}
\bb{f}_1\\ \bb{f}_2\\ \vdots\\ \bb{f}_p
\end{pmatrix}
=
\begin{pmatrix}
\bb{S}_1\by\\ \bb{S}_2\by\\ \vdots\\ \bb{S}_p\by
\end{pmatrix}
\]
are equivalent to the least squares normal equations \(\bb{D}^T\bb{D}\beta = \bb{D}^T\by\), where \(\beta\) is the vector of coefficients.
Soln. 9.3
Recall that \(\bb{S}_j\) has the form \(\bb{N}_j(\bb{N}_j^T\bb{N}_j)^{-1}\bb{N}_j^T\), where the design matrix \(\bb{N}_j\) is generated by basis spline functions. Write \(\bb{D}=(\bb{N}_1, \bb{N}_2,\ldots, \bb{N}_p)\). The \(j\)-th row of the estimating equations above can be rewritten as
\[
\bb{f}_j + \bb{S}_j\sum_{k\neq j}\bb{f}_k = \bb{S}_j\by, \qquad j=1,\ldots,p.
\]
Plugging in \(\bb{f}_j = \bb{N}_j\beta_j\) and \(\bb{S}_j=\bb{N}_j(\bb{N}_j^T\bb{N}_j)^{-1}\bb{N}_j^T\), then left-multiplying by \(\bb{N}_j^T\) and using \(\bb{N}_j^T\bb{N}_j(\bb{N}_j^T\bb{N}_j)^{-1}\bb{N}_j^T = \bb{N}_j^T\), we obtain
\[
\bb{N}_j^T\bb{N}_j\beta_j + \bb{N}_j^T\sum_{k\neq j}\bb{N}_k\beta_k = \bb{N}_j^T\by,
\]
which is
\[
\bb{N}_j^T\sum_{k=1}^p \bb{N}_k\beta_k = \bb{N}_j^T\bb{D}\beta = \bb{N}_j^T\by, \qquad j=1,\ldots,p.
\]
Stacking these \(p\) equations over \(j\) gives \(\bb{D}^T\bb{D}\beta = \bb{D}^T\by\). Each step is reversible when the \(\bb{N}_j\) have full column rank (left-multiplying the \(j\)-th normal equation by \(\bb{N}_j(\bb{N}_j^T\bb{N}_j)^{-1}\) recovers the \(j\)-th estimating equation), so the two systems are equivalent.
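The equivalence can be checked numerically: a sketch below (with hypothetical random matrices standing in for the spline bases \(\bb{N}_j\)) runs backfitting with the projection smoothers \(\bb{S}_j\) and compares the converged additive fit \(\sum_j \bb{f}_j\) to the least squares fit on \(\bb{D}=(\bb{N}_1,\bb{N}_2)\).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Hypothetical basis (design) matrices N1, N2 with full column rank, and a response y.
N1 = rng.standard_normal((n, 3))
N2 = rng.standard_normal((n, 2))
y = rng.standard_normal(n)

# Projection smoothers S_j = N_j (N_j^T N_j)^{-1} N_j^T.
S1 = N1 @ np.linalg.solve(N1.T @ N1, N1.T)
S2 = N2 @ np.linalg.solve(N2.T @ N2, N2.T)

# Backfitting: cycle f_j <- S_j (y - sum_{k != j} f_k) until convergence.
f1 = np.zeros(n)
f2 = np.zeros(n)
for _ in range(200):
    f1 = S1 @ (y - f2)
    f2 = S2 @ (y - f1)

# Least squares fit on the combined regression matrix D = (N1, N2),
# i.e. the solution of D^T D beta = D^T y.
D = np.hstack([N1, N2])
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
fit_ls = D @ beta

print(np.allclose(f1 + f2, fit_ls, atol=1e-8))
```

With the random bases above the column spaces intersect only trivially (almost surely), so backfitting converges and the additive fit matches the least squares fit.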