Ex. 3.10
Backward stepwise regression. Suppose we have the multiple regression fit of \(\textbf{y}\) on \(\textbf{X}\), along with the standard errors and \(Z\)-scores as in Table 3.2. We wish to establish which variable, when dropped, will increase residual sum-of-squares the least. How would you do this?
Soln. 3.10
Let's follow the ideas in Ex 3.9. Suppose \(\textbf{x}_k\) is the last predictor added; by Ex 3.9, the decrease in the residual sum-of-squares from including it is \((\textbf{q}_k^T\textbf{y})^2\), which is equally the increase in the residual sum-of-squares when \(\textbf{x}_k\) is dropped. On the other hand, the \(Z\)-score for the coefficient \(\hat{\beta}_k\) is, e.g., see (3.28)-(3.29) in the text,
\[\begin{equation}
Z_k = \frac{\textbf{q}_k^T\textbf{y}}{\hat\sigma}.\nonumber
\end{equation}\]
Since the increase in the residual sum-of-squares from dropping \(\textbf{x}_k\) is \((\textbf{q}_k^T\textbf{y})^2 = \hat\sigma^2 Z_k^2\), we should drop the predictor with the smallest absolute value of \(Z\)-score.
Remark
Alternatively, we can use Ex 3.1 to arrive at the same result.
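As a numerical sanity check (not part of the exercise), the sketch below compares, for a random regression problem, the increase in residual sum-of-squares from dropping each predictor against the \(Z\)-scores of the full fit; the variable names and the simulated data are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.normal(size=(n, p))                      # simulated design matrix
y = X @ rng.normal(size=p) + rng.normal(size=n)  # simulated response

def rss(X, y):
    """Residual sum-of-squares of the least-squares fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

# Full-model fit, sigma-hat^2, and Z-scores as in (3.28)-(3.29)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
rss_full = rss(X, y)
sigma2 = rss_full / (n - p)                      # unbiased estimate of sigma^2
cov = sigma2 * np.linalg.inv(X.T @ X)
z = beta_hat / np.sqrt(np.diag(cov))

# Increase in RSS when each predictor is dropped in turn
increase = np.array([rss(np.delete(X, j, axis=1), y) - rss_full
                     for j in range(p)])

# The increase equals sigma-hat^2 * Z_k^2, so the smallest |Z| is dropped
assert np.allclose(increase, sigma2 * z**2)
assert np.argmin(np.abs(z)) == np.argmin(increase)
```

The two assertions confirm the identity derived above: the RSS increase for each dropped predictor is exactly \(\hat\sigma^2 Z_k^2\), so minimizing the RSS increase is the same as minimizing \(|Z_k|\).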