Ex. 12.6

Suppose that the regression procedure used in FDA (Section 12.5.1) is a linear expansion of basis functions \(h_m(x), m=1,...,M\). Let \(\bb{D}_\pi = \bb{Y}^T\bb{Y}/N\) be the diagonal matrix of class proportions.

(a) Show that the optimal scoring problem (12.52) can be written in vector notation as

\[\begin{equation} \min_{\theta, \beta} \|\bb{Y}\theta - \bb{H}\beta\|^2,\non \end{equation}\]

where \(\theta\) is a vector of \(K\) real numbers, and \(\bb{H}\) is the \(N\times M\) matrix of evaluations \(h_j(x_i)\).

(b) Suppose that the normalization on \(\theta\) is \(\theta^T\bb{D}_\pi 1 = 0\) and \(\theta^T\bb{D}_\pi\theta=1\). Interpret these normalizations in terms of the original scores \(\theta(g_i)\).

(c) Show that, with this normalization, (12.65) can be partially optimized w.r.t. \(\beta\), and leads to

\[\begin{equation} \max_\theta \theta^T\bb{Y}^T\bb{S}\bb{Y}\theta,\non \end{equation}\]

subject to the normalization constraints, where \(\bb{S}\) is the projection operator corresponding to the basis matrix \(\bb{H}\).

(d) Suppose that the \(h_j\) include the constant function. Show that the largest eigenvalue of \(\bb{S}\) is 1.

(e) Let \(\Theta\) be a \(K\times K\) matrix of scores (in columns), and suppose the normalization is \(\Theta^T\bb{D}_\pi\Theta=\bb{I}\). Show that the solution to (12.53) is given by the complete set of eigenvectors of \(\bb{S}\); the first eigenvector is trivial, and takes care of the centering of the scores. The remainder characterize the optimal scoring solution.

Soln. 12.6

(a) Let \(\bb{Y}\) be the \(N\times K\) indicator response matrix, each row of which encodes the class membership of one observation among the \(K\) classes; see Section 12.5.1.

Let \(h(x_i) = (h_1(x_i),...,h_M(x_i))^T\) for \(i=1,...,N\). Then we have

\[\begin{equation} \bb{Y}\theta - \bb{H}\beta = \begin{pmatrix} \theta_{j_1}\non\\ \theta_{j_2}\non\\ \vdots\non\\ \theta_{j_N} \end{pmatrix} - \begin{pmatrix} h(x_1)^T\non\\ h(x_2)^T\non\\ \vdots\non\\ h(x_N)^T \end{pmatrix}\beta,\non \end{equation}\]

where \(j_i\in\{1,...,K\}\) is the class index of observation \(i\), determined by \(\bb{Y}\)'s encoding. Since \(\theta_{j_i}\) is exactly the score \(\theta(g_i)\) of the class of observation \(i\), we have

\[\begin{equation} \bb{Y}\theta - \bb{H}\beta = \begin{pmatrix} \theta(g_1)-h(x_1)^T\beta\non\\ \theta(g_2)-h(x_2)^T\beta\non\\ \vdots\non\\ \theta(g_N)-h(x_N)^T\beta\non \end{pmatrix}\non \end{equation}\]

The rest of the proof follows directly: the squared norm of this vector is \(\sum_{i=1}^N(\theta(g_i)-h(x_i)^T\beta)^2\), which is the optimal scoring criterion (12.52) up to the constant factor \(1/N\).
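To make the vector form concrete, here is a minimal numerical sketch (not from the text) with hypothetical data: \(N=6\) observations over \(K=3\) classes, checking that \(\bb{Y}\theta\) stacks the scores \(\theta(g_i)\).

```python
import numpy as np

# Hypothetical example: N = 6 observations with class labels g_i in {0, 1, 2}.
g = np.array([0, 1, 2, 1, 0, 2])
N, K = len(g), 3

# Indicator response matrix Y: row i has a single 1 in column g_i.
Y = np.zeros((N, K))
Y[np.arange(N), g] = 1.0

theta = np.array([0.5, -1.0, 2.0])  # one score per class

# Y @ theta stacks the scores theta(g_i), as in the vector form above.
assert np.allclose(Y @ theta, theta[g])
```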

(b) Since

\[\begin{equation} \bb{Y}\theta = \begin{pmatrix} \theta_{g_1}\non\\ \theta_{g_2}\non\\ \vdots\non\\ \theta_{g_N} \end{pmatrix}:= \theta_g,\non \end{equation}\]

we have

\[\begin{eqnarray} \theta^T\bb{D}_\pi 1 &=& \frac{1}{N}(\bb{Y}\theta)^T\bb{Y}1\non\\ &=&\frac{1}{N}\theta_g^T1\non\\ &=&\frac{\sum_{i=1}^N\theta(g_i)}{N}=0,\non \end{eqnarray}\]

where the second equality uses \(\bb{Y}1 = 1\), since each row of \(\bb{Y}\) has exactly one nonzero entry.

Similarly, \(\theta^T\bb{D}_\pi\theta=1\) implies that

\[\begin{equation} \frac{\sum_{i=1}^N\theta(g_i)^2}{N} = 1.\non \end{equation}\]

That is, the two normalizations require the scores \(\theta(g_i)\) to have zero mean and unit mean square over the training sample.
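Both identities are easy to check numerically; the sketch below reuses \(Y\), \(g\), \(\theta\), \(N\), and \(K\) from the snippet in (a).

```python
# D_pi = Y^T Y / N is diagonal with the class proportions on its diagonal.
D_pi = Y.T @ Y / N
assert np.allclose(D_pi, np.diag(np.bincount(g) / N))

scores = Y @ theta  # the vector (theta(g_1), ..., theta(g_N))

# theta^T D_pi 1 is the sample mean of the scores theta(g_i) ...
assert np.isclose(theta @ D_pi @ np.ones(K), scores.mean())
# ... and theta^T D_pi theta is their sample mean square.
assert np.isclose(theta @ D_pi @ theta, (scores ** 2).mean())
```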

(c) With \(\theta\) fixed, the minimizing \(\beta\) for the optimal scoring problem is the least squares estimate (assuming \(\bb{H}\) has full column rank)

\[\begin{equation} \hat\beta = (\bb{H}^T\bb{H})^{-1}\bb{H}^T\bb{Y}\theta.\non \end{equation}\]

Substituting \(\hat\beta\) and using the constraint \(\theta^T\bb{D}_\pi\theta=1\), the minimization objective becomes

\[\begin{eqnarray} \|\bb{Y}\theta - \bb{H}(\bb{H}^T\bb{H})^{-1}\bb{H}^T\bb{Y}\theta\|^2 &=& \theta^T\bb{Y}^T\bb{Y}\theta - \theta^T\bb{Y}^T\bb{H}(\bb{H}^T\bb{H})^{-1}\bb{H}^T\bb{Y}\theta\non\\ &=& N\theta^T\bb{D}_\pi\theta - \theta^T\bb{Y}^T\bb{H}(\bb{H}^T\bb{H})^{-1}\bb{H}^T\bb{Y}\theta\non\\ &=& N - \theta^T\bb{Y}^T\bb{H}(\bb{H}^T\bb{H})^{-1}\bb{H}^T\bb{Y}\theta,\non \end{eqnarray}\]

which is equivalent to solving

\[\begin{equation} \max_\theta \theta^T\bb{Y}^T\bb{S}\bb{Y}\theta,\non \end{equation}\]

where

\[\begin{equation} \bb{S} = \bb{H}(\bb{H}^T\bb{H})^{-1}\bb{H}^T\non \end{equation}\]

is the projection operator (see, e.g., p.153 in the text) corresponding to the basis matrix \(\bb{H}\).
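A quick numerical check of this identity, continuing the sketch from (a) and (b) with a small hypothetical basis matrix \(H\) (a constant column plus two random columns):

```python
rng = np.random.default_rng(0)
M = 3
# Hypothetical basis matrix H: a constant function plus two random basis functions.
H = np.column_stack([np.ones(N), rng.standard_normal((N, M - 1))])

# Rescale theta so that theta^T D_pi theta = 1, as the constraint requires.
theta = theta / np.sqrt(theta @ D_pi @ theta)

beta_hat = np.linalg.solve(H.T @ H, H.T @ Y @ theta)  # least squares coefficients
S = H @ np.linalg.solve(H.T @ H, H.T)                 # projection onto col(H)

lhs = np.sum((Y @ theta - H @ beta_hat) ** 2)         # ||Y theta - H beta_hat||^2
rhs = N - theta @ Y.T @ S @ Y @ theta                 # N - theta^T Y^T S Y theta
assert np.isclose(lhs, rhs)
```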

(d) Note that \(\bb{S}\) is symmetric and idempotent, i.e., \(\bb{S}\bb{S} = \bb{S}\), so every eigenvalue \(\lambda\) of \(\bb{S}\) satisfies \(\lambda^2 = \lambda\) and hence \(\lambda\in\{0,1\}\). Since the \(h_j\) include the constant function, the all-ones vector \(1\) lies in the column space of \(\bb{H}\), so \(\bb{S}1 = 1\) and the eigenvalue 1 is attained. Hence the largest eigenvalue of \(\bb{S}\) is 1.
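Numerically, continuing the sketch above: the eigenvalues of the projection \(S\) are exactly zeros and ones, and \(S\) fixes the all-ones vector because the constant function is in the basis.

```python
eigvals = np.linalg.eigvalsh(S)  # ascending order
# rank(H) = M eigenvalues equal to 1, the remaining N - M equal to 0.
assert np.allclose(eigvals, np.r_[np.zeros(N - M), np.ones(M)])
# The constant function is in col(H), so S fixes the all-ones vector: S 1 = 1.
assert np.allclose(S @ np.ones(N), np.ones(N))
```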

(e) From (c) we know that the problem formulated in (12.53) can be written as the constrained trace-maximization problem

\[\begin{eqnarray} && \max_\Theta \ \text{tr}\left(\Theta^T\bb{Y}^T\bb{S}\bb{Y}\Theta\right)\non\\ && \ \text{s.t.} \ \ \Theta^T\bb{D}_\pi\Theta=\bb{I},\non\\ && \ \ \ \ \ \ \ \ \ \Theta^T\bb{D}_\pi 1 = 0.\non \end{eqnarray}\]

By classical results for generalized symmetric eigenvalue problems (see, e.g., Eigenvalue and Generalized Eigenvalue Problems: Tutorial), the columns of the optimal \(\Theta\) are the complete set of generalized eigenvectors of \(\bb{Y}^T\bb{S}\bb{Y}\) with respect to \(\bb{D}_\pi\), i.e., the eigenvectors of \(\bb{S}\) expressed in the score space. The first eigenvector is trivial: since \(\bb{S}1=1\) and \(\bb{Y}1=1\), we have \(\bb{Y}^T\bb{S}\bb{Y}1 = \bb{Y}^T1 = N\bb{D}_\pi 1\), so \(\theta\propto 1\) is an eigenvector attaining the largest eigenvalue, and the \(\bb{D}_\pi\)-orthogonality of the remaining eigenvectors to it is exactly the centering constraint \(\Theta^T\bb{D}_\pi 1 = 0\). Those remaining eigenvectors characterize the optimal scoring solution.
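As a sketch of the computation (reusing \(Y\), \(S\), \(D_\pi\), \(N\), and \(K\) from the snippets above): since \(D_\pi\) is diagonal with positive entries here, whitening by \(D_\pi^{-1/2}\) reduces the trace problem to an ordinary symmetric eigenproblem, and the leading eigenpair is the trivial constant score.

```python
# Whiten by D_pi^{-1/2} to turn the generalized eigenproblem
#   (Y^T S Y / N) v = lambda * D_pi * v
# into an ordinary symmetric one (assumes every class is observed, so D_pi > 0).
A = Y.T @ S @ Y / N
W = np.diag(1.0 / np.sqrt(np.diag(D_pi)))   # D_pi^{-1/2}
lam, U = np.linalg.eigh(W @ A @ W)          # ascending eigenvalues
lam, Theta = lam[::-1], (W @ U)[:, ::-1]    # decreasing order; back-transform

assert np.allclose(Theta.T @ D_pi @ Theta, np.eye(K))  # required normalization
assert np.isclose(lam[0], 1.0)                         # trivial leading eigenvalue
assert np.allclose(np.abs(Theta[:, 0]), 1.0)           # its score vector is constant
```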