Ex. 12.4

Suppose you perform a reduced-subspace linear discriminant analysis for a $K$-group problem. You compute the canonical variables of dimension $L \le K-1$ given by $z = U^T x$, where $U$ is the $p \times L$ matrix of discriminant coefficients, and $p > K$ is the dimension of $x$.

(a) If $L = K-1$, show that

$$\|z - \bar z_k\|^2 - \|z - \bar z_{k'}\|^2 = \|x - \bar x_k\|_W^2 - \|x - \bar x_{k'}\|_W^2,$$

where $\|\cdot\|_W$ denotes Mahalanobis distance with respect to the covariance $W$.

(b) If $L < K-1$, show that the same expression on the left measures the difference in Mahalanobis squared distances for the distributions projected onto the subspace spanned by $U$.

Soln. 12.4

Consider the SVD $W = \hat U D \hat U^T$ (an eigendecomposition, since $W$ is symmetric), and write $\hat U = (\hat U_L : \hat U_\perp)$, where $\hat U_L$ holds the first $L \le K-1$ columns and $\hat U_\perp$ the remaining $p - L$ columns; partition $D = \operatorname{diag}(D_L, D_\perp)$ conformably. It is easy to verify that

$$W^{-1} = \big(\hat U_L D_L^{-1/2}\big)\big(\hat U_L D_L^{-1/2}\big)^T + \big(\hat U_\perp D_\perp^{-1/2}\big)\big(\hat U_\perp D_\perp^{-1/2}\big)^T := U_L U_L^T + U_\perp U_\perp^T.$$
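This decomposition is easy to check numerically. The sketch below (NumPy assumed; a randomly generated symmetric positive-definite matrix stands in for $W$, and the split point $L$ is arbitrary) confirms that the two pieces reassemble $W^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
p, L = 5, 2  # illustrative dimensions, p > L

# Random symmetric positive-definite matrix standing in for W.
A = rng.standard_normal((p, p))
W = A @ A.T + p * np.eye(p)

# Eigendecomposition W = U_hat D U_hat^T (W is symmetric, so this
# coincides with its SVD).
D, U_hat = np.linalg.eigh(W)

# U_L = U_hat_L D_L^{-1/2}, U_perp = U_hat_perp D_perp^{-1/2}:
# scale each column by the inverse square root of its eigenvalue.
U_L = U_hat[:, :L] / np.sqrt(D[:L])
U_perp = U_hat[:, L:] / np.sqrt(D[L:])

# The two pieces reassemble the inverse: W^{-1} = U_L U_L^T + U_perp U_perp^T.
print(np.allclose(U_L @ U_L.T + U_perp @ U_perp.T, np.linalg.inv(W)))  # True
```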

Therefore, we have

$$\|x - \bar x_k\|_W^2 = \big\|U_L^T (x - \bar x_k)\big\|^2 + \big\|U_\perp^T (x - \bar x_k)\big\|^2 = \|z - \bar z_k\|^2 + \big\|U_\perp^T (x - \bar x_k)\big\|^2.$$
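The Pythagorean split above can likewise be verified numerically. In this sketch (NumPy assumed), $W$ is again a random symmetric positive-definite stand-in, and $x$ and $\bar x_k$ are random vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
p, L = 5, 2

# Random SPD stand-in for W, split as in the text.
A = rng.standard_normal((p, p))
W = A @ A.T + p * np.eye(p)
D, U_hat = np.linalg.eigh(W)
U_L = U_hat[:, :L] / np.sqrt(D[:L])
U_perp = U_hat[:, L:] / np.sqrt(D[L:])

# Random stand-ins for a point x and a class mean xbar_k.
x, xbar_k = rng.standard_normal(p), rng.standard_normal(p)
d = x - xbar_k

mahal = d @ np.linalg.inv(W) @ d                        # ||x - xbar_k||_W^2
split = np.sum((U_L.T @ d) ** 2) + np.sum((U_perp.T @ d) ** 2)
print(np.isclose(mahal, split))  # True
```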

When $L = K-1$, the class means all lie in the discriminant subspace, so $U_\perp^T \bar x_k$ does not depend on $k$; the second term $\|U_\perp^T(x - \bar x_k)\|^2$ is therefore the same for every class and cancels when we subtract the expressions for classes $k$ and $k'$, and we recover (a).

When $L < K-1$, the first term $\|z - \bar z_k\|^2$ is exactly the Mahalanobis squared distance for the distributions projected onto the subspace spanned by $U$: the within-class covariance of $z = U_L^T x$ is $U_L^T W U_L = I_L$, so Euclidean distance in $z$-space is Mahalanobis distance for the projected data, which proves (b).
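To close the loop, claim (a) can be checked end to end on simulated data. The sketch below (NumPy assumed) builds the canonical variables in the standard way, by sphering with $W^{-1/2}$ and taking the leading eigenvectors of the between-class covariance of the sphered means, then confirms that with $L = K-1$ the difference of squared distances in $z$-space equals the difference of Mahalanobis squared distances in $x$-space:

```python
import numpy as np

rng = np.random.default_rng(2)
K, p, n = 3, 5, 50          # K groups, p > K dimensions, n points per group
L = K - 1                   # full reduced-subspace dimension

# Simulate K Gaussian classes with a shared (identity) covariance.
means = 3 * rng.standard_normal((K, p))
X = np.vstack([rng.standard_normal((n, p)) + means[k] for k in range(K)])
y = np.repeat(np.arange(K), n)

# Class means and pooled within-class covariance W.
xbar = np.array([X[y == k].mean(axis=0) for k in range(K)])
W = sum((X[y == k] - xbar[k]).T @ (X[y == k] - xbar[k])
        for k in range(K)) / (K * n - K)

# Discriminant coefficients: sphere with W^{-1/2}, then take the top
# eigenvectors of the between-class covariance of the sphered means.
D, U_hat = np.linalg.eigh(W)
W_inv_sqrt = (U_hat / np.sqrt(D)) @ U_hat.T
mstar = xbar @ W_inv_sqrt            # sphered class means
Bstar = np.cov(mstar.T, bias=True)   # between-class covariance (sphered)
vals, V = np.linalg.eigh(Bstar)      # eigenvalues in ascending order
U = W_inv_sqrt @ V[:, ::-1][:, :L]   # p x L matrix of coefficients

# Check (a) at a random test point, comparing classes k = 0 and k' = 1.
x = rng.standard_normal(p)
z, zbar = U.T @ x, xbar @ U
lhs = np.sum((z - zbar[0]) ** 2) - np.sum((z - zbar[1]) ** 2)

Winv = np.linalg.inv(W)

def mahal2(v):
    """Squared Mahalanobis norm with respect to W."""
    return v @ Winv @ v

rhs = mahal2(x - xbar[0]) - mahal2(x - xbar[1])
print(np.isclose(lhs, rhs))  # True
```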