The goal of this assignment is to teach you the fundamentals of
pattern recognition and classification. We will focus on two techniques
known as Principal Component Analysis (PCA) and
Linear Discriminant Analysis (LDA). We will make use of the
Java pattern recognition applet
for these experiments.
Using our Gaussian random vector generator, generate two sets of
two-dimensional Gaussian random vectors, each with an identity
covariance matrix and with means of (1,1) and (-1,1) respectively.
Plot them.
Draw the support regions for these distributions.
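The applet performs these steps interactively, but the underlying computation
is straightforward. The following sketch (not the applet's code) uses NumPy's
random generator as a stand-in for our Gaussian random vector generator,
draws 200 samples per class (an arbitrary choice), and plots the 1-sigma
contours as the support regions; the helper name support_ellipse is
illustrative only.

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)                 # fixed seed for reproducibility
    means = [np.array([1.0, 1.0]), np.array([-1.0, 1.0])]
    cov = np.eye(2)                                # identity covariance, both classes
    classes = [rng.multivariate_normal(m, cov, 200) for m in means]

    def support_ellipse(mean, cov, n_std=1.0, num_pts=200):
        """Points on the n_std-sigma contour of a 2-D Gaussian."""
        vals, vecs = np.linalg.eigh(cov)           # eigen-decomposition of covariance
        t = np.linspace(0.0, 2.0 * np.pi, num_pts)
        circle = np.stack([np.cos(t), np.sin(t)])  # unit circle, shape (2, num_pts)
        # Scale by sqrt(eigenvalues), rotate by eigenvectors, shift to the mean.
        return (vecs @ (n_std * np.sqrt(vals)[:, None] * circle)).T + mean

    for X, m, color in zip(classes, means, ["tab:blue", "tab:orange"]):
        plt.scatter(X[:, 0], X[:, 1], s=8, color=color, alpha=0.5)
        e = support_ellipse(m, cov)
        plt.plot(e[:, 0], e[:, 1], color=color)

    plt.gca().set_aspect("equal")
    plt.show()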
Perform a class-independent PCA on these sets and classify the data.
Draw the support region for the pooled covariance matrix and
the decision surface. Explain the results.
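For reference, here is a minimal sketch of class-independent PCA. It assumes
the single transform is derived from the covariance of the pooled data and
that points are classified by Euclidean distance to the class means in the
whitened space; these choices illustrate the idea but may differ in detail
from the applet's exact procedure.

    import numpy as np

    # Regenerate the two classes from the earlier sketch so this runs on its own.
    rng = np.random.default_rng(0)
    classes = [rng.multivariate_normal(m, np.eye(2), 200)
               for m in ([1.0, 1.0], [-1.0, 1.0])]

    X_all = np.vstack(classes)                     # pool both classes
    pooled_cov = np.cov(X_all, rowvar=False)       # pooled covariance matrix
    vals, vecs = np.linalg.eigh(pooled_cov)        # its eigenvalues/eigenvectors

    # One whitening transform for all classes: rotate onto the eigenvectors,
    # then scale each coordinate by 1/sqrt(eigenvalue).
    W = vecs / np.sqrt(vals)
    mu_all = X_all.mean(axis=0)
    means_t = [((X - mu_all) @ W).mean(axis=0) for X in classes]

    def classify(x):
        """Assign x to the class whose transformed mean is nearest (Euclidean)."""
        z = (x - mu_all) @ W
        return int(np.argmin([np.linalg.norm(z - m) for m in means_t]))

    errors = sum(classify(x) != k for k, X in enumerate(classes) for x in X)
    print("training error rate:", errors / sum(len(X) for X in classes))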
Perform a class-dependent PCA for the above and explain the
results.
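A corresponding sketch of class-dependent PCA, assuming each class is whitened
with its own covariance matrix and a point is assigned to the class whose
transformed mean is nearest under that class's transform; again, these details
are assumptions for illustration, not necessarily the applet's exact procedure.

    import numpy as np

    # Regenerate the two classes so this sketch runs on its own.
    rng = np.random.default_rng(0)
    classes = [rng.multivariate_normal(m, np.eye(2), 200)
               for m in ([1.0, 1.0], [-1.0, 1.0])]

    # Per-class whitening transform derived from each class's own covariance.
    class_params = []
    for X in classes:
        mu = X.mean(axis=0)
        vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
        class_params.append((mu, vecs / np.sqrt(vals)))

    def classify_dep(x):
        # Distance to each class mean measured in that class's whitened space
        # (equivalent to the Mahalanobis distance under the class covariance).
        d = [np.linalg.norm((x - mu) @ W) for mu, W in class_params]
        return int(np.argmin(d))

    errors = sum(classify_dep(x) != k for k, X in enumerate(classes) for x in X)
    print("training error rate:", errors / sum(len(X) for X in classes))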
Repeat the previous steps for LDA and explain the results.
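For the two-class case, LDA reduces to Fisher's linear discriminant. The
sketch below assumes equal priors, takes the discriminant direction as
Sw^{-1}(m1 - m0), where Sw is the within-class scatter matrix, and places the
decision threshold at the midpoint of the projected class means.

    import numpy as np

    # Regenerate the two classes so this sketch runs on its own.
    rng = np.random.default_rng(0)
    classes = [rng.multivariate_normal(m, np.eye(2), 200)
               for m in ([1.0, 1.0], [-1.0, 1.0])]

    m0, m1 = (X.mean(axis=0) for X in classes)
    # Within-class scatter matrix: sum of the per-class scatter matrices.
    Sw = sum((X - X.mean(axis=0)).T @ (X - X.mean(axis=0)) for X in classes)
    w = np.linalg.solve(Sw, m1 - m0)               # Fisher discriminant direction
    threshold = 0.5 * (w @ m0 + w @ m1)            # midpoint of projected means

    def classify_lda(x):
        return int(w @ x > threshold)              # 1 -> second class, 0 -> first

    errors = sum(classify_lda(x) != k for k, X in enumerate(classes) for x in X)
    print("training error rate:", errors / sum(len(X) for X in classes))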
Change the covariance matrix of each distribution to one that
generates an elliptical support region whose major axis makes
a 45-degree angle with the horizontal axis.
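One way to construct such a covariance matrix is to rotate a diagonal matrix
with distinct variances by 45 degrees, as in the sketch below; the variances
4 and 0.5 are arbitrary illustrative values.

    import numpy as np

    theta = np.deg2rad(45.0)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])  # 45-degree rotation matrix
    D = np.diag([4.0, 0.5])         # variances along the major and minor axes
    cov45 = R @ D @ R.T             # covariance with major axis at 45 degrees
    print(cov45)                    # [[2.25 1.75]
                                    #  [1.75 2.25]]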
Repeat the previous steps (PCA and LDA) for this data set, and
explain the results.
The Java applet allows you to build many such interesting classification
experiments. We encourage you to experiment with the applet to learn
more about the differences between PCA and LDA.