SYLLABUS

Contact Information:

Time: MWF, 8 - 9 AM
Place: 213 Simrall
Instructor: Joseph Picone, Professor
Office: 413 Simrall
Office Hours: 9 - 10 MWF (others by appt.)
Email: picone@cavs.msstate.edu
Class Alias: ece_8443@cavs.msstate.edu
URL: http://www.cavs.msstate.edu/research/isip/publications/courses/ece_8443
Textbook: R.O. Duda, P.E. Hart, and D.G. Stork, Pattern Classification, Second Edition, Wiley Interscience, ISBN: 0-471-05669-3, 2000 (supporting material available at http://rii.ricoh.com/~stork/DHS.html)
Suggested Reference: D. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003 (also available at http://www.inference.phy.cam.ac.uk/mackay/itprnn/book.html)
Prerequisites: Statistics, Signal Processing, many years of research, and attendance at a lecture where the speaker proudly exclaimed "PCA Fails!"
Other Reference Materials: Pattern Recognition in Java, an applet designed to demonstrate fundamental concepts of pattern recognition.
Internet Links: Godfried T. Toussaint's online resources, which include many valuable links and course notes.
Human Computer Interface Design: a new multi-university course sponsored by the National Science Foundation.


Grading Policies:

Item                    Date                    Weight
Exam No. 1              02/16                   25%
Exam No. 2              03/12                   25%
Exam No. 3              04/23                   25%
Final Exam (Paper)      05/04 (3 PM to 6 PM)    25%


Exams:

We will have three in-class exams in this course. Each exam will be closed-book; however, you will be allowed one double-sided page of notes. Calculators and other computing devices will not be allowed (or needed). The exams will require critical thinking: each will consist of one or two problems for which you will supply extended written solutions. The quality and clarity of your solutions will count as much as whether you obtain the correct answer.

Final Exam:

For the final exam, you will select an algorithm, perform a blind evaluation of it on data provided by the instructor, and write a four-page paper analyzing the results. Grades will be based on how well you perform relative to published results and to your peers, as well as on the quality of your analysis. This will be explained in greater detail in class.

Attendance / Survival:

Attendance does not figure directly into your grade. Historically, however, students who do not attend class regularly have not done well in this course. Moreover, most students end up on a grade borderline at the end of the semester; in such cases, I use attendance and classroom participation to determine the final grade.

Schedule:

Class Date Sections Topic
1 01/12 1.1 Course Overview
2 01/14 1.2, 1.3 Introduction
3 01/16 1.4 - 1.6 Typical Applications
4 01/21 2.1 Bayes' Decision Theory
5 01/23 2.2, 2.3 Continuous Features; Minimum Classification Error
6 01/26 2.4 Decision Surfaces; Normal Distributions
7 01/28 2.5 Support Regions; Whitening Transformations
8 01/30 2.6 Discriminant Functions for the Normal Density
9 02/02 2.8, 2.9 Error Bounds; Discrete Features
10 02/04 3.1-3.2 Maximum Likelihood Parameter Estimation
11 02/06 3.4 Bayesian Parameter Estimation - Univariate
12 02/09 3.4 Bayesian Parameter Estimation - Multivariate
13 02/11 3.5 Bayesian Parameter Estimation - General Theory
14 02/13 3.6, 3.7 Problems of Dimensionality
15 02/16 Exam No. 1 Sections 1.1 - 2.9
16 02/18 3.8 Multiple Discriminant Analysis
17 02/20 3.9 The Expectation Maximization Algorithm (EM)
18 02/23 3.10 What is a "hidden" Markov model?
19 02/25 3.10 Fundamentals of HMMs
20 02/27 3.10 HMM Parameter Estimation
21 03/01 N/A Review of Chapter 2
22 03/03 N/A Review of Chapter 2
23 03/05 N/A Review of Chapter 2
24 03/08 N/A Review of Chapter 3
25 03/10 N/A Review of Chapter 3
26 03/12 Exam No. 2 Sections 3.1 - 3.10
27 03/22 4.1 - 4.3 Parzen Windows
28 03/24 4.4 - 4.9 k-Nearest Neighbor
29 03/26 5.1 - 5.3 Linear Discriminant Functions
30 03/29 5.11 Support Vector Machines
31 03/31 8.3 - 8.4 CART
32 04/02 8.5 String Matching
33 04/05 8.6 Grammatical Methods
34 04/07 8.7 Rule Based Systems
35 04/12 9.1 Occam's Razor
36 04/14 9.2 No Free Lunch Theorem
37 04/16 9.3 Minimum Description Length
38 04/19 9.4 - 9.5 Resampling Techniques
39 04/21 9.6 - 9.7 Comparing and Combining Classifiers
40 04/23 Exam No. 3 TBD
41 04/26 10.4 - 10.7 Clustering
42 04/28 10.8 - 10.14 Advanced Clustering Techniques
43 04/30 Review Comprehensive Review
44 05/04 Final Exam 3 PM to 6 PM


Homework:

Homework will be assigned but not graded. Solutions will be available online.

Number Due Date Chapter/Problem Student(s)
1 01/23 Chapter 1: pattern recognition examples Shrestha, Raghavan, Menon, Anand, Chandra
2 01/30 Chapter 2: 2, 3, 8, 10 Gao, Stanley
3 02/02 Chapter 2: 15, 21, 23, 24 Menon, Shrestha
4 02/09 Chapter 2: 37, 40, 43, 44 Anand, Raghavan
5 02/16 Chapter 3: 1, 4, 11, 13, 16 Chandra, Gao
6 02/23 Chapter 3: 17, 24, 38, 41, 50 Stanley, Menon
7 03/29 Chapter 4: 2, 3, 10, 13, 14 Shrestha, Anand
8 TBD TBD TBD
9 TBD TBD TBD
10 TBD TBD TBD