ECE 8527/5110/4527: Introduction to
Machine Learning and Pattern Recognition

Syllabus

Contact Information:

Lecture: MWF 10:00 - 10:50 AM (ENGR 308 / Online)
Lecturer: Joseph Picone, Professor
Office: ENGR 718
Office Hours: MWF 08:00 AM - 10:00 AM
Phone: 215-204-4841 (desk); 708-848-2846 (cell - preferred)
Email: picone@temple.edu
Zoom: picone@temple.edu or joseph.picone@gmail.com
Email Communication: tu_ece8527@listserv.temple.edu
Website: http://www.isip.piconepress.com/courses/temple/ece_8527
Required Textbooks:

A. Lindholm, N. Wahlstrom, F. Lindsten and T. Schon,
Machine Learning: A First Course for Engineers and Scientists,
Cambridge University Press, New York, New York, USA,
ISBN: 978-1-108-84360-7, 338 pp., 2022.
URL: http://smlbook.org/book/sml-book-draft-latest.pdf

I. Drori,
The Science of Deep Learning,
Cambridge University Press, New York, New York, USA,
ISBN: 978-1-108-83508-4, 339 pp., 2023.
URL: https://www.thescienceofdeeplearning.org/
Reference Textbooks:

Many additional textbooks are available online. Most modern textbooks focus on neural networks or deep learning. While these are important topics, our goal in this course is to give you a broad perspective of the field. Technology changes quickly, so you need a solid background in the fundamentals, and you need to understand how to do more than simply "push buttons."

Those wishing to build their theoretical background in this area will find this book useful:

      M.P. Deisenroth, A.A. Faisal and C.S. Ong
      Mathematics for Machine Learning
      Cambridge University Press, ISBN: 978-1-108-47004-9, 2020.
      URL: https://mml-book.com/

One of the best classical textbooks on this topic is:

      R.O. Duda, P.E. Hart, and D.G. Stork
      Pattern Classification
      Wiley Interscience, ISBN: 0-471-05669-3, 2001.
      URL: https://isip.piconepress.com/courses/temple/ece_8527/resources/dhs_book/

Another excellent treatment is:

      D.J.C. MacKay
      Information Theory, Inference and Learning Algorithms
      Cambridge University Press, ISBN: 978-0521642989, 2004.

Good online resources that focus more on neural networks and implementation details are:

      Michael Nielsen
      Neural Networks and Deep Learning
      URL: http://neuralnetworksanddeeplearning.com

and

      Jake VanderPlas
      Python Data Science Handbook
      URL: https://github.com/jakevdp/PythonDataScienceHandbook

Prerequisites:

  ECE 8527/5110:
    ENGR 5022 (minimum grade: B-)
    ENGR 5033 (minimum grade: B-)
  ECE 4527:
    ECE 3512 (minimum grade: C-)
    ECE 3522 (minimum grade: C-)

Course Description: Please see the university bulletin for the official description of this course. An expanded description is given below.

University Policy Statements: Please refer to the College of Engineering Policies and Procedures web site for the policies and procedures you are responsible for. This information is located in the TU Portal under: College of Engineering -> Advising -> Temple University Syllabus Policy.

Course Description:

Pattern recognition theory and practice are concerned with the design, analysis, and development of methods for the classification or description of patterns, objects, signals, and processes. At the heart of this discipline is our ability to infer the statistical behavior of data from limited data sets, and to assign data to classes based on generalized notions of distance in a probabilistic space. Many commercial applications of pattern recognition exist today, including voice recognition (e.g., Amazon Alexa), fingerprint classification (e.g., the MacBook Pro touch bar), and retinal scanners (e.g., your favorite cheesy sci-fi movie).
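
In its simplest form, this amounts to modeling each class with a probability distribution and assigning a new sample to the class under which it is most likely. The short Python sketch below illustrates the idea for two one-dimensional Gaussian classes with equal priors; it is only an illustration (the class means and variances are made up), not part of the course materials:

      # Sketch: classify a sample by the largest class-conditional
      # log-likelihood (equal priors). The two Gaussian models below are
      # hypothetical, chosen only to illustrate the idea.
      from scipy.stats import norm

      classes = {"class_0": norm(loc=0.0, scale=1.0),
                 "class_1": norm(loc=3.0, scale=1.5)}

      def classify(x):
          """Return the label whose model assigns x the highest log-likelihood."""
          return max(classes, key=lambda c: classes[c].logpdf(x))

      print(classify(0.4))   # class_0: "closer," in a probabilistic sense, to N(0, 1)
      print(classify(2.6))   # class_1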

Machine learning is a field that is at least 50 years old. Recent advances in deep learning, starting around 2005, have revolutionized the field. Today, machine learning is one of the most active areas of engineering and is enjoying unprecedented levels of success. However, to understand why these techniques work, we must build a background in traditional pattern recognition and machine learning concepts such as maximum likelihood decision theory, Bayesian estimation, nonparametric methods such as decision trees and support vector machines, and temporal modeling approaches such as hidden Markov models. This course is designed to give you a strong background in the fundamentals, yet also introduce you to the tools necessary to implement these algorithms.
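
As a small taste of the first of these fundamentals: for a Gaussian, maximum likelihood parameter estimation reduces to computing the sample mean and the (biased) sample variance. The sketch below uses synthetic data and is only an illustration, not a course assignment:

      # Sketch: maximum likelihood (ML) estimates for a 1D Gaussian.
      # The data are synthetic, drawn from N(2.0, 0.5^2).
      import numpy as np

      rng = np.random.default_rng(seed=0)
      x = rng.normal(loc=2.0, scale=0.5, size=1000)

      mu_ml = x.mean()                     # ML estimate of the mean
      var_ml = ((x - mu_ml) ** 2).mean()   # ML estimate of the variance (divides by N)

      print(f"mu = {mu_ml:.3f}, sigma = {np.sqrt(var_ml):.3f}")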

Grading Policies:

Item                 Weight
Exam No. 1              10%
Exam No. 2              10%
Exam No. 3              10%
Final Exam              30%
Computer Homework       40%
TOTAL                  100%


The course requirements include three in-class exams, a final exam, and weekly programming projects. Students are expected to select a dataset relevant to their research interests or to use one of the datasets available on the course web site (or from the instructor). Computer projects can be implemented in any programming language. Assignments should be submitted via email as PDF files.

Lecture Schedule:

The topics we will cover are listed below. Section numbers refer to sections in the Lindholm et al. book. The lecture slides and videos are available at the links provided.

Class  Date   Topic(s)                                        Section(s)                      Online Materials
01     01/12  Introduction: An Overview of Machine Learning   (LWLS) 1.1 - 1.3                slides | code | video
02     01/14  What Problem Are We Trying To Solve?            (LWLS) 1.1 - 1.3                slides | code | video
03     01/16  Machine Learning Demonstrations                 IMLD                            slides | code | video
04     01/21  Decision Theory: Bayes Rule                     (DHS) 2.1 - 2.12                slides | code | video
05     01/23  Decision Theory: Two-Category Classification    (LWLS) 9.1, 9.2                 slides | code | video
06     01/26  Applications of Bayesian Decision Theory        (LWLS) 9.3 - 9.6                slides | code | video
07     01/28  Maximum Likelihood Parameter Estimation         (DHS) 3.1, 3.2                  slides | code | video
08     01/30  Bayesian Parameter Estimation                   (DHS) 3.1 - 3.3                 slides | code | video
09     02/02  Discriminant Analysis                           (LWLS) 10.1, (DHS) 3.6 - 3.10   slides | code | video
10     02/04  The Expectation Maximization (EM) Theorem       (FJ) 2.7                        slides | code | video
11     02/06  Hidden Markov Models: Introduction              (JD) 12.1 - 12.5                slides | code | video
12     02/09  Hidden Markov Models: Evaluation                (RJ) 6.1 - 6.16                 slides | code | video
13     02/11  Exam No. 1 (Lectures 01 - 11)                   Notes                           exams
14     02/13  Review: Exam No. 1                              Notes                           slides | code | video
15     02/16  TBD                                                                             slides | video
16     02/18  TBD                                                                             slides | video
17     02/20  TBD                                                                             slides | video
18     02/23  TBD                                                                             slides | video
19     02/25  TBD                                                                             slides | video
20     02/27  TBD                                                                             slides | video
21     03/09  TBD                                                                             slides | video
22     03/11  TBD                                                                             slides | video
23     03/13  TBD                                                                             slides | video
24     03/16  TBD                                                                             slides | video
25     03/18  TBD                                                                             slides | video
26     03/20  Exam No. 2 (Lectures 10 - 24)                   Notes                           exams
27     03/23  TBD                                                                             slides | video
28     03/25  TBD                                                                             slides | video
29     03/27  TBD                                                                             slides | video
30     03/30  TBD                                                                             slides | video
31     04/01  TBD                                                                             slides | video
32     04/03  TBD                                                                             slides | video
33     04/06  TBD                                                                             slides | video
34     04/08  TBD                                                                             slides | video
35     04/10  TBD                                                                             slides | video
36     04/13  TBD                                                                             slides | video
37     04/15  TBD                                                                             slides | video
38     04/17  TBD                                                                             slides | video
39     04/20  Exam No. 3 (Lectures 25 - 37)                   Notes                           exams
40     04/22  TBD                                                                             slides | video
41     04/24  TBD                                                                             slides | video
42     04/27  TBD                                                                             slides | video
43     05/02  Final Exam (08:00 AM - 10:00 AM)                N/A                             exams


Please note that the exam dates above are fixed; they have been arranged to satisfy a number of scheduling constraints. You will need to plan your schedule, including job interviews and site visits, accordingly.

Homework:

The homework schedule is as follows:

HW   Date   Item(s)
1    01/20  Gaussian Distributions
2    01/26  Bayesian Decision Theory
3    02/02  Probability of Error Computations
4    02/09  ML vs. Bayesian Parameter Estimation
5    02/16  Dynamic Programming
6    02/23  Hidden Markov Models (HMMs)
7    03/09  Information Theory and Statistical Significance
8    03/16  LDA, K-Nearest Neighbors and K-Means Clustering
9    03/23  Bootstrapping, Bagging and Combining Classifiers
10   03/30  Nonparametric Classifiers, ROC Curves and AUC
11   04/06  Multilayer Perceptrons
12   04/13  Convolutional Neural Networks
13   04/20  Transformers

Each homework assignment is due by 11:59 PM on the date shown.

Programmatic Requirements (Undergraduates):

Finally, note that Electrical and Computer Engineering students must complete one of the program concentrations. See the links below for the Overview, Requirements, and Academic Plan for each program and concentration. If you have questions about these concentrations, please consult your academic advisor.