ECE 8527/4527: Introduction to
Machine Learning and Pattern Recognition

Syllabus

Contact Information:

Lecture: MWF 10:00 - 10:50 AM (ENGR 308 / Online)
Lecturer: Joseph Picone, Professor
Office: ENGR 718
Office Hours: (MWF) 08:00 AM - 10:00 AM
Phone: 215-204-4841 (desk); 708-848-2846 (cell - preferred)
Email: picone@temple.edu
Zoom: picone@temple.edu or joseph.picone@gmail.com
Google Group URL: https://groups.google.com/g/temple_engineering_ece8527
Google Group Email: temple_engineering_ece8527@googlegroups.com
Website: http://www.isip.piconepress.com/courses/temple/ece_8527
Required Textbook: A. Lindholm, N. Wahlström, F. Lindsten and T. Schön,
Machine Learning: A First Course for Engineers and Scientists,
Cambridge University Press, New York, New York, USA,
ISBN: 978-1-108-84360-7, pp. 338, 2022.
URL: http://smlbook.org/book/sml-book-draft-latest.pdf
Reference Textbooks:

There are many textbooks available online. Most modern textbooks focus on neural networks or deep learning. While these are important topics, our goal in this course is to give you a broad perspective of the field. Technology changes quickly, so you need a solid background in the fundamentals, and you need to understand how to do more than just push buttons.

Those wishing to build their theoretical background in this area will find this book useful:

      M.P. Deisenroth, A.A. Faisal and C.S. Ong
      Mathematics for Machine Learning
      Cambridge University Press, ISBN: 978-1-108-47004-9, 2020.
      URL: https://mml-book.com/

One of the best classical textbooks on this topic is:

      R.O. Duda, P.E. Hart, and D.G. Stork
      Pattern Classification
      Wiley Interscience, ISBN: 0-13-022616-5, 2001.
      URL: https://isip.piconepress.com/courses/temple/ece_8527/resources/dhs_book/

Other excellent theoretical treatments are:

      C.M. Bishop
      Pattern Recognition and Machine Learning
      Springer, ISBN: 978-0387310732, 2006.

      D.J.C. MacKay
      Information Theory, Inference and Learning Algorithms
      Cambridge University Press, ISBN: 978-0521642989, 2003.

Good online resources that focus more on neural networks and implementation details are:

      Michael Nielsen
      Neural Networks and Deep Learning
      URL: http://neuralnetworksanddeeplearning.com

and

      Jake VanderPlas
      Python Data Science Handbook
      URL: https://github.com/jakevdp/PythonDataScienceHandbook

Prerequisites:
  ECE 8527:
    ENGR 5022 (minimum grade: B-)
    ENGR 5033 (minimum grade: B-)
  ECE 4527:
    ECE 3512 (minimum grade: C-)
    ECE 3522 (minimum grade: C-)

University Policy Statements: Please refer to the College of Engineering Policies and Procedures web site regarding the policies and procedures for which you are responsible. The source location for this information is in the TU Portal: College of Engineering -> Advising -> Temple University Syllabus Policy.

Course Description:

The official course description appears in the university bulletin. In brief, pattern recognition theory and practice is concerned with the design, analysis, and development of methods for the classification or description of patterns, objects, signals, and processes. At the heart of this discipline is our ability to infer the statistical behavior of data from limited data sets, and to assign data to classes based on generalized notions of distances in a probabilistic space. Many commercial applications of pattern recognition exist today, including voice recognition (e.g., Amazon Alexa), fingerprint classification (e.g., the MacBook Pro Touch Bar), and retinal scanners (e.g., your favorite cheesy sci-fi movie).

Machine learning is a field that is at least 50 years old. Recent advances in deep learning, starting around 2005, have revolutionized the field. Today, machine learning is one of the most active areas of engineering and is enjoying unprecedented levels of success. However, to understand why these techniques work, we must build a background in traditional pattern recognition and machine learning concepts such as maximum likelihood decision theory, Bayesian estimation, nonparametric methods such as decision trees and support vector machines, and temporal modeling approaches such as hidden Markov models. This course is designed to give you a strong background in the fundamentals, yet also introduce you to the tools necessary to implement these algorithms.
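
To give a flavor of the first few lectures, below is a minimal sketch, written in Python with made-up data, of a two-class Gaussian classifier built on the Bayes decision rule. It is illustrative only and is not part of the course materials.

      import numpy as np
      from scipy.stats import norm

      # Hypothetical one-dimensional training data for two classes.
      rng = np.random.default_rng(seed=0)
      class0 = rng.normal(loc=-1.0, scale=1.0, size=100)
      class1 = rng.normal(loc=1.5, scale=0.5, size=100)

      # Maximum likelihood estimates of each class-conditional Gaussian.
      mu0, sd0 = class0.mean(), class0.std()
      mu1, sd1 = class1.mean(), class1.std()
      prior0 = prior1 = 0.5  # assume equal priors for illustration

      def classify(x):
          """Assign x to the class with the larger posterior (Bayes decision rule)."""
          p0 = norm.pdf(x, loc=mu0, scale=sd0) * prior0
          p1 = norm.pdf(x, loc=mu1, scale=sd1) * prior1
          return 0 if p0 > p1 else 1

      print(classify(0.0))  # class assignment for a test point at x = 0

Choosing the class that maximizes the posterior probability is the principle underlying most of the classifiers studied in the first half of the course.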

Grading Policies:

Item               Weight
Exam No. 1            15%
Exam No. 2            15%
Exam No. 3            15%
Final Exam            15%
Computer Homework     40%
TOTAL                100%


The course requirements include three in-class exams, a final exam, and weekly programming projects. Students are expected to select a dataset relevant to their research interests or to use one of the datasets available on the course web site (or from the instructor). Computer projects can be implemented in any programming language. Assignments should be submitted via email as PDF files.

Lecture Schedule:

We will cover the following topics:

Class  Date   Topic(s)                                                                      Section(s)            Online Materials
01     01/17  Introduction: An Overview of Machine Learning                                 1.1 - 1.3             slides | video
02     01/19  Decision Theory: Bayes Rule                                                   9.1, 9.2              slides | video
03     01/22  Decision Theory: Gaussian Classifiers                                         9.3, 9.4              slides | video
04     01/24  Decision Theory: Generalized Gaussian Classifiers                             9.5 - 9.A             slides | video
05     01/26  Parameter Estimation: Maximum Likelihood                                      3.1                   slides | video
06     01/29  Parameter Estimation: The Bayesian Approach                                   9.2                   slides | video
07     01/31  Decision Theory: Discriminant Analysis                                        10.4                  slides | video
08     02/02  Parameter Estimation: The Expectation Maximization Theorem                    10.1                  slides | video
09     02/05  Hidden Markov Models: Introduction                                            Notes                 slides | video
10     02/07  Hidden Markov Models: Evaluation                                              Notes                 slides | video
11     02/09  Hidden Markov Models: Decoding and Dynamic Programming                        Notes                 slides | video
12     02/12  Hidden Markov Models: Parameter Reestimation and Continuous Distributions     Notes                 slides | video
13     02/14  Exam No. 1 (Lectures 01 - 09)                                                 Notes                 exams
14     02/16  Parameter Estimation: Information Theory Review                               Notes                 slides | video
15     02/19  Parameter Estimation: Discriminative Training                                 Notes                 slides | video
16     02/21  Experimental Design: Foundations of Machine Learning                          11.1 - 11.6           slides | video
17     02/23  Experimental Design: Statistical Significance and Confidence                  Notes                 slides | video
18     02/26  Parameter Estimation: Jackknifing, Bootstrapping and Combining Classifiers    4.4 - 4.6, 7.1 - 7.5  slides | video
19     02/28  Parameter Estimation: Nonparametric Techniques                                4.5, 4.6              slides | video
20     03/01  Unsupervised Learning: Clustering                                             10.2                  slides | video
21     03/11  Unsupervised Learning: Hierarchical Clustering                                10.3 - 10.5           slides | video
22     03/13  Supervised Learning: Decision Trees                                           2.3, 7.1 - 7.5        slides | video
23     03/15  Supervised Learning: Support Vector Machines                                  8.1 - 8.B             slides | video
24     03/18  Neural Networks: Introduction                                                 6.1, 6.2, 6.A         slides | video
25     03/20  Neural Networks: Vanishing Gradients and Regularization                       3.3, 5.3              slides | video
26     03/22  Exam No. 2 (Lectures 10 - 24)                                                 Notes                 exams
27     03/25  Neural Networks: Linear and Logistic Regression                               3.1 - 3.A             slides | video
28     03/27  Neural Networks: Deep Learning                                                Notes                 slides | video
29     03/29  Neural Networks: Alternative Architectures                                    11.1 - 11.3           slides | video
30     04/01  Neural Networks: Deep Generative Models and Autoencoders                      10.3                  slides | video
31     04/03  Neural Networks: Alternative Supervision Strategies                           6.1, 6.2              slides | video
32     04/05  Neural Networks: Transfer Learning                                            Notes                 slides | video
33     04/08  Neural Networks: Alternative Activation Functions and Optimizers              5.4 - 5.6             slides | video
34     04/10  Attention and Transformers                                                    Notes                 slides | video
35     04/12  More About Transformers                                                       10.3 - 10.4           slides | video
36     04/14  Transformer Architectures                                                     Notes                 slides | video
37     04/17  Explainability in AI                                                          Notes                 slides | video
38     04/19  Trustworthiness in AI                                                         Notes                 slides | video
39     04/22  Exam No. 3 (Lectures 25 - 37)                                                 Notes                 exams
40     04/24  Applications: Human Language Technology (Sequential Decoding)                 Notes                 slides | video
41     04/26  Applications: Large Language Models and the ChatGPT Revolution                Notes                 slides | video
42     04/29  Applications: Sampling Techniques and Quantum Computing                       Notes                 slides | video
43     05/03  Final Exam (08:00 AM - 10:00 AM)                                              N/A                   exams


Please note that the exam dates above are fixed, since they have been arranged to satisfy a number of constraints. You will need to adjust your schedules, including job interviews and site visits, accordingly.

Homework:

The homework schedule is as follows:

HW  Date   Item(s)
1   01/22  Gaussian Distributions
2   01/29  Bayesian Decision Theory
3   02/05  ML and Bayesian Parameter Estimation
4   02/12  Gaussian Mixture Distribution Parameter Estimation
5   02/19  Dynamic Programming
6   02/26  Hidden Markov Models (HMMs)
7   03/11  Information Theory and Statistical Significance
8   03/18  LDA, K-Nearest Neighbors and K-Means Clustering
9   03/25  Bootstrapping, Bagging and Combining Classifiers
10  04/01  Nonparametric Classifiers, ROC Curves and AUC
11  04/08  Multilayer Perceptrons
12  04/15  Convolutional Neural Networks
13  04/22  Transformers

Each homework assignment is due by 11:59 PM on the date shown.