| Lecture | MWF: 10:00 - 10:50 AM (ENGR 308 / Online) |
| Lecturer | Joseph Picone, Professor |
| Office | ENGR 718 |
| Office Hours | MWF: 08:00 - 10:00 AM |
| Phone | 215-204-4841 (desk); 708-848-2846 (cell - preferred) |
| Email | picone@temple.edu |
| Zoom | picone@temple.edu or joseph.picone@gmail.com |
| Communication | tu_ece8527@listserv.temple.edu |
| Website | http://www.isip.piconepress.com/courses/temple/ece_8527 |
| Required Textbooks |
A. Lindholm, N. Wahlstrom, F. Lindsten and T. Schon, Machine Learning: A First Course for Engineers and Scientists, Cambridge University Press, New York, New York, USA, ISBN: 978-1-108-84360-7, pp. 338, 2022. URL: http://smlbook.org/book/sml-book-draft-latest.pdf
I. Drori, The Science of Deep Learning, Cambridge University Press, New York, New York, USA, ISBN: 978-1-108-83508-4, pp. 339, 2023. URL: https://www.thescienceofdeeplearning.org/ |
| Reference Textbooks |
There are many textbooks available online. Most modern textbooks focus on neural networks or deep learning. While these are important topics, our goal in this course is to give you a broad perspective of the field. Technology changes quickly, so you need a good background in the fundamentals and an understanding of how to do more than just "button push." Those wishing to build their theoretical background in this area will find this book useful: |
| Prerequisites |
ECE 8527/5110: ENGR 5022 (minimum grade: B-), ENGR 5033 (minimum grade: B-)
ECE 4527: ECE 3512 (minimum grade: C-), ECE 3522 (minimum grade: C-) |
Pattern recognition theory and practice is concerned with the
design, analysis, and development of methods for the
classification or description of patterns, objects, signals, and
processes. At the heart of this discipline is our ability to infer
the statistical behavior of data from limited data sets, and to
assign data to classes based on generalized notions of distances
in a probabilistic space. Many commercial applications of pattern
recognition exist today, including voice recognition (e.g., Amazon
Alexa), fingerprint classification (e.g., MacBook Pro touch bar),
and retinal scanners (e.g., your favorite cheesy sci-fi movie).
Machine learning is a field that is at least 50 years old. Recent
advances in deep learning, starting around 2005, have
revolutionized the field. Today, machine learning is one
of the most active areas of engineering and is enjoying
unprecedented levels of success. However, to understand why
these techniques work, we must build a background in traditional
pattern recognition and machine learning concepts such as
maximum likelihood decision theory, Bayesian estimation,
nonparametric methods such as decision trees and support
vector machines, and temporal modeling approaches such as
hidden Markov models. This course is designed to give you
a strong background in the fundamentals, yet also introduce you
to the tools necessary to implement these algorithms.
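As a concrete illustration of the Bayes decision rule mentioned above, the sketch below implements a two-class decision with Gaussian class-conditional densities. This is not part of the course materials; the means, variances, and priors are assumed values chosen only for illustration.

```python
# A minimal sketch of the Bayes decision rule for two classes:
# choose the class w that maximizes p(x|w) * P(w).
# The Gaussian parameters and priors are illustrative assumptions only.
from scipy.stats import norm

priors = {"w1": 0.6, "w2": 0.4}                 # P(w): class priors
likelihoods = {"w1": norm(loc=0.0, scale=1.0),  # p(x|w1) ~ N(0, 1)
               "w2": norm(loc=2.0, scale=1.0)}  # p(x|w2) ~ N(2, 1)

def bayes_decide(x):
    """Return the class with the largest unnormalized posterior p(x|w) * P(w)."""
    return max(priors, key=lambda w: likelihoods[w].pdf(x) * priors[w])

for x in (0.5, 1.8):
    print(f"x = {x}: decide {bayes_decide(x)}")  # 0.5 -> w1, 1.8 -> w2
```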
| Exam No. 1 | 10% |
| Exam No. 2 | 10% |
| Exam No. 3 | 10% |
| Final Exam | 30% |
| Computer Homework | 40% |
| TOTAL: | 100% |
| Topic | Reading |
| Introduction: An Overview of Machine Learning | (LWLS) 1.1 - 1.3 |
| What Problem Are We Trying To Solve? | (LWLS) 1.1 - 1.3 |
| Machine Learning Demonstrations | IMLD |
| Decision Theory: Bayes Rule | (DHS) 2.1 - 2.12 |
| Decision Theory: Two-Category Classification | (LWLS) 9.1, 9.2 |
| Applications of Bayesian Decision Theory | (LWLS) 9.3 - 9.6 |
| Maximum Likelihood Parameter Estimation | (DHS) 3.1, 3.2 |
| Bayesian Parameter Estimation | (DHS) 3.1 - 3.3 |
| Discriminant Analysis | (LWLS) 10.1, (DHS) 3.6 - 3.10 |
| The Expectation Maximization (EM) Theorem | (FJ) 2.7 |
| Hidden Markov Models: Introduction | (JD) 12.1 - 12.5 |
| Hidden Markov Models: Evaluation | (RJ) 6.1 - 6.16 |
| Exam No. 1 (Lectures 01 - 11) | Notes |
| Review: Exam No. 1 | Notes |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| Exam No. 2 (Lectures 10 - 24) | Notes |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| TBD | |
| Exam No. 3 (Lectures 25 - 37) | Notes |
| TBD | |
| TBD | |
| TBD | |
| Final Exam (08:00 AM - 10:00 AM) | N/A |
| Gaussian Distributions |
| Bayesian Decision Theory |
| Probability of Error Computations |
| ML vs. Bayesian Parameter Estimation |
| Dynamic Programming |
| Hidden Markov Models (HMMs) |
| Information Theory and Statistical Significance |
| LDA, K-Nearest Neighbors and K-Means Clustering |
| Bootstrapping, Bagging and Combining Classifiers |
| Nonparametric Classifiers, ROC Curves and AUC |
| Multilayer Perceptrons |
| Convolutional Neural Networks |
| Transformers |