ITCS 5356: Introduction to Machine Learning
Fall 2024
Time and location: Tue, Thu 4:00 – 5:15pm, EPIC 3222
Instructor & TAs: Razvan Bunescu | Youssef Ait Alama
Office: Woodward 410G | Burson 239B
Office hours: Tue, Thu 5:30 – 6:30pm | Mon, Wed 12:00 – 1:00pm
Email: razvan.bunescu @ charlotte edu | yaitalam @ charlotte edu
Course description:
This course introduces fundamental concepts and algorithms underlying the theory and practice of Machine Learning (ML). Major ML models and techniques we aim to cover include: the perceptron, k-nearest neighbors, linear regression, gradient descent, Naive Bayes, logistic regression, neural networks, and reinforcement learning. The descriptions of ML models will be supplemented with introductions to relevant foundational concepts in linear algebra, probability theory, and optimization.
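As a preview of the first model on the list, here is a minimal sketch of the simple perceptron in NumPy. The function name, toy data, and epoch count are illustrative choices, not taken from the course materials:

```python
import numpy as np

def perceptron_train(X, y, epochs=20):
    """Train a simple perceptron; labels in y must be in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Update only when the current weights misclassify xi.
            if yi * (np.dot(w, xi) + b) <= 0:
                w += yi * xi
                b += yi
    return w, b

# Linearly separable toy data: logical AND with {-1, +1} labels.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y)
preds = np.sign(X @ w + b)
```

For linearly separable data like this, the perceptron convergence theorem guarantees the update loop stops making mistakes after a finite number of passes.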
Prerequisites:
Students are expected to be comfortable with programming in Python and with data structures and algorithms, and to have a basic knowledge of mathematics. Review material will be made available on Canvas and on this website throughout the course.
Recommended free texts:
Pattern Recognition and Machine Learning by Christopher Bishop. Springer, 2007.
An Introduction to Statistical Learning with Python by James, Witten, Hastie, and Tibshirani. Springer, 2023.
Dive into Deep Learning by Zhang, Lipton, Li, and Smola. Amazon, 2019.
Mathematics for Machine Learning by Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong. Cambridge University Press, 2020.
Lecture notes:
- Syllabus & Introduction with the simple Perceptron
- Programming with Python
- Basic linear algebra
- NumPy for linear algebra
- k-Nearest Neighbors for classification and regression
- Differentiation and optimization
- Notes from lecture on Sep 24: one and two.
- Notes from lecture on Sep 26: one, two, and three.
- Symbolic differentiation with SymPy.
- Automatic differentiation with Autograd.
- Linear regression and ordinary least squares
- Gradient descent and least mean squares
- Curve fitting and regularization
- Notes from lecture on Oct 10: one and two.
- Notes from lecture on Oct 17: one and two.
- Notes from lecture on Oct 22: one and two.
- Kernel Perceptron and Averaged Perceptron
- Probability theory
- Maximum Likelihood Estimation principle
- Naive Bayes
- Logistic regression
- Multinomial softmax with temperature: notebook and pdf.
- Notes from lecture on Nov 12: one and two.
- Notes from lecture on Nov 14: one, two, three, and four.
- Notes from lecture on Nov 19: one and two.
- Computation graphs and automatic differentiation in PyTorch
- Multilayer Perceptrons, Backpropagation, and Deep Learning
- Reinforcement learning
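The symbolic differentiation topic above can be previewed in a few lines of SymPy; the function below is an illustrative example, not one from the lecture notes:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x) * sp.exp(x)

# d/dx [sin(x) * e^x] = (sin(x) + cos(x)) * e^x by the product rule.
df = sp.diff(f, x)

# Evaluate the derivative at x = 0: sin(0) + cos(0) = 1.
val = df.subs(x, 0)
```

Automatic differentiation tools such as Autograd and PyTorch compute the same derivative values numerically by propagating them through a computation graph, without ever constructing the symbolic expression.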
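The softmax-with-temperature topic above can also be sketched directly in NumPy. The implementation below is a generic illustration, assuming the standard definition softmax(z/T); it is not taken from the course notebook:

```python
import numpy as np

def softmax(z, T=1.0):
    """Softmax with temperature T: lower T sharpens the distribution,
    higher T flattens it toward uniform."""
    z = np.asarray(z, dtype=float) / T
    z -= z.max()          # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = [2.0, 1.0, 0.1]
p_sharp = softmax(logits, T=0.5)   # more mass on the largest logit
p_flat = softmax(logits, T=5.0)    # closer to uniform
```

At any temperature the output sums to 1; as T grows the probabilities approach uniform, and as T shrinks toward 0 they approach a one-hot vector on the argmax.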
Homework assignments:
Final project:
Supplemental ML materials:
Background review materials:
- Python programming:
- Probability and statistics:
- Linear algebra:
- Calculus:
Machine learning software: