ITCS 6156/8156: Machine Learning
Spring 2023
Time and location: Tue, Thu 4:00 – 5:15pm, Bioinformatics 105
Instructor: Razvan Bunescu
Office: Woodward 210F
Office hours: Mon, Fri 4:00 – 5:00pm
Email: razvan.bunescu @ uncc edu

TA: Akarsh Pokkunuru
Office: Zoom & Woodward 231
Office hours: Tue, Thu 11:00 – 12:00pm
Email: apokkunu @ uncc edu
Recommended free texts:
Pattern Recognition and Machine Learning by Christopher Bishop. Springer, 2007.
The Elements of Statistical Learning: Data Mining, Inference, and Prediction by Trevor Hastie, Robert Tibshirani, and Jerome Friedman. Springer, 2009 (2nd edition).
Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville. MIT Press, 2016.
Mathematics for Machine Learning by Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong. Cambridge University Press, 2020.
Recommended free course on Mathematics for Deep Learning and Data Science:
Linear Algebra module.
Calculus module.
Probability and Statistics module.
Course description:
This course will introduce fundamental concepts, techniques, and algorithms underlying the theory and practice of machine learning (ML). Major ML models and techniques that we aim to cover include perceptrons, linear regression, logistic regression, gradient descent, Support Vector Machines, k-nearest neighbors, decision trees, ensemble methods, k-Means clustering, neural networks, and generative adversarial networks. The description of the formal properties of the algorithms will be supplemented with motivating applications in a wide range of areas, including natural language processing, computer vision, bioinformatics, and music analysis.
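To give a flavor of the first model covered in the course, here is a minimal sketch of the perceptron training rule in NumPy. This is an illustrative example only, not code from the course materials; the toy dataset (an AND-style problem with labels in {-1, +1}) and the function name are choices made for this sketch.

```python
import numpy as np

def train_perceptron(X, y, epochs=10, lr=1.0):
    """Train a binary perceptron with labels y in {-1, +1}.

    On each misclassified example (y_i * (w.x_i + b) <= 0), the weights
    are nudged toward the correct side: w += lr * y_i * x_i.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:  # misclassified (or on the boundary)
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy linearly separable data: output is +1 only when both inputs are 1.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])

w, b = train_perceptron(X, y)
preds = np.sign(X @ w + b)  # recovers y on this separable toy set
```

Because the data are linearly separable, the perceptron convergence theorem guarantees the loop stops making updates after finitely many mistakes.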
Prerequisites:
Students are expected to be comfortable with programming in Python, data structures and algorithms (ITSC 2214), and have basic knowledge of linear algebra (MATH 2164), calculus, and statistics. Relevant background material will be made available on this website throughout the course.
Lecture notes:
- Syllabus & Introduction with Perceptrons
- Linear algebra and optimization in NumPy and SciPy
- Linear regression, ridge regression, and Lasso
- Maximum Likelihood and Maximum A Posteriori estimation: Sections 2.1 and 2.2
- Gradient Descent algorithms
- Kernel methods
- Logistic regression
- Machine learning and optimization in PyTorch
- Feed-Forward Neural Networks and Backpropagation
- Representation Learning I
- Convolutional Neural Networks
- Recurrent Neural Networks and Attention
- Transformer: Self-Attention Networks
- Nearest Neighbor methods
- Decision Trees
- Ensemble Methods
- Boosted decision trees and Random Forests
Homework assignments:
- Assignment 0 on background material
- Assignment 1 on Perceptrons
- Assignment 2 on Linear regression (normal equations)
- Assignment 3 on Linear regression (gradient descent)
- Assignment 4 on Kernel Perceptrons and SVMs
- Assignment 5 on Logistic regression in NumPy and PyTorch
- Assignment 6 on Neural Networks in NumPy and PyTorch
- Assignment 7 on PCA and Auto-Encoders
- Assignment 8 on CNNs for digit classification
- Assignment 9 on RNNs for sentiment analysis
Paper presentation:
Final project:
Supplemental ML materials:
Background reading materials:
- Python programming:
- Probability and statistics:
- Linear Algebra:
- Calculus:
Machine learning software: