ITCS 6156/8156: Machine Learning
Spring 2023

Time and location: Tue, Thu 4:00 – 5:15pm, Bioinformatics 105

Instructor & TAs:   Razvan Bunescu     Akarsh Pokkunuru
Office:   Woodward 210F   Zoom & Woodward 231
Office hours:   Mon, Fri 4:00 – 5:00pm   Tue, Thu 11:00am – 12:00pm
Email:   razvan.bunescu@uncc.edu   apokkunu@uncc.edu

Recommended free texts:
  • Pattern Recognition and Machine Learning by Christopher Bishop. Springer, 2007.
  • The Elements of Statistical Learning: Data Mining, Inference, and Prediction by T. Hastie, R. Tibshirani, and J. H. Friedman. Springer, 2009-2017.
  • Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville. MIT Press, 2016.
  • Mathematics for Machine Learning by Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong. Cambridge University Press, 2020.

Recommended free course on Mathematics for Deep Learning and Data Science:
  • Linear Algebra module.
  • Calculus module.
  • Probability and Statistics module.

  Course description:
    This course will introduce fundamental concepts, techniques, and algorithms underlying the theory and practice of machine learning (ML). Major ML models and techniques that we aim to cover include: perceptrons, linear regression, logistic regression, gradient descent, Support Vector Machines, k-nearest neighbors, decision trees, ensemble methods, k-Means clustering, neural networks, and generative adversarial networks. The description of the formal properties of the algorithms will be supplemented with motivating applications in a wide range of areas, including natural language processing, computer vision, bioinformatics, and music analysis.

    Students are expected to be comfortable with programming in Python, data structures and algorithms (ITSC 2214), and have basic knowledge of linear algebra (MATH 2164), calculus, and statistics. Relevant background material will be made available on this website throughout the course.
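    As a taste of the level of Python expected, here is a minimal sketch of the classic perceptron learning rule mentioned in the course description (the data and function names below are illustrative, not course material):

    ```python
    def perceptron_train(examples, epochs=10):
        """Train a perceptron on (features, label) pairs with labels in {-1, +1}.

        Classic update rule: on a misclassified example, add y * x to the
        weights and y to the bias.
        """
        n = len(examples[0][0])
        w = [0.0] * n
        b = 0.0
        for _ in range(epochs):
            for x, y in examples:
                activation = sum(wi * xi for wi, xi in zip(w, x)) + b
                if y * activation <= 0:  # misclassified (or on the boundary)
                    w = [wi + y * xi for wi, xi in zip(w, x)]
                    b += y
        return w, b

    def perceptron_predict(w, b, x):
        """Predict +1 or -1 from the sign of the learned linear score."""
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

    # Toy linearly separable data (labels follow the sign of x1 - x2).
    data = [([2.0, 1.0], 1), ([1.0, 3.0], -1), ([4.0, 0.5], 1), ([0.0, 2.0], -1)]
    w, b = perceptron_train(data)
    ```

    On linearly separable data like the toy set above, the perceptron is guaranteed to converge to weights that classify every training example correctly.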

    Lecture notes:
    1. Syllabus & Introduction with Perceptrons
    2. Linear algebra and optimization in NumPy and SciPy
    3. Linear regression, ridge regression, and Lasso
    4. Maximum Likelihood and Maximum A Posteriori estimation: Sections 2.1 and 2.2
    5. Gradient Descent algorithms
    6. Kernel methods:
    7. Logistic regression
    8. Machine learning and optimization in PyTorch
    9. Feed-Forward Neural Networks and Backpropagation
    10. Representation Learning I:
    11. Convolutional Neural Networks
    12. Recurrent Neural Networks and Attention
    13. Transformer: Self-Attention Networks
    14. Nearest Neighbor methods
    15. Decision Trees
    16. Ensemble Methods
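
    To illustrate how two of the topics above fit together, here is a minimal sketch of batch gradient descent (lecture 5) fitting a one-variable linear regression (lecture 3). The toy data and hyperparameters are illustrative assumptions, not part of the course:

    ```python
    def gd_linear_regression(xs, ys, lr=0.02, steps=5000):
        """Fit y ~ w*x + b by batch gradient descent on mean squared error."""
        w, b = 0.0, 0.0
        n = len(xs)
        for _ in range(steps):
            # Gradients of (1/n) * sum((w*x + b - y)^2) w.r.t. w and b
            grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
            grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
            w -= lr * grad_w
            b -= lr * grad_b
        return w, b

    # Noise-free data generated from y = 3x + 1, so GD should recover w ~ 3, b ~ 1.
    xs = [0.0, 1.0, 2.0, 3.0, 4.0]
    ys = [1.0, 4.0, 7.0, 10.0, 13.0]
    w, b = gd_linear_regression(xs, ys)
    ```

    The learning rate must be small enough for the updates to converge; with this data, lr=0.02 is well within the stable range.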

    Homework assignments:
    Paper presentation:
    Final project:
    Supplemental ML materials:
    Background reading materials:
    Machine learning software: