ITCS 4156: Introduction to Machine Learning
Spring 2021


Time and Location: Wed 5:30 – 8:15pm, Online
Instructor: Razvan Bunescu
Office: Woodward Hall 210F
Office Hours: Tue, Thu 4:00 – 5:00pm on Zoom, or by email appointment
Email: rbunescu @ uncc edu

Teaching Assistant: Priyanka Jadhav (pjadhav7 @ uncc edu)
Office Hours: Mon, Wed 2:00 – 3:00 pm on Zoom, or by email appointment

Textbook: There is no required textbook for this class. Slides and supplementary materials will be made available on the course website.

Supplementary Texts:
  • Pattern Recognition and Machine Learning by Christopher Bishop. Springer, 2007.
  • The Elements of Statistical Learning: Data Mining, Inference, and Prediction by T. Hastie, R. Tibshirani, & J. H. Friedman. Springer, 2009-2017.
  • Reinforcement Learning: An Introduction by Richard S. Sutton and Andrew G. Barto. MIT Press, 2018.
  • Machine Learning: The Art and Science of Algorithms that Make Sense of Data by Peter Flach. Cambridge University Press, 2012.
  • A Course in Machine Learning by Hal Daume III.
  • Machine Learning by Tom Mitchell. McGraw Hill, 1997.

Course description:
This course will give an overview of the main concepts, techniques, and algorithms underlying the theory and practice of machine learning. The course will cover the fundamental topics of regression, classification, clustering, and reinforcement learning, along with corresponding learning models such as linear regression, perceptrons, logistic regression, neural networks, Naive Bayes, nearest neighbors, decision trees, k-Means, and Q-learning. The description of the formal properties of the algorithms will be supplemented with motivating applications in a wide range of areas, including natural language processing, computer vision, bioinformatics, and music analysis.

Prerequisites:
Students are expected to be comfortable with programming and familiar with basic concepts in linear algebra, calculus, and statistics. Relevant background material is made available on the course website (see Background reading materials below).

Lecture notes:
1. Syllabus & Introduction
2. Linear regression, Ridge regression, and Lasso
3. Linear algebra and optimization in NumPy and SciPy
4. Gradient Descent algorithms
5. Perceptrons and Kernels
6. Support Vector Machines
7. Logistic regression, Maximum likelihood, and Maximum entropy
8. Machine learning and optimization in PyTorch
9. Neural Networks
10. Naive Bayes
11. Nearest Neighbor methods
12. Decision Trees and Random Forests
13. Clustering
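
The following is a minimal illustrative sketch, in NumPy, of ordinary least squares solved via the normal equations, in the spirit of topics 2-4 above. It is not course-provided code, and the function and variable names are placeholders chosen for the example.

    # Illustrative sketch only: fit a linear model by solving the normal equations.
    import numpy as np

    def fit_linear_regression(X, y):
        """Return weights w (bias term first) minimizing the squared error ||X_b w - y||^2."""
        X_b = np.hstack([np.ones((X.shape[0], 1)), X])   # prepend a column of ones for the bias
        # Normal equations: (X_b^T X_b) w = X_b^T y; np.linalg.lstsq is more stable in practice
        return np.linalg.solve(X_b.T @ X_b, X_b.T @ y)

    # Usage on synthetic data generated as y = 3 + 2x + noise
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 10, size=(100, 1))
    y = 3.0 + 2.0 * X[:, 0] + rng.normal(scale=0.5, size=100)
    print(fit_linear_regression(X, y))                   # approximately [3, 2]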

Homework assignments:
1. Linear regression (normal equations)
2. Linear regression (gradient descent)
3. Perceptron
4. Kernel Perceptron
5. Text Classification with Perceptrons
6. Logistic regression in NumPy
7. Logistic regression in PyTorch
8. Neural networks in PyTorch
9. ML theory exercises

Note: The spam classification exercise and the flower and spiral datasets are adapted from homework assignments developed by Andrew Ng.
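
As a second illustration, here is a small sketch of the classic perceptron learning rule for labels in {-1, +1}, roughly the kind of model trained in assignments 3-5 above. It is not the assignment starter code; the toy dataset and names are made up for the example.

    # Illustrative sketch only: perceptron training with the mistake-driven update rule.
    import numpy as np

    def train_perceptron(X, y, epochs=10):
        """Learn (w, b); on each mistake, update w += y_i * x_i and b += y_i."""
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            for x_i, y_i in zip(X, y):
                if y_i * (np.dot(w, x_i) + b) <= 0:      # misclassified (or on the boundary)
                    w += y_i * x_i
                    b += y_i
        return w, b

    # Usage on a linearly separable toy set (label is the sign of x1 + x2 - 1)
    X = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
    y = np.array([-1, 1, 1, 1])
    w, b = train_perceptron(X, y)
    print(np.sign(X @ w + b))                            # matches y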


Background reading materials:

Machine learning software: