

CS 6140: Machine Learning


GENERAL INFORMATION

  • Instructor: Prof. Ehsan Elhamifar
  • Instructor Office Hours: Mondays 4:30pm—5:30pm, 310E WVH
  • Class: Mondays and Wednesdays 14:50—16:30, Behrakis Health Sciences Center 320
  • TA1: Aanchal Anil Samdariya (samdariya.a [at] husky.neu.edu), Office Hours: Tuesdays, 11am-12pm, 132H Nightingale Hall
  • TA2: Jieyu Sheng (sheng.j [at] husky.neu.edu), Office Hours: Fridays, 11am-12pm, 132H Nightingale Hall
  • Discussions, Lectures, Homeworks on Piazza
    DESCRIPTION

    This course covers practical algorithms and the theory for machine learning from a variety of perspectives. Topics include supervised learning (generative/discriminative learning, parametric/non-parametric learning, deep neural networks, support vector machines) and unsupervised learning (clustering, dimensionality reduction). The course will also discuss recent applications of machine learning in computer vision, data mining, natural language processing and robotics.

    PREREQUISITES

    Introduction to Probability and Statistics, Linear Algebra, Algorithms.

    SYLLABUS
    1. Supervised Learning

      • Linear regression, overfitting, regularization, sparsity

      • Logistic regression

      • Naive Bayes

      • Neural networks and deep learning: DNNs, CNNs, RNNs

      • SVM, Perceptron and kernels

      • Decision trees and instance-based learning

    2. Unsupervised Learning

      • Clustering: k-means, spectral clustering

      • Dimensionality reduction: PCA, Kernel PCA, Autoencoders

      • Expectation Maximization
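    As a rough illustration of the clustering topic above (not part of the official course material), the following is a minimal sketch of k-means (Lloyd's algorithm) in Python on made-up, well-separated data; all names and numbers are hypothetical.

```python
import numpy as np

# Hypothetical toy data: two well-separated Gaussian blobs of 50 points each.
rng = np.random.default_rng(4)
X = np.vstack([rng.standard_normal((50, 2)) + [0, 0],
               rng.standard_normal((50, 2)) + [10, 10]])

k = 2
# Deterministic seeding for the sketch: one point from each region.
centers = X[[0, 50]].astype(float)

for _ in range(20):
    # Assignment step: each point goes to its nearest center.
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    # Update step: each center moves to the mean of its cluster.
    centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
```

    On data this well separated, the assignments stabilize after the first iteration.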

    GRADING

    Homeworks are due at the beginning of the class on the specified dates. No late homeworks or projects will be accepted.

    • Homeworks: 4 HWs (40%)

    • Project (30%)

    • Final Exam (30%)

    Homeworks consist of both analytical questions and programming assignments. Collaboration on HWs is not allowed unless specified. Programming assignments must be done in Python. Both the code and the results of running it on the data must be submitted.

    Exams consist of analytical questions from topics covered in the class. Students are allowed to bring a single cheat sheet to the exam.

    TEXTBOOKS

    • [CB] Christopher Bishop, Pattern recognition and machine learning. [Required]

    • [KM] Kevin P. Murphy, Machine Learning: A Probabilistic Perspective. [Optional]

    • [KF] Daphne Koller and Nir Friedman, Probabilistic Graphical Models. [Optional]

    READINGS

    Lecture 1: Introduction to ML, Linear Algebra Review

    Lecture 2: Introduction to Regression, Convex Functions and Optimality

    • Chapter 3 from CB book.

    Lecture 3: Linear Regression: Closed-form Solution, Gradient Descent and SGD, Basis Function Expansion

    • Chapter 3 from CB book.
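    As a rough sketch of what this lecture covers (not official course material), the closed-form least-squares solution and batch gradient descent can be compared on a tiny synthetic problem; the data and step size below are hypothetical.

```python
import numpy as np

# Hypothetical data: noise-free targets from y = 1 + 2x.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 1))
X_aug = np.hstack([np.ones((100, 1)), X])  # prepend a bias column
y = X_aug @ np.array([1.0, 2.0])

# Closed-form solution via the normal equations: (X^T X) w = X^T y.
w_closed = np.linalg.solve(X_aug.T @ X_aug, X_aug.T @ y)

# Batch gradient descent on the mean squared error.
w = np.zeros(2)
lr = 0.1
for _ in range(500):
    grad = X_aug.T @ (X_aug @ w - y) / len(y)
    w -= lr * grad
```

    Both routes should recover essentially the same weight vector on this problem; SGD would replace the full-batch gradient with a per-sample (or mini-batch) gradient.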

    Lecture 4: Robust Regression, Overfitting, Regularization

    • Chapter 3 from CB book.

    Lecture 5: Hyper-parameter Tuning, Cross Validation, Probability Review

    Lecture 6: Maximum Likelihood Estimation, Maximum A Posteriori (MAP) Estimation

    • Chapters 2 and 3 from CB book.
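    As a rough numerical illustration of MLE vs. MAP (not official course material), consider estimating the mean of a Gaussian with known variance under a Gaussian prior; all values below are hypothetical.

```python
import numpy as np

# Hypothetical setup: data from N(2, sigma2) with known variance sigma2,
# and an assumed N(mu0, tau2) prior on the unknown mean.
rng = np.random.default_rng(3)
sigma2 = 1.0   # known observation variance
tau2 = 0.5     # assumed prior variance
mu0 = 0.0      # assumed prior mean
x = rng.normal(loc=2.0, scale=np.sqrt(sigma2), size=50)

# MLE: the sample mean maximizes the likelihood.
mu_mle = x.mean()

# MAP: the posterior mean shrinks the sample mean toward the prior mean mu0.
n = len(x)
mu_map = (n / sigma2 * mu_mle + mu0 / tau2) / (n / sigma2 + 1 / tau2)
```

    With more data (larger n), the prior term is dominated and the MAP estimate approaches the MLE.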

    Lecture 7: Classification, Logistic Regression, Parameter Learning via Maximum Likelihood

    • Chapter 4.3 from CB book.
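    As a rough sketch of this lecture's content (not official course material), logistic regression can be fit by gradient ascent on the log-likelihood; the data below is synthetic and the true weight vector is hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical linearly separable data labeled by a known weight vector.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 2))
true_w = np.array([1.5, -2.0])
y = (X @ true_w > 0).astype(float)

# Gradient ascent on the log-likelihood: gradient is X^T (y - sigma(Xw)).
w = np.zeros(2)
lr = 0.5
for _ in range(200):
    w += lr * X.T @ (y - sigmoid(X @ w)) / len(y)

preds = (sigmoid(X @ w) > 0.5).astype(float)
accuracy = (preds == y).mean()
```

    Note that on perfectly separable data the weight norm keeps growing; in practice a regularizer (Lecture 8's overfitting discussion) keeps it bounded.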

    Lecture 8: Softmax Regression, Overfitting, Discriminative vs. Generative Modeling, Generative Classification

    • Chapter 4.2 from CB book.

    Lecture 9: Generative Classification, Naive Bayes

    • Chapter 4.2 from CB book.

    Lecture 10: Convex Optimization, Lagrangian Function, KKT Conditions

    • See lecture notes on Piazza.

    Lecture 11: Support Vector Machines: Vanilla SVM, Dual SVM

    • Chapter 7 from CB book.

    Lecture 12: Project pitch

    Lecture 13: Support Vector Machines: Soft-Margin SVM, Kernel SVM, Multi-Class SVM

    • Chapter 7 from CB book.

    Lecture 14: Dimensionality Reduction: Principal Component Analysis

    • Chapter 12 from CB book.
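    As a rough sketch of this lecture's content (not official course material), PCA can be computed from the eigendecomposition of the sample covariance matrix; the anisotropic toy data below is hypothetical.

```python
import numpy as np

# Hypothetical 2-D data with most variance along the first axis.
rng = np.random.default_rng(2)
Z = rng.standard_normal((500, 2)) * np.array([3.0, 0.5])
X = Z - Z.mean(axis=0)  # center the data first

# Eigendecomposition of the sample covariance matrix.
cov = X.T @ X / len(X)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

# The top principal component is the eigenvector with the largest eigenvalue.
pc1 = eigvecs[:, -1]

# Project onto the top component: dimensionality reduction from 2-D to 1-D.
X_reduced = X @ pc1
```

    Here the top component should align with the high-variance first axis; Kernel PCA replaces the covariance eigendecomposition with an eigendecomposition of a kernel matrix.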

    Lecture 15: Neural Networks

    • Chapter 5 from CB book.

    Lecture 16: Neural Networks: Training, Forward and Back Propagation

    • Chapter 5 from CB book.
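    As a rough sketch of forward and back propagation (not official course material), the following trains a one-hidden-layer network with manual gradients on a toy 1-D regression task; the architecture, data, and step size are all hypothetical.

```python
import numpy as np

# Hypothetical toy task: regress y = sin(3x) with a tanh hidden layer.
rng = np.random.default_rng(5)
X = rng.uniform(-1, 1, (64, 1))
y = np.sin(3 * X)

W1 = rng.standard_normal((1, 8)) * 0.5
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.5
b2 = np.zeros(1)
lr = 0.1

def forward(X):
    h = np.tanh(X @ W1 + b1)   # hidden activations
    return h, h @ W2 + b2      # linear output layer

_, out0 = forward(X)
loss0 = ((out0 - y) ** 2).mean()  # initial mean squared error

for _ in range(500):
    h, out = forward(X)
    # Backward pass: apply the chain rule layer by layer.
    d_out = 2 * (out - y) / len(X)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T * (1 - h ** 2)  # derivative of tanh is 1 - tanh^2
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)
    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

_, out1 = forward(X)
loss1 = ((out1 - y) ** 2).mean()  # loss after training
```

    The key point is that all gradients are computed from one forward pass before any parameter is updated, exactly the discipline that automatic differentiation frameworks enforce for deep networks.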

    ADDITIONAL RESOURCES

    ETHICS

    All students in the course are subject to Northeastern University's Academic Integrity Policy. Any report, homework, or project submitted by a student in this course for academic credit must be the student's own work. Collaboration is only allowed if explicitly permitted. Per CCIS policy, violations of the rules, including cheating, fabrication, and plagiarism, will be reported to the Office of Student Conduct and Conflict Resolution (OSCCR). This may result in deferred suspension, suspension, or expulsion from the university.