
CS 7180: Special Topics in AI: High-Dimensional Data Analysis via Parsimonious Modeling and Probabilistic Graphical Models


  • Instructor: Prof. Ehsan Elhamifar

  • Office Hours: By appointment, 310E WVH

  • Class: Tuesdays and Fridays 1:35pm–3:15pm, Snell Library 115

  • Discussions, Lectures, Homeworks on Piazza


Course Description

The increasing amounts of high-dimensional data across science and engineering disciplines require efficient algorithms for uncovering the structure in data as well as for inference, learning, and decision making. The course covers state-of-the-art algorithms for modeling and analysis of high-dimensional datasets with low-dimensional structures. The first part of the course covers parsimonious modeling, including algorithms and theory of sparse and low-rank representation and recovery, along with their applications in supervised and unsupervised learning problems such as classification, clustering, subset selection, and dictionary learning. The second part covers probabilistic graphical models, including learning and inference (variable elimination, message passing, sampling, dual decomposition, variational methods) in Bayesian Networks and Markov Random Fields. The course also covers applications of these methods in computer vision, robotics, speech and language processing, and healthcare.


Prerequisites: Machine Learning, Introduction to Probability, and Linear Algebra.

Topics

  1. Probabilistic Graphical Models (PGMs) for High-Dimensional Data Analysis

    • Representation and Independencies: Directed PGMs / Bayesian Networks, Undirected PGMs / Markov Random Fields, Conditional Random Fields, Template-Based Models and Gaussian Networks

    • Inference: Exact and Approximate Methods: Variable Elimination, Clique-Tree, Message Passing, Dual Decomposition, Variational Techniques, (Loopy) Belief Propagation, Particle-Based Sampling and MCMC

    • Learning: Learning as Optimization, MLE and MAP, Structure Learning

  2. Parsimonious Modeling for High-Dimensional Data Analysis

    • Applications: Matrix Completion and Factorization, Robust Principal Component Analysis, Subspace Clustering, Subset Selection and Summarization, Dictionary Learning

    • Algorithms: Convex Algorithms (Proximal Methods, Alternating Direction Methods, Non-smooth Convex Optimization), Greedy Algorithms (Matching Pursuit, Orthogonal Matching Pursuit, Subspace Pursuit), Homotopy Algorithms

    • Theory: Properties of Norms in High-Dimensions, Coherence and Null-Space Property, Restricted Isometry Property (RIP) and Rank-RIP
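To make the greedy algorithms listed above concrete, here is a minimal NumPy sketch of Orthogonal Matching Pursuit (OMP). This is an illustrative sketch only, not course code; the function name, dimensions, and demo parameters are all assumptions chosen for the example.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily build a k-sparse x with A x ≈ y.

    A: (m, n) dictionary with unit-norm columns; y: (m,) measurement; k: sparsity.
    """
    m, n = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    for _ in range(k):
        # Pick the dictionary column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit coefficients on the selected support by least squares
        # (the "orthogonal" step that distinguishes OMP from plain MP).
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

# Tiny demo: recover a 2-sparse vector from noiseless random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
A /= np.linalg.norm(A, axis=0)          # normalize columns to unit norm
x_true = np.zeros(50)
x_true[[7, 31]] = [1.5, -2.0]
x_hat = omp(A, A @ x_true, k=2)
```

The least-squares re-fit at each step keeps the residual orthogonal to all selected atoms, which is why OMP never re-selects an atom in the noiseless case.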


Grading

  • Homeworks (45%): include both analytical and programming questions

  • Final project (40%)

  • Class Participation and Presentations (15%)


Textbooks

  • [ME] Michael Elad, Sparse and Redundant Representations: From Theory to Applications in Signal and Image Processing, Springer, 2010.

  • [KF] Daphne Koller and Nir Friedman, Probabilistic Graphical Models, MIT Press, 2009.


Schedule

    Lecture 1: Introduction to PGMs and Parsimonious Modeling

    • See lecture slides on Piazza.

    Lecture 2: Bayesian Networks (BNs), Representation, Conditional Independencies

    • Chapter 3 from KF book.

    Lecture 3: Markov Random Fields (MRFs), Representation, Conditional Independencies, Converting BNs to MRFs

    • Chapter 4 from KF book.

    Lecture 4: Log-Linear Models, Metric MRFs, Inference in PGMs, Variable Elimination Algorithm

    • Chapters 4 and 9 from KF book.

    Lecture 5: Variable Elimination Algorithm, Message Passing Algorithm, Cluster Graphs

    • Chapters 9 and 10 from KF book.

    Lecture 6: Cluster Graphs, Belief Propagation Algorithm, Cluster (Junction) Trees

    • Chapter 10 from KF book.

    Lecture 7: Cluster (Junction) Trees, Conditional Independencies, Loopy Belief Propagation, Calibration, Scheduling and Convergence

    • Chapter 10 from KF book.

    Lecture 8: MAP Estimation, Max Product Variable Elimination, Max Product Belief Propagation, Factor Graphs

    • Chapter 13 from KF book.

    Lecture 9: Particle-Based Sampling: Forward Sampling in Bayesian Nets, Conditional Probability Queries, Sampling Bounds, Rejection Sampling

    • Chapter 12 from KF book.

    Lecture 10: Particle-Based Sampling: Markov Chain, Invariant Distribution, Regularity, Markov Chain Monte Carlo (MCMC) Sampling

    • Chapter 12 from KF book.

    Lecture 11: Particle-Based Sampling: Metropolis-Hastings Algorithm, Gibbs Sampling

    • Chapter 12 from KF book.
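As a concrete illustration of the Gibbs sampler, here is a minimal sketch on a small Ising-chain MRF. The model, coupling strength, and sample counts are assumptions made for this example, not course material.

```python
import numpy as np

def gibbs_ising_chain(n_nodes=5, coupling=0.8, n_samples=5000, burn_in=500, seed=0):
    """Gibbs sampling for a small Ising chain with p(x) ∝ exp(J * sum_i x_i x_{i+1}).

    Each sweep resamples every spin x_i ∈ {-1, +1} from its full conditional,
    which depends only on its neighbors in the chain (the Markov blanket).
    """
    rng = np.random.default_rng(seed)
    x = rng.choice([-1, 1], size=n_nodes)
    samples = []
    for t in range(burn_in + n_samples):
        for i in range(n_nodes):
            # Sum of neighboring spins (left and right neighbors in the chain).
            field = 0.0
            if i > 0:
                field += x[i - 1]
            if i < n_nodes - 1:
                field += x[i + 1]
            # Full conditional: p(x_i = +1 | rest) = sigmoid(2 * J * field).
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * coupling * field))
            x[i] = 1 if rng.random() < p_plus else -1
        if t >= burn_in:
            samples.append(x.copy())
    return np.array(samples)

samples = gibbs_ising_chain()
# With ferromagnetic coupling J > 0, neighboring spins are positively correlated.
corr = np.mean(samples[:, 0] * samples[:, 1])
```

Because each conditional depends only on the Markov blanket, a sweep costs O(n) regardless of the joint distribution's size, which is the point of Gibbs sampling in PGMs.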

    Lecture 12: Learning in PGMs: Density Estimation, Prediction, Bias-Variance Trade-off, MLE, Sufficient Statistics

    • Chapters 16 and 17 from KF book.

    Lecture 13: Learning in PGMs: MLE in BNs and MRFs, MLE in Log-Linear Models

    • Chapters 16 and 17 from KF book.

    Lecture 14: PGM Paper Presentations

    • See Piazza for Detailed Information.

    Lecture 15: PGM Paper Presentations

    • See Piazza for Detailed Information.

    Lecture 16: PGM Paper Presentations

    • See Piazza for Detailed Information.

    Lecture 17: PGM Paper Presentation, Introduction to Sparse Recovery

    • See Piazza for Detailed Information.

    Lecture 18: Sparse Recovery Problem: Formulation, Applications, Greedy Methods, Optimization-Based Methods

    Lecture 19: Sparse Recovery Algorithm: L0 minimization, Convex Envelope, L1 Relaxation, Geometric Properties of Lp-norm Recovery

    Lecture 20: Sparse Recovery Theory: Uniqueness, Exact Recovery via L1, Spark and Coherence of Dictionaries

    Lecture 21: Sparse Recovery Extensions and Applications: Dealing with Corrupted Data, Sparse Dictionary Learning, Method of Optimal Directions, KSVD

    Lecture 22: Affine Rank Minimization Problem: Formulation, Application Examples, Relationships to Sparse Recovery

    Lecture 23: Affine Rank Minimization Theory: Uniqueness, Exact Recovery, Rank-RIP, Matrix Completion
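The link between nuclear-norm relaxation and matrix completion can be sketched with a simple soft-impute-style heuristic: alternately fill in missing entries from the current estimate and shrink singular values (the proximal step of the nuclear norm). All names and parameters below are illustrative assumptions, not course code.

```python
import numpy as np

def soft_impute(M_obs, mask, rank_reg=0.1, n_iters=200):
    """Matrix completion via iterative soft-thresholded SVD.

    M_obs: matrix with observed entries (zeros elsewhere); mask: boolean
    array marking observed entries; rank_reg: singular-value shrinkage.
    """
    X = M_obs.copy()
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s = np.maximum(s - rank_reg, 0.0)          # soft-threshold singular values
        Z = (U * s) @ Vt                           # low-rank estimate
        X = np.where(mask, M_obs, Z)               # keep observed entries fixed
    return np.where(mask, M_obs, Z)

# Demo: complete a rank-1 matrix from roughly 60% of its entries.
rng = np.random.default_rng(1)
u, v = rng.standard_normal(15), rng.standard_normal(15)
M = np.outer(u, v)
mask = rng.random(M.shape) < 0.6
M_hat = soft_impute(M * mask, mask)
```

The soft-thresholding step is exactly the proximal operator of the nuclear norm, so this iteration can be read as a proximal method applied to the convex relaxation of rank minimization.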



Academic Integrity

All students in the course are subject to Northeastern University's Academic Integrity Policy. Any report, homework, or project submitted by a student in this course for academic credit must be the student's own work. Collaboration is allowed only if explicitly permitted. Per CCIS policy, violations of the rules, including cheating, fabrication, and plagiarism, will be reported to the Office of Student Conduct and Conflict Resolution (OSCCR). This may result in deferred suspension, suspension, or expulsion from the university.