Graduate students from UIC, NU, TTIC, UC, and IIT may request to take these courses and receive credit at their home institutions (or to simply audit these courses). Note that credits received for taking these courses might not correspond exactly with the listed course and will be determined in consultation with the IDEAL site director of the graduate student’s home institution. Each course will run according to the schedule of the respective university offering the course. You may request to register for courses outside your home institution by filling out this form.
[To facilitate receiving credit at your home institution, please fill out the following form.]
Academic calendars at the participating universities:
University of Illinois at Chicago: [Fall] Aug. 26 – Dec. 6, exams Dec. 9 – Dec. 13
Northwestern University: [Fall] Sep. 24 – Dec. 7, exams Dec. 9 – Dec. 14
Toyota Technological Institute at Chicago: [Autumn] Sep. 30 – Dec. 6, exams Dec. 10 – Dec. 14
University of Chicago: [Autumn] Sep. 30 – Dec. 6, exams Dec. 10 – Dec. 14
Illinois Institute of Technology: [Fall] Aug. 19 – Nov. 30, exams Dec. 2 – Dec. 7
Loyola University Chicago: [Fall] Aug. 26 – Dec. 7, exams Dec. 9 – Dec. 14
University of Illinois at Chicago (UIC)
CS 520 Causal Inference and Learning (in-person)
Elena Zheleva Taft Hall 216
TR 11:00 am–12:15 pm
Causal reasoning, structural causal models, interventions and counterfactuals, identification, mediation, attribution, dealing with confounding, selection, and interference bias.
ECE 415 Image Analysis and Computer Vision (online)
Ahmet Enis Cetin [Contact aecyy@uic.edu]
TR 12:30–1:45 pm
Image analysis techniques, 2D and 3D shape representation, segmentation, camera and stereo modeling, motion, generic object and face recognition, parallel and neural architectures for image and visual processing.
ECE 508 Convex Optimization (in person)
Shuo Han Taft Hall 215
MW 4:30–5:45 pm
This graduate-level course covers three main aspects of convex optimization: theory, applications (e.g., machine learning, signal/image processing, controls), and algorithms. The course will roughly follow the book by Boyd and Vandenberghe.
ECE 534 Elements of Information Theory (in-person)
Natasha Devroye Burnham Hall B10
TR 11:00 am–12:15 pm
Entropy and mutual information, fundamentals of coding theory, data compression, complexity of sources, channel mutual information and capacity, rate distortion theory, information theory applications.
ECE/CS 559 Neural Networks (in person or online)
Mesrob I. Ohannessian Lecture Center F4
TR 9:30–10:45 am
Mathematical neuron models, learning methods, the perceptron, basic nonlinear optimization, backpropagation algorithm, associative memory, Hopfield networks, SVM, vector quantization, SOM, PCA, convolutional networks, deep learning. Prerequisites: Calculus, linear algebra, and computer programming.
MCS 549 Mathematical Foundations of Data Science (in person)
Lev Reyzin Lincoln Hall 206
MWF 9:00–9:50 am
This course covers the mathematical foundations of modern data science from a theoretical computer science perspective. Topics will include random graphs, small world phenomena, random walks, Markov chains, streaming algorithms, clustering, graphical models, singular value decomposition, and random projections.
Northwestern University (NU)
COMP_SCI 436 Graduate Algorithms (in person)
Konstantin Makarychev Tech L150
TR 9:30–10:50 am
This is an introductory graduate-level course on algorithms that gives broad exposure to recent advances while covering the fundamental techniques needed to understand current algorithms research. By the end of the course, students will be able to read and understand research papers in most active areas of algorithms research. The prerequisites for the course include the undergraduate algorithms course (COMP_SCI 336 or equivalent) and COMP_SCI 212 or equivalent. Some familiarity with linear algebra is useful but not strictly necessary.
COMP_SCI 437 Approximation Algorithms (in person)
Konstantin Makarychev Tech L150
TR 11:00 am–12:20 pm
This course studies approximation algorithms – algorithms used for solving hard optimization problems. Such algorithms find approximate (slightly suboptimal) solutions to optimization problems in polynomial time. Unlike heuristics, approximation algorithms have provable performance guarantees: they have bounds on the running time and on the quality of the obtained solutions. In this course, we will introduce various algorithmic techniques for solving optimization problems, such as greedy algorithms, local search, dynamic programming, linear programming (LP), semidefinite programming (SDP), LP duality, randomized rounding, and primal-dual analysis. The course assumes background in basic probability theory and discrete mathematics. Key mathematical concepts will be reviewed before they are used.
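As a toy illustration of the kind of guarantee described above (not part of the course materials), the following Python sketch implements the classic greedy 2-approximation for minimum vertex cover: repeatedly pick an uncovered edge and take both of its endpoints. Any optimal cover must contain at least one endpoint of each picked edge, so the returned cover is at most twice the optimal size.

```python
def vertex_cover_2approx(edges):
    """Greedy 2-approximation for minimum vertex cover.

    Repeatedly pick an edge with no endpoint in the cover and add both
    endpoints. Every optimal cover must contain at least one endpoint of
    each picked edge, so the returned cover is at most twice optimal.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover


# Example: the path 1-2-3-4. An optimal cover is {2, 3} (size 2);
# the greedy cover returned here has size at most 4.
print(vertex_cover_2approx([(1, 2), (2, 3), (3, 4)]))
```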
COMP_SCI 462 Foundations of Quantum Computing and Information (in person)
Vijayaraghavan/Smith TBD
MW 9:30–10:50 am
This course will be an introduction to the theory of quantum computation from a computer science perspective. Quantum computing holds great promise for obtaining substantial computational improvements over classical computing for many problems. In this course, we will cover the basics of quantum computation and different topics that explore both the capabilities and limitations of quantum computers. Topics will include (subject to change) the basics of quantum information, quantum circuits, quantum algorithms, quantum complexity theory, quantum query complexity, and quantum communication complexity. No knowledge of quantum mechanics is required; we will cover the necessary physics concepts.
COMP_SCI 497 Calibration (Foundations of Trustworthy ML) (in person)
Jason Hartline TBD
M or T 2:00–5:00 pm [Contact hartline@eecs.northwestern.edu for finalized day.]
This is an advanced topics seminar that will consider theoretical topics related to calibration. Calibrated predictions are predictions that are empirically correct. For example, a weather forecast is calibrated if, for each predicted probability of rain p, the empirical fraction of times it rains when p is predicted is also p. Calibrated predictions have the property that it is optimal for a decision maker to optimize assuming that the prediction is correct. Calibrated predictions also have applications to explainable AI and fairness. For these reasons, there has been a considerable and rich recent literature developing the algorithmic foundations of calibration. The readings of the course will be drawn from the recent and classic literature pertaining to calibration. Topics include: online learning and swap regret, prediction for decision making, measuring calibration error, online calibration, calibration and machine learning, multicalibration, fairness, omniprediction, correlated equilibrium, manipulation of learning algorithms, and calibration for language models. https://sites.northwestern.edu/hartline/cs-497-calibration/
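As a rough, self-contained illustration of the calibration notion in the description above (not drawn from the course materials), the Python sketch below bins probabilistic rain forecasts and compares the average predicted probability in each bin with the empirical frequency of rain among the days that received that prediction; a perfectly calibrated forecaster would show zero gap in every bin. The binning scheme and variable names are illustrative assumptions.

```python
from collections import defaultdict


def calibration_gaps(predictions, outcomes, num_bins=10):
    """Group predictions into probability bins and compare the average
    predicted probability in each bin with the empirical frequency of
    the event among the examples falling in that bin."""
    bins = defaultdict(list)
    for p, y in zip(predictions, outcomes):
        b = min(int(p * num_bins), num_bins - 1)
        bins[b].append((p, y))
    gaps = {}
    for b, pairs in sorted(bins.items()):
        avg_pred = sum(p for p, _ in pairs) / len(pairs)
        emp_freq = sum(y for _, y in pairs) / len(pairs)
        gaps[b] = (avg_pred, emp_freq, abs(avg_pred - emp_freq))
    return gaps


# Toy example: forecast probabilities of rain vs. whether it rained (0/1).
preds = [0.1, 0.1, 0.9, 0.9, 0.5, 0.5]
rained = [0, 0, 1, 1, 1, 0]
for b, (avg_p, freq, gap) in calibration_gaps(preds, rained).items():
    print(f"bin {b}: predicted {avg_p:.2f}, observed {freq:.2f}, gap {gap:.2f}")
```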
STAT 430-1 Probability for Statistical Inference 1 (in-person)
Miklos Racz Lunt Hall 103
TR 2:00–3:20 pm
First course in graduate probability theory. Foundations of measure theoretic probability, with applications to statistics.
Toyota Technological Institute at Chicago (TTIC)
TTIC 31150/CMSC 31150 Mathematical Toolkit (in-person)
Avrim Blum TTIC 530
MW 3:00–4:20 pm
The course is aimed at first-year graduate students and advanced undergraduates. The goal of the course is to collect and present important mathematical tools used in different areas of computer science. The course will mostly focus on linear algebra and probability. We intend to cover the following topics and examples: Abstract linear algebra: vector spaces, linear transformations, Hilbert spaces, inner product, Gram-Schmidt orthogonalization, eigenvalues and eigenvectors, Singular Value Decomposition, SVD applications. Discrete probability: events and random variables, Markov, Chebyshev and Chernoff-Hoeffding bounds. Balls and bins problems. Threshold phenomena in random graphs. Randomized algorithms (e.g., polynomial identity testing, perfect matchings, low-congestion routing). Gaussian variables, concentration inequalities, dimension reduction. Additional topics (to be chosen from based on time and interest): Martingales, Markov Chains, Random Matrices.
University of Chicago (UC)
CMSC 25300/35300 Mathematical Foundations of Machine Learning (in-person)
Li/Willette/Maire Kent Chem Lab 107
MW 1:30–2:50 pm
This course is an introduction to key mathematical concepts at the heart of machine learning. The focus is on matrix methods and statistical models, and features real-world applications ranging from classification and clustering to denoising and recommender systems. Mathematical topics covered include linear equations, regression, regularization, the singular value decomposition, iterative optimization algorithms, and probabilistic models. Machine learning topics include the LASSO, support vector machines, kernel methods, clustering, dictionary learning, neural networks, and deep learning. Students are expected to have taken a course in calculus and have exposure to numerical computing (e.g., Matlab, Python, Julia, or R). Knowledge of linear algebra and statistics is not assumed. Appropriate for graduate students or advanced undergraduates. This course could be used as a precursor to TTIC 31020, “Introduction to Machine Learning,” or CMSC 35400.
CMSC 25460 Introduction to Optimization (in-person)
Orecchia TBD
TR 9:30–10:50 am
Convex optimization algorithms and their applications to efficiently solving fundamental computational problems. Topics include modeling using mathematical programs, gradient descent algorithms, linear programming, Lagrangian duality, basics of complexity theory for optimization.
Illinois Institute of Technology (IIT)
CS 484 Introduction to Machine Learning and Deep Learning (hybrid)
Binghui Wang John T. Rettaliata Engineering Center 104
MR 10:00–11:15 am
An introduction to machine learning concepts and algorithms, including classification, clustering, and regression. Topics include k-means clustering, nearest neighbors classification, decision trees, naive Bayes, logistic regression, support vector machines, and neural networks. Special focus will be on practical aspects of machine learning, including data preparation, experimental design, and modern tools for building machine learning systems. Basic probability theory knowledge is required.
Loyola University Chicago (LUC)
QUIN 499 Introduction to Applied AI (in-person)
Steven Keith Platt Schreiber Center 302
TR 6:00–9:00 pm
Introduction to Artificial Intelligence (AI) with a focus on applied machine learning and generative AI. The curriculum covers core math concepts and algorithms, as well as the statistical tools essential for building machine learning and generative AI applications.