Discrete Optimization Talks (DOTs)

DOTs are virtual discrete optimization talks, organized by Aleksandr M. Kazachkov and Elias B. Khalil. To receive updates (and the Zoom link) for upcoming DOTs, please join our mailing list. If you would like to give a DOT, please fill out this form or email us.

Topics of interest include theoretical, computational, and applied aspects of integer and combinatorial optimization.

The format is two 30-minute talks each session. From January to April 2022, the seminars are scheduled for the last Friday of every month at 1:00 p.m. ET.

A special feature of DOTs is their social component. After an in-person talk, you might grab a tea or coffee and chat with other attendees — why not here too? Join us for informal discussion after each DOT, as well as throughout the week on our Discord channel.

Videos of past DOTs are posted under Past Talks and can also be found on our YouTube Channel.

Join the mailing list to receive information about our upcoming seminars and how to participate virtually. Joining this list is necessary to receive the password for each DOT (usually sent on the Monday of a DOT week). You may also wish to subscribe to our Google calendar (also available in ics format).

Upcoming Talks

Swati Gupta
Georgia Tech

Reusing Combinatorial Structure for Projections over Submodular Polytopes

Abstract: Optimization algorithms such as projected Newton's method, FISTA, mirror descent and its variants enjoy near-optimal regret bounds and convergence rates, but suffer from a computational bottleneck of computing “projections” in potentially each iteration (e.g., O(T^{1/2}) regret of online mirror descent). On the other hand, conditional gradient variants solve a linear optimization in each iteration, but result in suboptimal rates (e.g., O(T^{3/4}) regret of online Frank-Wolfe). Motivated by this trade-off between runtime and convergence rate, we consider iterative projections of close-by points over widely prevalent submodular base polytopes B(f). We develop a toolkit to speed up the computation of projections using both discrete and continuous perspectives. We subsequently adapt the away-step Frank-Wolfe algorithm to use this information and enable early termination. For the special case of cardinality-based submodular polytopes, we improve the runtime of computing certain Bregman projections by a factor of \Omega(n/log(n)). Our theoretical results translate into orders-of-magnitude reductions in runtime in preliminary computational experiments.
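To make the projection bottleneck concrete, here is a minimal sketch (not from the talk) of the simplest instance of a Bregman projection over a cardinality-based submodular base polytope: Euclidean projection onto the probability simplex, computed with the classic O(n log n) sort-and-threshold algorithm. The function name and implementation are illustrative only.

```python
def project_to_simplex(y):
    """Euclidean projection of y onto the probability simplex
    {x : x >= 0, sum(x) = 1} — an illustrative special case of a
    cardinality-based submodular base polytope B(f).
    Classic sort-and-threshold algorithm, O(n log n)."""
    n = len(y)
    u = sorted(y, reverse=True)  # sort coordinates in decreasing order
    # Find the shift theta so that sum(max(y_i - theta, 0)) = 1.
    cumsum = 0.0
    theta = 0.0
    for k in range(1, n + 1):
        cumsum += u[k - 1]
        t = (cumsum - 1.0) / k
        if u[k - 1] - t > 0:  # coordinate k still active at this threshold
            theta = t
    return [max(v - theta, 0.0) for v in y]
```

In iterative first-order methods, such a projection is recomputed at every step; the talk's theme is reusing the combinatorial structure (here, the sorted order) across projections of close-by points.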

Bio: Dr. Swati Gupta is an Assistant Professor and Fouts Family Early Career Professor in the H. Milton Stewart School of Industrial & Systems Engineering at Georgia Institute of Technology. She is the lead of Ethical AI at the multi-institution NSF AI Institute AI4OPT (ai4opt.org). She received a Ph.D. in Operations Research from MIT in 2017 and a joint bachelor’s and master’s degree in Computer Science from the Indian Institute of Technology, Delhi, in 2011. Dr. Gupta’s research interests are in optimization, machine learning, and algorithmic fairness. Her work spans application domains such as revenue management, healthcare, energy, and quantum computation. She received the JP Morgan Chase Early Career Faculty Award in 2021, the Student Recognition of Excellence in Teaching: Class of 1934 CIOS Honor Roll in 2021 and 2022, and the NSF CISE Research Initiation Initiative (CRII) Award in 2019. She was also awarded the prestigious Simons-Berkeley Research Fellowship in 2017-2018, during which she was selected as the Microsoft Research Fellow in 2018. Dr. Gupta received the Google Women in Engineering Award in India in 2011. Her research is partially funded by the National Science Foundation and DARPA.

Emma Frejinger
Université de Montréal

Fast Heuristic L-Shaped Method Through Machine Learning

Abstract: We propose a methodology at the nexus of operations research and machine learning (ML) that leverages generic approximators from ML to accelerate the solution of mixed-integer linear two-stage stochastic programs. We aim to solve problems whose second stage is computationally demanding. Our core idea is to achieve large reductions in online solution time, while incurring only small losses in first-stage solution accuracy, by substituting fast yet accurate supervised ML predictions for the exact second-stage solutions. This upfront investment in ML would be justified when similar problems are solved repeatedly over time, for example, in operational transport planning related to fleet management, routing, and container yard management.

Our experimental results focus on the problem class seminally addressed with the integer and continuous L-shaped cuts. Our extensive empirical analysis is grounded in standardized families of problems derived from the stochastic server location (SSLP) and stochastic multi-knapsack (SMKP) problems available in the existing literature. The proposed method can solve the hardest SSLP instances in less than 9% of the time taken by the state-of-the-art exact method; for SMKP, the corresponding figure is 20%. Average optimality gaps are in most cases below 0.1%.
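The substitution idea in the abstract can be illustrated with a deliberately tiny, hedged sketch. This is not the authors' method or code: the problem, the "surrogate" (a nearest-neighbour lookup standing in for a trained ML model), and all names are hypothetical, chosen only to show the offline-training / online-ranking split.

```python
import itertools

def exact_recourse(x, scenarios):
    """Stand-in for an expensive second-stage problem: expected shortfall
    of the first-stage decision x against demand scenarios."""
    return sum(max(0, d - sum(x)) for d in scenarios) / len(scenarios)

def train_surrogate(samples):
    """Toy 'learned' predictor: nearest-neighbour lookup on first-stage
    totals, standing in for a supervised ML regressor."""
    table = {sum(x): q for x, q in samples}
    keys = sorted(table)
    def predict(x):
        s = sum(x)
        nearest = min(keys, key=lambda k: abs(k - s))
        return table[nearest]
    return predict

scenarios = [1, 2, 3]                                   # second-stage demands
candidates = list(itertools.product([0, 1], repeat=3))  # first-stage decisions

# Offline: solve the expensive second stage exactly on a subset of decisions.
samples = [(x, exact_recourse(x, scenarios)) for x in candidates[::2]]
q_hat = train_surrogate(samples)

# Online: rank candidates with the fast surrogate instead of the exact model.
best = min(candidates, key=lambda x: 0.5 * sum(x) + q_hat(x))
```

The payoff in the paper's setting is that the online step avoids repeatedly solving a hard second-stage program; the offline cost of generating training data and fitting the model is amortized over many similar problem instances.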

Bio: Emma Frejinger, Associate Professor in the Department of Computer Science and Operations Research at Université de Montréal, holds the Canada Research Chair in Demand Forecasting and Optimization of Transport Systems and the CN Chair in Optimization of Railway Operations. She has been involved in numerous projects with public and private actors, solving decision-making problems using methodologies from statistical learning and operations research. She is a member of CIRRELT and an associate member of Mila, and works part-time as a scientific advisor for IVADO Labs.