Discrete Optimization Talks (DOTs)
Discrete Optimization Talks (DOTs) is a virtual seminar series from the Mixed Integer Programming Society (MIPS), a technical section of the Mathematical Optimization Society.
DOTs are organized by Aleksandr M. Kazachkov and Elias B. Khalil. To receive updates (and the Zoom link) for upcoming DOTs, please join our mailing list. If you would like to give a DOT, please fill out this form or email us.
Topics of interest include theoretical, computational, and applied aspects of integer and combinatorial optimization.
The format is two 30-minute talks per session. From October to December 2023, the seminars are scheduled for the first Friday of every month at 12:00 p.m. ET.
A special feature of DOTs is a social component. After an in-person talk, you might grab a tea or coffee and chat with other attendees. Why not do the same here? Join us for informal discussion after each DOT, as well as throughout the week on our Discord channel.
Videos of past DOTs are posted under Past Talks and can also be found on our YouTube Channel.
Join the mailing list to receive information about our upcoming seminars and how to virtually participate. Joining this list is necessary to receive the password for each DOT (usually sent on the Monday of the week in which there are DOTs). You may also wish to subscribe to our Google calendar (also available in ics format).
Upcoming Talks
A Knowledge Compilation Take on Binary Polynomial Optimization
Abstract: The Binary Polynomial Optimization (BPO) problem is defined as the problem of maximizing a given polynomial function over all binary points. The main contribution of this paper is to draw a novel connection between BPO and the problem of finding the maximal assignment for a Boolean function with weights on variables. This connection allows us to give a strongly polynomial algorithm that solves BPO when the associated hypergraph is either $\beta$-acyclic or has bounded incidence treewidth. This result unifies and significantly extends the known tractable classes of BPO. The generality of our technique also allows us to handle extensions of BPO in which we enforce extended cardinality constraints on the set of binary points, and in which we seek the $k$ best feasible solutions. We further extend our results to the significantly more general problem where variables are replaced by literals. Preliminary computational results show that the resulting algorithms can be significantly faster than the current state of the art.
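As a toy illustration of the problem the abstract studies (and only the problem, not the paper's algorithm), the sketch below maximizes a small polynomial over all binary points by brute-force enumeration. The monomials and weights are hypothetical; real BPO instances require the structured methods the talk describes, since enumeration is exponential in the number of variables.

```python
import itertools

# Hypothetical objective: f(x) = 3*x1*x2 - 2*x2*x3 + x3.
# Each monomial is a set of variables (a hyperedge) with a weight.
monomials = [({"x1", "x2"}, 3), ({"x2", "x3"}, -2), ({"x3"}, 1)]
variables = ["x1", "x2", "x3"]

def evaluate(point):
    """Evaluate the polynomial at a binary point (dict: variable -> 0/1)."""
    return sum(w for edge, w in monomials if all(point[v] for v in edge))

# Enumerate all 2^n binary points and keep the maximizer.
best = max(
    (dict(zip(variables, bits)) for bits in itertools.product((0, 1), repeat=3)),
    key=evaluate,
)
print(best, evaluate(best))  # -> {'x1': 1, 'x2': 1, 'x3': 0} 3
```

The hypergraph view in the abstract comes from exactly this encoding: each monomial is a hyperedge over its variables, and structural restrictions on that hypergraph (such as $\beta$-acyclicity) are what make the problem tractable.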
Bio: Silvia is a Maître de Conférences at the Sorbonne Paris Nord University and at the Northern Paris Computer Science Lab (LIPN), where she is part of the Algorithms and Combinatorial Optimization group. Prior to this, she obtained her PhD at the University of Wisconsin-Madison and then worked as a postdoctoral researcher at TU Dresden.
Building upon linear MIP techniques to solve convex MINLP
Abstract: Convex mixed-integer nonlinear programming (MINLP) problems can be seen as a natural extension of mixed-integer linear programming by allowing nonlinear expressions to be used in the problem formulation. We can consider a MINLP instance to be convex if a continuous relaxation results in a convex nonlinear program, but most algorithms require the nonlinear functions to be convex. Convexity of the functions can be exploited to derive efficient decomposition algorithms, and within a branch-and-bound framework one only has to branch on the discrete variables as a convex relaxation is directly obtained by a continuous relaxation.
Today, so-called outer-approximation-based algorithms remain computationally more efficient than pure nonlinear branch-and-bound. The main reasons are the computational efficiency and robustness of LP solvers (LP relaxations let us explore nodes quickly) and the mature cutting-plane technology of MILP.
Deriving valid inequalities from convex nonlinear inequalities is trivial, as any gradient cut (first-order Taylor series expansion) results in a valid inequality, but obtaining strong cutting planes is not. The strength of the cuts is of great importance, as weak cuts can lead to more iterations and computationally more challenging subproblems. In the presentation, we will cover some techniques for generating strong cutting planes based on convex nonlinear constraints and discuss possible research directions for further enhancements.
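The gradient-cut idea mentioned above can be sketched as follows; the constraint function g here is a hypothetical convex example, not one from the talk. Because g is convex, its first-order Taylor expansion underestimates it everywhere, so linearizing at any point yields a linear inequality that every feasible point of g(x, y) <= 0 satisfies.

```python
# Illustrative convex constraint (an assumption for this sketch):
#   g(x, y) = x^2 + y^2 - 4 <= 0.
# Convexity gives g(z) >= g(z0) + grad_g(z0) . (z - z0) for all z, so
# the linearization at z0 is <= 0 at every feasible point: a valid cut.

def g(x, y):
    return x**2 + y**2 - 4.0

def gradient_cut(x0, y0):
    """Return (a, b, c) such that a*x + b*y <= c is valid for g(x, y) <= 0."""
    gx, gy = 2.0 * x0, 2.0 * y0  # gradient of g at (x0, y0)
    # g(x0, y0) + gx*(x - x0) + gy*(y - y0) <= 0, rearranged to a*x + b*y <= c:
    return gx, gy, gx * x0 + gy * y0 - g(x0, y0)

a, b, c = gradient_cut(1.0, 1.0)  # cut generated at the point (1, 1)
# The cut is 2x + 2y <= 6; e.g. the feasible point (0, 0) satisfies it:
assert a * 0.0 + b * 0.0 <= c
```

How strong such a cut is depends heavily on the linearization point, which is exactly why the talk distinguishes between merely valid gradient cuts and strong cutting planes.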
Bio: Jan Kronqvist has been an Assistant Professor in the Department of Mathematics at KTH Royal Institute of Technology in Stockholm, Sweden, since 2021. Jan is also a Digital Futures faculty member, a board member of the Swedish Operations Research Association (SOAF), and an editorial board member of the Journal of Global Optimization. His main research interests focus on mixed-integer optimization, for example, strong and computationally efficient relaxations, decomposition algorithms, optimization software, and mixed-integer optimization for AI and ML applications.
Before joining KTH, Jan was a Newton International Fellow in the Department of Computing at Imperial College London. Jan did his PhD at Åbo Akademi University in Finland, graduating in 2018, and his thesis was named the best PhD thesis at the Faculty of Science and Engineering. During his PhD, he was also a visiting PhD student at Carnegie Mellon University. Jan received the 2021 Rosenbrock Prize for the best paper in the journal Optimization and Engineering. His paper “Between steps: Intermediate relaxations between big-M and convex hull formulations” was also chosen as the best paper at CPAIOR 2021. In 2018, Jan was one of the winners of the COIN-OR Cup for his work on the SHOT solver, which remains one of the most efficient solvers for convex MINLP.
For questions or suggestions, please contact Aleksandr M. Kazachkov or Elias B. Khalil.
DOTs are supported by the Canada Excellence Research Chair in Data Science for Real-Time Decision-Making and the University of Florida Department of Industrial and Systems Engineering.