Discrete Optimization Talks (DOTs)

DOTs are virtual discrete optimization talks, organized by Aleksandr M. Kazachkov and Elias B. Khalil. To receive updates (and the Zoom link) for upcoming DOTs, please join our mailing list. If you would like to give a DOT, please fill out this form or email us.

Topics of interest include theoretical, computational, and applied aspects of integer and combinatorial optimization.

The format is a weekly session with two 30-minute talks. Currently, the seminars are scheduled on Tuesdays at 2:00 p.m. ET. A special feature of DOTs is a social component. After an in-person talk, you might grab a tea or coffee and chat with other attendees; why not do the same here? Join us for informal discussion after each DOT, and throughout the week, on our Discord channel.

Videos of past DOTs are posted under Past Talks and can also be found on our YouTube channel.

Join the mailing list to receive information about our upcoming seminars and how to virtually participate. Joining this list is necessary to receive the password for each DOT (usually sent on Monday mornings). You may also wish to subscribe to our Google calendar (also available in ics format).

Upcoming Talks

Tuesday, July 14, 2020

Jannis Kurtz
University of Siegen

We study the problem of signal recovery for group models. More precisely, we are given a set of groups, each containing a small subset of indices, together with linear sketches of a true signal vector that is known to be group-sparse, in the sense that its support is contained in the union of a small number of these groups. We study algorithms that successfully recover the true signal from knowledge of its linear sketches alone. We consider two versions of the classical Iterative Hard Thresholding (IHT) algorithm: the classical version iteratively calculates the exact projection of a vector onto the group model, while the approximate version (AM-IHT) iteratively uses a head- and a tail-approximation. We apply both variants to group models and analyse the two cases where the sensing matrix is a Gaussian matrix or a model-expander matrix.

To solve the exact projection problem on the group model, which is known to be equivalent to the maximum weight coverage problem, we use discrete optimization methods based on dynamic programming and Benders decomposition. The head- and tail-approximations are derived by a classical greedy method and LP rounding, respectively.
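The classical IHT iteration in the abstract alternates a gradient step with a projection onto the group model. The following is a minimal illustrative sketch, not the speaker's implementation: it assumes disjoint groups, for which the exact projection reduces to keeping the highest-energy groups (with overlapping groups, the projection becomes the maximum weight coverage problem that the abstract's discrete optimization methods address). All function and variable names here are invented for illustration.

```python
import numpy as np

def group_project(x, groups, g):
    # Exact projection onto the group model in the DISJOINT-group special case:
    # keep the g groups whose entries carry the most energy, zero out the rest.
    # (For overlapping groups this is a maximum weight coverage problem.)
    scores = [np.sum(x[idx] ** 2) for idx in groups]
    keep = np.argsort(scores)[-g:]
    out = np.zeros_like(x)
    for i in keep:
        out[groups[i]] = x[groups[i]]
    return out

def iht(A, y, groups, g, iters=100):
    # Classical IHT: gradient step on ||y - A x||^2, then project onto the model.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = group_project(x + A.T @ (y - A @ x), groups, g)
    return x
```

In practice the sensing matrix would be, e.g., a scaled Gaussian matrix with far fewer rows than columns; recovery guarantees then rest on restricted isometry-type conditions for the group model, as discussed in the talk.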

Instead of a second DOT on July 14th, we will have a social gathering on Zoom, starting at around 2:35 p.m. ET.

We then take our summer break, but fear not, DOTs will return. We would appreciate your feedback on the format in which DOTs should continue. To this end, please fill out this (relatively short) Google form.

Confirmed Speakers, Date TBD

Andrea Lodi
CERC - Polytechnique Montréal