2026-04-16 10:30:00
Salle B107, bâtiment B, Université de Villetaneuse
The k-Edge-Connected Subgraph problem and the k-Connectivity Augmentation problem are among the most basic Network Design problems and, consequently, have been heavily studied. Due to their approximation hardness, the gold standard in terms of approximation guarantee is strong constant factors. Interestingly, this approximation hardness does not carry over to planar graphs. In particular, the 2-Edge-Connected Subgraph problem admits a PTAS on planar graphs. However, the techniques used are very different from the celebrated Baker’s framework, which is a standard way to design PTASs for planar graphs. The main obstacle to using Baker’s technique in its classical form is that it requires a certain locality of the problem, whereas k-edge/vertex-connectivity are global properties. We present a novel, and arguably clean, way to extend Baker’s framework to handle larger connectivity requirements. Based on this, we obtain a PTAS for the k-Edge-Connected Subgraph problem and its vertex analog, even with costs, as long as the max-to-min cost ratio is bounded by a constant. Moreover, together with further insights, we obtain a PTAS for the k-Connectivity Augmentation problem in the same cost setting. We complement this with an NP-hardness result for planar augmentation, showing that all our results are essentially tight.
This is joint work with Vera Traub and Rico Zenklusen.
Warm-Starting QAOA for Combinatorial Optimization via Difference-of-Convex Optimization - A Case Study on Max-Cut
2026-03-20 14:00:00
Salle G202, Université de Villetaneuse
The Quantum Approximate Optimization Algorithm (QAOA) has recently been proposed as a heuristic framework for solving combinatorial optimization problems through a hybrid classical–quantum optimization procedure. The algorithm alternates parameterized quantum transformations with a classical optimization step that adjusts the circuit parameters in order to increase the probability of sampling high-quality solutions. A key factor influencing the performance of QAOA is the choice of the initial state. In standard implementations, the algorithm starts from a uniform superposition over all candidate solutions, which does not exploit structural information about the original optimization problem and may lead to inefficient parameter optimization and lower-quality solutions.
In this talk, we propose a warm-start strategy based on continuous optimization, using the Difference-of-Convex Algorithm (DCA). The idea is to exploit a continuous relaxation of the original optimization problem in order to construct an informed initialization that biases the search toward promising regions of the solution space. We illustrate the approach on instances of the Max-Cut problem and show that this strategy can significantly improve the approximation ratios obtained by QAOA. This is joint work with HA Huy Phuc Nguyen and TA Anh Son.
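As a small illustration of the objects involved (the DCA-based relaxation from the talk is not reproduced here), Max-Cut can be written over ±1 variables, relaxed to the box [-1, 1]^n, and a fractional point can be rounded by sign to obtain a candidate cut for a warm start. All function names below are illustrative:

```python
# Max-Cut in +/-1 variables: maximize sum over edges (1 - x_i * x_j) / 2.
# A continuous relaxation lets x range over [-1, 1]^n; rounding a relaxed
# point by sign gives a candidate cut that can bias QAOA's initial state.
# (Sketch only: the talk's DCA-based warm start is not shown here.)

def cut_value(edges, x):
    """Cut value of a +/-1 assignment x (dict: node -> +1 or -1)."""
    return sum((1 - x[i] * x[j]) / 2 for i, j in edges)

def round_to_cut(x_frac):
    """Round a fractional point in [-1, 1]^n to a +/-1 assignment."""
    return {v: 1 if val >= 0 else -1 for v, val in x_frac.items()}

# Triangle graph: an optimal cut separates one node from the other two.
edges = [(0, 1), (1, 2), (0, 2)]
x = round_to_cut({0: 0.9, 1: -0.3, 2: 0.7})
print(cut_value(edges, x))  # 2.0
```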
Positive spanning sets and their connections to polyhedra
2026-03-19 10:30:00
Salle G205, Université de Villetaneuse
Positive spanning sets (PSSs), i.e., sets of vectors that span a given space through nonnegative linear combinations, have been successfully employed to design and analyze derivative-free optimization algorithms. Although linear algebra is a natural framework for studying PSSs, polyhedral geometry can provide additional insights into the structure of PSSs.
In this talk, I will first introduce the concept of positive spanning sets, together with its use in derivative-free optimization. I will then focus on the specific case of problems with polyhedral constraints, and explain how to generate positive spanning sets that conform to the geometry of those constraints. Finally, I will turn to a perhaps unexpected construction of PSSs of smallest cardinality through polytopes, and discuss several associated open questions.
This talk is based on joint works with Denis Cornaz, Sébastien Kerleau and Lindon Roberts.
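In the plane, positive spanning can be checked with a classical angular criterion: nonzero vectors positively span R^2 if and only if every cyclic gap between consecutive direction angles is strictly less than π. The sketch below (pure Python, not taken from the talk) uses this 2D shortcut; in higher dimension one would instead solve a linear feasibility problem:

```python
import math

def positively_spans_r2(vectors):
    """Check whether a set of nonzero vectors positively spans R^2.

    2D criterion (specific to the plane): the vectors positively span
    R^2 iff every cyclic gap between consecutive direction angles is
    strictly less than pi.
    """
    angles = sorted(math.atan2(y, x) % (2 * math.pi) for x, y in vectors)
    gaps = [b - a for a, b in zip(angles, angles[1:])]
    gaps.append(2 * math.pi - angles[-1] + angles[0])  # wrap-around gap
    return max(gaps) < math.pi

# {e1, e2, -e1-e2} is a PSS of R^2 of smallest cardinality (n + 1 = 3).
print(positively_spans_r2([(1, 0), (0, 1), (-1, -1)]))  # True
# A linear basis alone is not enough: nonnegative combinations of
# e1 and e2 only reach the first quadrant.
print(positively_spans_r2([(1, 0), (0, 1)]))            # False
```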
Properties of matroids in picking games against Greedy
2026-02-12 10:30:00
Salle A303, bâtiment A, Université de Villetaneuse
Given a hypergraph on a set of n ordered vertices, we define an independent set X to be feasible if X is a possible outcome for a player in a sequential picking game against a greedy adversary, where no hyperedge can be contained in the union of both outcomes. We prove that testing feasibility is NP-complete, even if the hypergraph is a graph, but that it becomes polynomial (in n) for matroid hypergraphs, that is, when the hyperedges are the circuits of some matroid (in which independence can be tested with an oracle). We also prove that optimizing a linear function over feasible sets is NP-hard for graphs and matroid hypergraphs, even for graphic matroids, but becomes polynomial for laminar matroids.
2026-02-05 10:30:00
Salle B107, bâtiment B, Université de Villetaneuse
Betweenness centrality (BC), introduced in 1977, is a fundamental measure of node importance in networks, widely used in fields ranging from sociology to computer science. BC quantifies the extent to which a node lies on shortest paths between pairs of nodes, making its computation closely tied to the enumeration of these paths. In this work, we investigate the computational complexity of determining BC for all nodes in a graph, highlighting the challenges associated with exhaustive shortest-path counting. We further examine extensions of BC to dynamic graphs, where edges carry temporal information and optimal paths are determined not only by topology but also by timing constraints (i.e., fastest paths). We explore the hardness of computing BC under such dynamic conditions and discuss how temporal dependencies complicate classical shortest-path approaches. Our study aims to unify understanding of BC computation across static and temporal graph models and to identify open problems in efficiently counting relevant paths in these settings.
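The definition of BC as a shortest-path count, C_B(v) = Σ_{s<t} σ_st(v) / σ_st, can be made concrete with a naive sketch for static unweighted graphs (one BFS per node, then a pairwise sum; this is the direct definition, not the faster Brandes-style algorithm, and not the temporal variant discussed in the talk):

```python
from collections import deque
from itertools import combinations

def bfs_counts(adj, s):
    """Distances and numbers of shortest paths from s (unweighted BFS)."""
    dist, sigma = {s: 0}, {s: 1}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                sigma[w] = 0
                queue.append(w)
            if dist[w] == dist[u] + 1:  # w extends a shortest path via u
                sigma[w] += sigma[u]
    return dist, sigma

def betweenness(adj):
    """C_B(v) = sum over pairs s < t of sigma_st(v) / sigma_st."""
    info = {s: bfs_counts(adj, s) for s in adj}
    bc = {v: 0.0 for v in adj}
    for s, t in combinations(adj, 2):
        dist_s, sig_s = info[s]
        dist_t, sig_t = info[t]
        if t not in dist_s:
            continue  # s and t lie in different components
        for v in adj:
            if v in (s, t) or v not in dist_s:
                continue
            # v is on a shortest s-t path iff the distances add up.
            if dist_s[v] + dist_t[v] == dist_s[t]:
                bc[v] += sig_s[v] * sig_t[v] / sig_s[t]
    return bc

# Path 0-1-2: node 1 lies on the unique shortest 0-2 path.
adj = {0: [1], 1: [0, 2], 2: [1]}
print(betweenness(adj))  # {0: 0.0, 1: 1.0, 2: 0.0}
```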
On Multidimensional Disjunctive Inequalities for Chance-Constrained Stochastic Problems with Finite Support
2026-01-29 10:30:00
Salle B107, bâtiment B, Université de Villetaneuse
This presentation addresses linear Chance-Constrained Stochastic Problems (CCSPs) with finite support. We begin by motivating the study of CCSPs through illustrative examples and providing intuition regarding the concept of feasibility in this context. Subsequently, we discuss the computational challenges inherent to these problems, specifically the nonconvex structure of the feasible region and the limitations of the standard big-M reformulation. These challenges necessitate the use of branch-and-cut approaches.
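For reference, in generic notation (the symbols below are illustrative and not taken from the talk), the standard big-M reformulation of a chance constraint over scenarios $i = 1, \dots, N$ with probabilities $p_i$ activates scenario $i$ via a binary $z_i$:

```latex
\begin{align*}
  & T_i x \ge h_i - M(1 - z_i) && i = 1, \dots, N \\
  & \sum_{i=1}^{N} p_i z_i \ge 1 - \varepsilon \\
  & z \in \{0,1\}^N,
\end{align*}
```

whose continuous relaxation is notoriously weak for large $M$, which is one motivation for the valid inequalities discussed next.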
To this end, we review existing families of valid inequalities, such as quantile inequalities and mixing inequalities. This background sets the stage for the primary contribution of this work: a new class of valid inequalities termed multi-disjunctive inequalities. We construct these inequalities by exploiting a disjunctive property inherent to the mathematical formulation of CCSPs. Theoretical analysis reveals that the closure of these multi-disjunctive inequalities constitutes a proper subset of the closure generated by previously proposed families.
We perform numerical experiments within a pure cutting-plane framework to compare the closures obtained by enumerating all violated valid inequalities. The results demonstrate that multi-disjunctive inequalities significantly strengthen the continuous relaxation of the considered CCSPs compared to existing quantile and mixing inequalities. Furthermore, we evaluate the performance of these inequalities embedded within a branch-and-cut framework. Our results indicate that the proposed approach significantly outperforms existing methods on both standard literature instances and newly generated instances designed to be computationally challenging.