21 November - 27 November


Thursday 24 November
Time: 10:30 - 11:30
Location: https://bbb.lipn.univ-paris13.fr/b/wol-ma9-vjn - code: 514019
Summary: Combinatorial solvers and neural networks
Speaker: Pasquale Minervini
Description: Combining discrete probability distributions and combinatorial optimization problems with neural network components has numerous applications but poses several challenges. We propose Implicit Maximum Likelihood Estimation (IMLE), a framework for end-to-end learning of models that combine discrete exponential family distributions and differentiable neural components. IMLE is widely applicable, as it only requires the ability to compute the most probable states and does not rely on smooth relaxations. The framework encompasses several approaches, such as perturbation-based implicit differentiation and recent methods for differentiating through black-box combinatorial solvers. Moreover, we show that IMLE simplifies to maximum likelihood estimation when used in some recently studied learning settings that involve combinatorial solvers.

One limitation of IMLE is that, due to its finite-difference approximation of the gradients, it can be especially sensitive to the choice of the finite-difference step size, which must be specified by the user. In this presentation, we also introduce Adaptive IMLE (AIMLE), the first adaptive gradient estimator for complex discrete distributions: it adaptively identifies the target distribution for IMLE by trading off the density of gradient information against the degree of bias in the gradient estimates. We empirically evaluate our estimator on synthetic examples, as well as on Learning to Explain, Discrete Variational Auto-Encoder, and Neural Relational Inference tasks. Our experiments show that the adaptive gradient estimator produces faithful estimates while requiring orders of magnitude fewer samples than other gradient estimators.
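To make the core idea concrete, the following is a minimal sketch (not code from the talk) of an IMLE-style gradient estimator in PyTorch. The forward pass perturbs the parameters and queries a MAP oracle; the backward pass forms the finite-difference estimate by comparing the MAP state of the perturbed parameters with that of the parameters nudged against the downstream gradient. The map_solver below is a hypothetical stand-in (top-k selection) for an arbitrary combinatorial solver, Gaussian noise is a simplification of the noise distributions used in the IMLE paper, and the step size lam is an illustrative value.

    import torch

    def map_solver(theta):
        # Hypothetical MAP oracle: top-k selection stands in for an
        # arbitrary combinatorial solver; returns the most probable
        # 0/1 state under the parameters theta.
        z = torch.zeros_like(theta)
        z.scatter_(-1, theta.topk(3, dim=-1).indices, 1.0)
        return z

    class IMLE(torch.autograd.Function):
        # Sketch of the finite-difference estimator described in the talk.
        lam = 10.0  # step size: the hyperparameter IMLE is sensitive to

        @staticmethod
        def forward(ctx, theta):
            # Perturb-and-MAP: adding noise turns the MAP state into a
            # sample (Gaussian noise here is a simplification).
            theta_p = theta + torch.randn_like(theta)
            ctx.save_for_backward(theta_p)
            return map_solver(theta_p)

        @staticmethod
        def backward(ctx, grad_z):
            (theta_p,) = ctx.saved_tensors
            # Finite-difference estimate: nudge the parameters toward a
            # lower-loss target distribution and compare MAP states.
            z = map_solver(theta_p)
            z_nudged = map_solver(theta_p - IMLE.lam * grad_z)
            return (z - z_nudged) / IMLE.lam

    # Usage: gradients flow through a discrete top-k decision.
    theta = torch.randn(1, 8, requires_grad=True)
    z = IMLE.apply(theta)                 # discrete 0/1 vector (three ones)
    loss = (z * torch.arange(8.0)).sum()  # any differentiable downstream loss
    loss.backward()                       # theta.grad holds the IMLE estimate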