Thesis Defense - Salem Lahlou
Dear all / Bonjour à tous,
You are cordially invited to Salem Lahlou's thesis defense on December 5th at 9:30am (hybrid presentation).
Title: Advances in uncertainty modelling: from epistemic uncertainty estimation to generalized generative flow networks
Date: December 5th, 2023, 9:30am-12:00pm EST
Location: Auditorium 1 - 6650 rue Saint Urbain
Link: Zoom link
Jury
President | Lacoste-Julien, Simon
Research supervisor | Bengio, Yoshua
Jury member | Mitliagkas, Ioannis
External examiner | Andersson, Naesseth (University of Amsterdam)
FAS representative | Benigni, Lucas
Abstract
Decision-making problems often occur under uncertainty, encompassing both aleatoric uncertainty arising from inherent randomness in processes and epistemic uncertainty due to limited knowledge. This thesis explores the concept of uncertainty, a crucial aspect of machine learning and a key factor for rational agents to determine where to allocate their resources for achieving the best possible results. Traditionally, uncertainty is encoded in a posterior distribution, obtained by approximate Bayesian inference techniques.

This thesis's first set of contributions revolves around the mathematical properties of generative flow networks, which are probabilistic models over discrete sequences and amortized samplers of unnormalized probability distributions. Generative flow networks find applications in Bayesian inference and can be used for uncertainty estimation. Additionally, they are helpful for search problems in large compositional spaces. Beyond deepening the mathematical framework underlying them, a comparative study with hierarchical variational methods is provided, shedding light on the significant advantages of generative flow networks, both from a theoretical point of view and via diverse experiments. These contributions include a theory extending generative flow networks to continuous or more general spaces, which allows modelling the Bayesian posterior and uncertainty in many interesting settings. The theory is experimentally validated in various domains.

This thesis's second line of work is about alternative measures of epistemic uncertainty beyond posterior modelling. The presented method, called Direct Epistemic Uncertainty Estimation (DEUP), overcomes a major shortcoming of approximate Bayesian inference techniques caused by model misspecification. DEUP relies on maintaining a secondary predictor of the errors of the main predictor, from which measures of epistemic uncertainty can be deduced.
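To give a flavour of the DEUP idea described above, here is a minimal, hedged sketch (an illustration only, not the thesis's actual implementation): a main predictor is fit to toy data, a secondary predictor is fit to that model's observed squared errors, and an epistemic uncertainty estimate is obtained by subtracting an assumed aleatoric noise level. All model choices (polynomial fits, the known noise variance) are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: dense on the left, sparse on the right,
# with known aleatoric noise of standard deviation 0.1 (an assumption).
x_train = np.concatenate([rng.uniform(0.0, 1.0, 200), rng.uniform(1.0, 2.0, 10)])
y_train = np.sin(3.0 * x_train) + 0.1 * rng.normal(size=x_train.size)

# Main predictor f: a cubic polynomial fit (deliberately simple/misspecified).
f_coefs = np.polyfit(x_train, y_train, 3)
def f(x):
    return np.polyval(f_coefs, x)

# Secondary (error) predictor e: fit to the main model's squared residuals.
squared_residuals = (y_train - f(x_train)) ** 2
e_coefs = np.polyfit(x_train, squared_residuals, 3)
def e(x):
    # Clamp to zero since a polynomial fit can dip below zero.
    return np.maximum(np.polyval(e_coefs, x), 0.0)

# DEUP-style epistemic uncertainty: predicted total error minus the
# (assumed known) aleatoric variance 0.1**2, clamped at zero.
def epistemic(x):
    return np.maximum(e(x) - 0.1 ** 2, 0.0)

print(epistemic(0.5), epistemic(1.9))
```

The design choice illustrated here is the core of DEUP as summarized in the abstract: the error predictor is trained on observed errors of the main model, so it can flag regions where the model is wrong even when the model itself is misspecified and overconfident.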