PhD defense of N. Belkhir: AI/ML/Optimization
On 14/11/2017, by nojhan:
We are pleased to invite you to the doctoral defense of Mr. Nacim Belkhir.
His work in computer science, on the use of machine learning for the configuration of optimization algorithms, was carried out at Université Paris-Sud (doctoral school Sciences et Technologies de l'Information et de la Communication) and at the Thales research center.
The thesis was supervised by Marc Schoenauer (INRIA), Johann Dréo and Pierre Savéant (Thales), on the topic:
« Per Instance Algorithm Configuration for Continuous Black Box Optimization »
The defense will take place on:
Monday, November 20 at 2 p.m.,
in the Shannon Amphitheater of the Laboratoire de Recherche en Informatique at Université Paris-Sud:
Building 660 Claude Shannon,
Precise location: https://www.google.fr/maps/dir/Le+Guichet/660,+Rue+Noetzlin,+91190+Gif-sur-Yvettefirstname.lastname@example.org,2.1674808,17.75z/data=!4m14!4m13!1m5!1m1!1s0x47e678a46da4d41f:0x98e8cdc7ddb3480a!2m2!1d2.191621!2d48.704825!1m5!1m1!1s0x47e67f4f1706c115:0xf7f497934eca22a1!2m2!1d2.1666219!2d48.7124608!3e2?hl=fr
The jury will be composed of the following members:
– Mr. Thomas Stützle (reviewer, Université libre de Bruxelles),
– Mr. Frédéric Saubion (reviewer, Université d'Angers),
– Mr. Emmanuel Vasquez (CentraleSupélec),
– Mr. Sébastien Vérel (Université du Littoral Côte d'Opale),
– Mr. Amir Nakib (Université Paris-Est Créteil).
An abstract of the work can be found below.
Evolutionary Algorithms (EAs) have been widely investigated, in the literature and in many industrial contexts, for solving black-box optimization problems, and have been shown to handle a wide range of them. However, despite many successes, it is widely acknowledged in the optimization community that the quest for a general algorithm that would solve any optimization problem is vain.
This PhD thesis focuses on automated algorithm configuration, which aims at finding the 'best' parameter setting for a given problem or class of problems, where the notion of 'best' is relative to some user-defined performance measure, usually a balance between the computational cost of the optimization and a domain-specific measure of quality (e.g., the precision of the solution in the continuous domain). The Algorithm Configuration problem thus amounts to a meta-optimization problem in the space of parameters, whose meta-objective is the performance of the algorithm at hand under a given parameter configuration. However, in the continuous domain, this meta-objective can only be assessed empirically, at the cost of running the algorithm on some problem instances, which makes direct meta-optimization immensely costly, whether for a single problem or for a set of problems.
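As a minimal illustration of this meta-optimization view (the toy hill-climber, the parameter, and all names below are hypothetical, not taken from the thesis), each candidate configuration is scored by actually running the target algorithm, and the meta-optimizer, here a plain grid search, picks the configuration with the best score:

```python
import random

# Hypothetical target algorithm: a (1+1)-style hill-climber whose single
# parameter, the mutation step size, is what we want to configure.
def run_target_algorithm(step_size, objective, budget=200, seed=0):
    rng = random.Random(seed)
    x = rng.uniform(-5, 5)
    best = objective(x)
    for _ in range(budget):
        cand = x + rng.gauss(0, step_size)
        val = objective(cand)
        if val < best:          # keep the candidate only if it improves
            x, best = cand, val
    return best                 # performance measure: final precision reached

def sphere(x):
    return x * x

# Meta-optimization over the parameter space: every evaluation of a
# configuration costs a full run of the target algorithm.
configs = [0.01, 0.1, 0.5, 1.0, 2.0]
best_cfg = min(configs, key=lambda s: run_target_algorithm(s, sphere))
print(best_cfg)
```

Even this toy setup makes the cost issue visible: scoring five configurations on one instance already takes five full optimization runs, and a realistic parameter space or problem set multiplies that further.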
More recent approaches rely on a description of the objective function in some feature space, and try to learn a mapping from this feature space onto the space of parameter configurations of the algorithm at hand, based on examples of the behavior of several configurations on a training set of objective functions. Along these lines, this PhD thesis focuses on Per Instance Algorithm Configuration (PIAC) for solving continuous black-box optimization problems where only a limited budget of function evaluations is available.
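The learned mapping can be sketched, under the assumption of a simple 1-nearest-neighbour model and made-up two-dimensional feature vectors (the thesis's actual features and learned models differ), as follows:

```python
import math

# Training examples: (instance feature vector, best-known configuration).
# Both the features and the configurations are illustrative placeholders.
training = [
    ((0.9, 0.1), {"step_size": 0.05}),  # e.g. a rugged, multimodal instance
    ((0.2, 0.8), {"step_size": 1.0}),   # e.g. a smooth, unimodal instance
    ((0.5, 0.5), {"step_size": 0.3}),
]

def predict_config(features):
    # 1-nearest-neighbour lookup in feature space
    nearest = min(training, key=lambda pair: math.dist(pair[0], features))
    return nearest[1]

print(predict_config((0.85, 0.15)))  # → {'step_size': 0.05}
```

The key property is that, once trained, predicting a configuration for a new instance costs only a feature computation and a model lookup, not a meta-optimization run.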
We first survey Evolutionary Algorithms for continuous optimization, with a focus on the two algorithms that we use as target algorithms for PIAC: Differential Evolution (DE) and the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Next, we review the state of the art of Algorithm Configuration approaches, and the different features that have been proposed in the literature to describe continuous black-box optimization problems.
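For reference, a minimal DE/rand/1/bin loop, the standard textbook variant rather than the exact implementation studied in the thesis, can be written as:

```python
import random

def de(objective, dim=5, pop_size=20, F=0.5, CR=0.9, budget=2000, seed=1):
    """Minimal DE/rand/1/bin; F and CR are DE's usual control parameters."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    fit = [objective(x) for x in pop]
    evals = pop_size
    while evals < budget:
        for i in range(pop_size):
            # mutation: three distinct individuals, all different from i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantee one mutated coordinate
            trial = [
                pop[a][k] + F * (pop[b][k] - pop[c][k])
                if (rng.random() < CR or k == jrand)
                else pop[i][k]
                for k in range(dim)
            ]
            f_trial = objective(trial)
            evals += 1
            if f_trial <= fit[i]:  # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
    return min(fit)

sphere = lambda x: sum(v * v for v in x)
print(de(sphere))
```

The parameters exposed here (population size, F, CR) are exactly the kind of settings that a PIAC approach would choose per instance.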
Because, in the continuous domain, features are computed from a sample of objective function values, we investigate their computation when only a small budget is available, and we propose a novel methodology based on surrogate modeling to artificially augment the sample set. The resulting features reduce the computational cost compared to fully sampled features, and slightly improve the efficiency of the features on a classification task over optimization problems.
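A rough sketch of the augmentation idea, assuming a simple one-dimensional piecewise-linear surrogate in place of the thesis's actual surrogate model, and a trivial stand-in feature (the sample mean):

```python
import random

def surrogate(xs, ys, x):
    # Piecewise-linear interpolation over the sorted true samples;
    # outside the sampled range, fall back to the nearest endpoint.
    pts = sorted(zip(xs, ys))
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return pts[0][1] if x < pts[0][0] else pts[-1][1]

rng = random.Random(42)
f = lambda x: (x - 1.0) ** 2                    # the "expensive" objective
xs = [rng.uniform(-2, 2) for _ in range(10)]    # small budget of true evaluations
ys = [f(x) for x in xs]

# Augment the 10 true values with 90 cheap surrogate values, then
# compute the feature on the augmented sample.
extra_x = [rng.uniform(-2, 2) for _ in range(90)]
augmented = ys + [surrogate(xs, ys, x) for x in extra_x]
mean_feature = sum(augmented) / len(augmented)
print(round(mean_feature, 3))
```

The point of the sketch is the budget accounting: the objective is evaluated only 10 times, yet the feature is computed from 100 sample values.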
We then introduce a general methodology to empirically study PIAC in the continuous domain, so that all the components of PIAC can be explored in real-world conditions. To this end, we also introduce a new continuous black-box test bench, distinct from the well-known BBOB benchmark, composed of several multi-dimensional test functions with different problem properties, gathered from the literature.
The methodology is finally applied to two EAs. First, we use Differential Evolution as the target algorithm and explore all the components of PIAC, empirically assessing the best choice for each. Second, building on the results with DE, we empirically investigate PIAC with the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) as the target algorithm. Both use cases empirically validate the proposed methodology on the new black-box test bench, for dimensions up to 100.