Ph.D.
Group: Learning and Optimization
Surrogate-Assisted Evolutionary Algorithms
Started on 01/09/2009
Advisor: SCHOENAUER, Marc
Funding: Fixed-term position (CDD) on an INRIA contract
Affiliation : Université Paris-Saclay
Laboratory: LRI/TAO
Defended on 08/01/2013. Committee:
- Frédéric Bonnans, DR INRIA, INRIA Saclay, École Polytechnique, France (Examiner)
- Michael Emmerich, Assistant Professor, Leiden University, The Netherlands (Examiner)
- Christian Igel, Professor, University of Copenhagen, Denmark (Reviewer)
- Una-May O’Reilly, Principal Research Scientist, MIT CSAIL, USA (Examiner)
- Marc Schoenauer, DR INRIA, INRIA Saclay, LRI/TAO, France (Thesis supervisor)
- Michèle Sebag, DR CNRS, University of Paris-Sud 11, LRI/TAO, France (Thesis supervisor)
- François Yvon, Professor, University of Paris-Sud 11, LIMSI/TLP, France (Examiner)
Reviewers (rapporteurs):
- Christian Igel, Professor, University of Copenhagen, Denmark
- Yaochu Jin, Professor, University of Surrey, UK
Research activities:
Abstract:
Evolutionary Algorithms (EAs) have received a lot of attention regarding their potential to solve complex optimization problems using problem-specific variation operators.
A search directed by a population of candidate solutions is quite robust with respect to moderate noise and to the multi-modality of the optimized function, in contrast to classical optimization methods such as quasi-Newton methods. However, the main limitation of EAs, the large number of function evaluations required,
prevents their use on computationally expensive problems, where a single evaluation takes much longer than one second.
The present thesis focuses on an evolutionary algorithm, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), which has become a standard, powerful tool for continuous black-box optimization. We present several state-of-the-art algorithms, derived from CMA-ES, for solving single- and multi-objective black-box optimization problems.
First, in order to deal with expensive optimization, we propose to use comparison-based surrogate (approximation) models of the optimized function,
which do not exploit function values of candidate solutions, but only their quality-based ranking.
The resulting self-adaptive surrogate-assisted CMA-ES represents a tight coupling of statistical machine learning and CMA-ES, where the surrogate model is built taking advantage of the topology of the function induced by the covariance matrix adapted by CMA-ES. This preserves two key invariance properties of CMA-ES: invariance with respect to (i) monotonous transformations of the function values and (ii) orthogonal transformations of the search space.
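A minimal Python sketch of the comparison-based idea follows (it is not the thesis implementation, which uses a Ranking SVM with a kernel defined through the covariance matrix adapted by CMA-ES): the surrogate is trained only on the ranking of evaluated points, via the standard reduction of pairwise ranking to binary classification, so any strictly increasing transformation of the objective yields exactly the same training data.

    import numpy as np
    from sklearn.svm import LinearSVC

    def fit_ranking_surrogate(X, f_values):
        # Only the ordering of f_values is used: consecutive points in the
        # ranking yield (better - worse) difference vectors labeled +1/-1.
        order = np.argsort(f_values)
        pairs, labels = [], []
        for i in range(len(order) - 1):
            better, worse = X[order[i]], X[order[i + 1]]
            pairs.append(better - worse); labels.append(+1)
            pairs.append(worse - better); labels.append(-1)
        return LinearSVC(C=1.0).fit(np.asarray(pairs), labels)

    def surrogate_score(model, X):
        # Lower score = predicted better point (minimization convention).
        return -model.decision_function(X)

    # Toy check: f and exp(f) induce identical rankings, hence identical
    # training pairs and the same surrogate ordering of new points.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 5))
    f = np.sum(X**2, axis=1)
    model_a = fit_ranking_surrogate(X, f)
    model_b = fit_ranking_surrogate(X, np.exp(f))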
For multi-objective optimization we propose two mono-surrogate approaches: (i) a mixed variant combining a One-Class Support Vector Machine (SVM) for dominated points with a Regression SVM for non-dominated points; (ii) a Ranking SVM for preference learning over candidate solutions in the multi-objective space. We further integrate these two approaches into the multi-objective CMA-ES (MO-CMA-ES) and discuss aspects of surrogate-model exploitation.
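As an illustration of the preference-learning side (a hypothetical helper, not the thesis code), Pareto dominance between evaluated points directly provides the ordered pairs on which a rank-based surrogate such as a Ranking SVM can be trained:

    import numpy as np

    def dominates(a, b):
        # True if objective vector a Pareto-dominates b (minimization).
        return np.all(a <= b) and np.any(a < b)

    def preference_pairs(X, F):
        # (better, worse) decision-vector pairs implied by dominance in the
        # objective space; these are the training comparisons for the surrogate.
        pairs = []
        for i in range(len(F)):
            for j in range(len(F)):
                if i != j and dominates(F[i], F[j]):
                    pairs.append((X[i], X[j]))
        return pairs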
Second, we introduce and discuss several algorithms developed to understand, explore and expand the frontiers of Evolutionary Computation, and of CMA-ES in particular. We introduce a linear-time Adaptive Coordinate Descent method for non-linear optimization, which inherits a CMA-like procedure for adapting an appropriate coordinate system without losing the original simplicity of Coordinate Descent.
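A minimal sketch of the coordinate-descent idea, assuming a crude PCA-based re-estimation of the coordinate system as a stand-in for the CMA-like adaptive encoding used in the thesis:

    import numpy as np

    def adaptive_coordinate_descent(f, x0, sigma=1.0, iterations=200):
        n = len(x0)
        x, fx = np.array(x0, float), f(x0)
        B = np.eye(n)                      # current orthonormal coordinate system
        steps = np.full(n, sigma)          # per-coordinate step sizes
        history = [x.copy()]
        for t in range(iterations):
            i = t % n                      # cycle through adapted coordinates
            for direction in (+1.0, -1.0):
                y = x + direction * steps[i] * B[:, i]
                fy = f(y)
                if fy < fx:
                    x, fx = y, fy
                    steps[i] *= 2.0        # expand the step on success
                    break
            else:
                steps[i] *= 0.5            # shrink the step on failure
            history.append(x.copy())
            if len(history) > 2 * n and i == n - 1:
                # Crude adaptation: align B with principal axes of recent moves.
                H = np.array(history[-2 * n:]) - x
                _, _, Vt = np.linalg.svd(H, full_matrices=False)
                B = Vt.T
        return x, fx

    x_best, f_best = adaptive_coordinate_descent(lambda z: np.sum(z**2), np.ones(5))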
For multi-modal optimization we propose to adaptively select the most suitable restart regime of CMA-ES, and we introduce corresponding alternative restart strategies.
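A minimal sketch of adaptive restart-regime selection, assuming a hypothetical run_cma(x0, sigma, popsize) helper that performs one CMA-ES run and returns its best fitness; the two regimes and the reward update below are illustrative, not the strategies studied in the thesis:

    import numpy as np

    def adaptive_restarts(run_cma, dim, budget_runs=20, seed=0):
        rng = np.random.default_rng(seed)
        names = ["large_pop", "small_pop"]
        reward = {n: 1.0 for n in names}          # optimistic initial rewards
        default_popsize = 4 + int(3 * np.log(dim))
        popsize, best = default_popsize, np.inf
        for _ in range(budget_runs):
            # Pick the restart regime proportionally to its recent improvement.
            probs = np.array([reward[n] for n in names])
            name = rng.choice(names, p=probs / probs.sum())
            if name == "large_pop":
                popsize *= 2                      # IPOP-style population doubling
                run_popsize, sigma = popsize, 2.0
            else:
                run_popsize = default_popsize     # BIPOP-style small restart
                sigma = 2.0 * 10 ** (-2 * rng.random())
            f_run = run_cma(x0=rng.uniform(-4, 4, dim), sigma=sigma, popsize=run_popsize)
            improvement = 0.0 if not np.isfinite(best) else max(best - f_run, 0.0)
            reward[name] = 0.8 * reward[name] + 0.2 * improvement
            best = min(best, f_run)
        return best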
For multi-objective optimization we analyze case studies where the original parent selection procedures of MO-CMA-ES are inefficient, and we introduce reward-based parent selection strategies focused on the comparative success of newly generated solutions.
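A minimal sketch of the reward-based selection idea (illustrative, not the MO-CMA-ES variants analyzed in the thesis): parents whose recent offspring were comparatively successful, e.g. dominated their parent or entered the current non-dominated set, are sampled more often.

    import numpy as np

    def select_parent(rewards, rng):
        # Sampling probability proportional to each parent's smoothed reward.
        weights = np.asarray(rewards, dtype=float) + 1e-12
        return rng.choice(len(rewards), p=weights / weights.sum())

    def update_reward(rewards, parent_idx, offspring_successful, decay=0.9):
        # Comparative success: 1 if the offspring improved on the population
        # (e.g. dominated its parent or entered the archive), 0 otherwise.
        rewards[parent_idx] = decay * rewards[parent_idx] + (1 - decay) * float(offspring_successful)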