Supervisors: Anne Auger, Nikolaus Hansen
Context
Evolution strategies are optimization algorithms that search for good solutions in continuous search spaces, where "good" is defined by a given fitness or objective function. Often the fitness function is assumed to be a black-box, for example implemented by a comparatively complex computer program. Originally inspired by biological evolution almost fifty years ago, evolution strategies have matured and become competitive and comparatively well understood. The covariance matrix adaptation evolution strategy (CMA-ES) is a modern variant that samples new solutions from a multivariate normal distribution and adapts the variances and covariances of this distribution. Many interesting interpretations and justifications of the CMA-ES have been proposed, and most recently it has been shown to conduct a natural gradient descent in the distribution space (as opposed to the search space). Furthermore, the CMA-ES has proven useful in a wide range of applications such as model calibration and shape optimization. We propose several topics connected to CMA-ES. More or different topics might be available on request.
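To make the sampling principle concrete, here is a minimal Python sketch, a toy under stated assumptions rather than the actual CMA-ES: candidates are drawn from a multivariate normal distribution, and the selected best samples update the mean and, via a crude smoothed estimate that stands in for the actual rank-one and rank-mu updates, the covariance matrix. Step-size adaptation is omitted altogether; the sphere function and all constants are illustrative choices.

    import numpy as np

    def sphere(x):
        return np.sum(x**2)  # toy fitness function (to be minimized)

    dim, lam, mu = 5, 12, 6      # dimension, population size, number of parents
    mean = np.random.randn(dim)  # mean of the sampling distribution
    C = np.eye(dim)              # covariance matrix
    sigma = 1.0                  # step-size (kept constant in this sketch)

    for generation in range(300):
        X = np.random.multivariate_normal(mean, sigma**2 * C, size=lam)
        parents = X[np.argsort([sphere(x) for x in X])[:mu]]  # mu best samples
        old_mean, mean = mean, parents.mean(axis=0)           # move the mean
        Z = (parents - old_mean) / sigma
        C = 0.9 * C + 0.1 * (Z.T @ Z) / mu  # crude stand-in for the CMA update
    print(mean)  # moves toward the optimum at zero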
What can you learn?
The objective of all our projects is to learn how to address a small, or not so small, scientific question in a scientific way. The typical means for addressing a question are careful thinking, skepticism, a pencil and paper, (basic) math, writing (small) programs, running (small or not so small) computer simulations, careful interpretation of results... and many iterations...
Subjects
Injecting proposal solutions
The CMA-ES is a carefully designed method that exploits, in several of its steps, the fact that the sampled solutions stem from a normal distribution. This is usually an advantage, but it can lead to failure if solutions from a different distribution are injected into the algorithm. Injecting (good) solutions can indeed be useful in many different contexts. The objective of this work is to identify and understand the mechanisms of failure and to find a resolution. The typical steps in this kind of algorithm design task are listed below (a small code sketch of the injection itself follows the list):
- setup of a prototypical, fast to simulate scenario and identification of the problem(s)
- figure out possible solutions
- rapid prototyping of possible solutions and of online tests that serve to try to falsify the solutions on-the-fly
- the surviving solution(s) become candidate(s) of a more thorough empirical study which includes
  - design of test cases
  - design of experiments
  - presentation and interpretation of the results
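As a starting point, the following hedged Python sketch illustrates the injection itself: an external solution replaces one sampled candidate. The rescaling of overly long steps via their Mahalanobis length is one conceivable safeguard, included here purely as an assumption; whether and how such a mechanism should interact with the mean, step-size, and covariance updates is the actual subject of the project.

    import numpy as np

    def inject(X, mean, C, sigma, x_external, max_norm=None):
        """Replace the last sampled candidate in X by an external solution."""
        step = (x_external - mean) / sigma
        m_norm = np.sqrt(step @ np.linalg.inv(C) @ step)  # Mahalanobis length
        if max_norm is None:
            # heuristic cap near the typical length of a normal sample
            max_norm = np.sqrt(len(mean)) + 2.0
        if m_norm > max_norm:
            step *= max_norm / m_norm  # rescale an overly long step
        X = X.copy()
        X[-1] = mean + sigma * step
        return X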
The emphasis in this project is not so much on exhaustive empirical investigation as on understanding the algorithm.
Mirroring of proposal solutions
Recently, our group has analyzed the effects of sampling pairwise mirrored solutions around a symmetry point, the distribution mean. This technique resembles the computation of a gradient by central finite differences. The technique can lead to a non-negligible improvement of the convergence rate, see e.g. Brockhoff et al. 2010. Following up on these results, we want to investigate the effect of mirroring on the covariance matrix adaptation mechanism. First, possible algorithm variants are considered in a thought experiment. Promising variants will be implemented and empirically investigated under different scenarios.
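For concreteness, a small Python sketch of pairwise mirrored sampling (assuming an even population size lam; names and constants are illustrative):

    import numpy as np

    def mirrored_sample(mean, C, sigma, lam):
        """Return lam candidates sampled as lam/2 mirrored pairs (lam even)."""
        A = np.linalg.cholesky(C)  # C = A A^T
        Z = np.random.randn(lam // 2, len(mean)) @ A.T  # normal samples
        return np.concatenate([mean + sigma * Z, mean - sigma * Z])

Note that a mirrored pair +z and -z contributes the identical outer product z z^T to a covariance matrix update, which already hints at why the interaction between mirroring and covariance matrix adaptation deserves a closer look.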
Measuring uncertainty with rank correlation
More recently, a rank-based uncertainty (or noise) measurement has been proposed in the context of CMA-ES, see Hansen et al. 2009. The measurement counts rank changes induced by the reevaluation of solutions. This more theoretical task will try to reformulate the algorithm using well-known rank-correlation coefficients, specifically Kendall's tau. For this task both computations have to be deeply understood and formulated within a single algorithm. After the differences have been identified, a second, empirical part will aim to quantify them.
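The two ingredients can be juxtaposed in a toy Python sketch (the actual measurement in Hansen et al. 2009 is based on a combined ranking of first and second evaluations; this fragment only contrasts a crude rank-change count with Kendall's tau, on made-up values):

    import numpy as np
    from scipy.stats import kendalltau

    f_first  = np.array([3.1, 1.2, 4.0, 2.2, 0.9])  # first evaluations
    f_second = np.array([2.9, 1.5, 3.8, 2.6, 1.1])  # noisy reevaluations

    r1 = np.argsort(np.argsort(f_first))    # ranks under the first evaluation
    r2 = np.argsort(np.argsort(f_second))   # ranks under the reevaluation
    n_changes = np.sum(r1 != r2)            # crude count of changed ranks

    tau, _ = kendalltau(f_first, f_second)  # Kendall's tau in [-1, 1]
    print(n_changes, tau)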
Solving packing problems
The task is to apply CMA-ES to packing problems, see Packomania. The main steps are listed below (a sketch of one possible fitness formulation follows the list):
- careful consideration of the formulation of the actually used fitness function
- design of the experiments
- presentation and interpretation of the results
- comparison of the results obtained with the results of other competitive methods
- (optional) writing a demo software
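As an illustration of the first step, here is one possible, assumption-laden fitness formulation in Python for packing n equal circles of radius r into the unit square; the function name and the quadratic penalties are illustrative choices, and finding a good formulation is precisely the point of the project:

    import numpy as np

    def packing_penalty(x, r):
        """x: flat array of 2n circle-center coordinates in [0, 1]^2."""
        centers = x.reshape(-1, 2)
        penalty = 0.0
        for i in range(len(centers)):        # overlap: centers closer than 2r
            for j in range(i + 1, len(centers)):
                d = np.linalg.norm(centers[i] - centers[j])
                penalty += max(0.0, 2 * r - d) ** 2
        # boundary: centers closer than r to one of the four walls
        penalty += np.sum(np.maximum(0.0, r - centers) ** 2)
        penalty += np.sum(np.maximum(0.0, centers - (1 - r)) ** 2)
        return penalty  # zero exactly when the packing is feasible

The largest feasible radius could then, for instance, be approached by bisection on r, declaring a radius feasible when the optimized penalty reaches zero.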
Exploring Similarities and Differences between CMA-ES and the Cross-Entropy method
The cross-entropy method is a recent and popular optimization algorithm. Its continuous variant samples a population from a multivariate normal distribution and adapts the parameters of the distribution by minimizing the Kullback-Leibler divergence. The internship will consist of exploring similarities and differences between the cross-entropy method and the CMA-ES algorithm. The main steps, followed below by a minimal sketch of the cross-entropy method, will be to
- identify similarities and differences theoretically
- come up with an experimental set-up to quantify the impact of the observed differences on the performance of the methods
- implement the experiments and compare the results
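For reference, a minimal Python sketch of the continuous cross-entropy method under common assumptions (the elite fraction and smoothing constant are illustrative): refitting mean and covariance to the elite samples by maximum likelihood is what minimizes the Kullback-Leibler divergence to the elite's empirical distribution.

    import numpy as np

    def cross_entropy_method(f, mean, C, lam=50, elite_frac=0.2,
                             iterations=100, alpha=0.7):
        n_elite = max(2, int(elite_frac * lam))
        for _ in range(iterations):
            X = np.random.multivariate_normal(mean, C, size=lam)
            elite = X[np.argsort([f(x) for x in X])[:n_elite]]
            # maximum-likelihood refit, smoothed with the previous parameters
            mean = alpha * elite.mean(axis=0) + (1 - alpha) * mean
            C = alpha * np.cov(elite, rowvar=False) + (1 - alpha) * C
        return mean

    print(cross_entropy_method(lambda x: np.sum(x**2), np.ones(5), np.eye(5)))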
Prerequisites
- basic English knowledge
- interest in scientific questions
- basic math and programming skills or talent
- willingness to learn and improve
In case of interest please contact anne.auger {at} lri.fr or nikolaus.hansen {at} lri.fr with the subject line "stage autour de CMA-ES" (this is how we find out that you have found this page and read it to the end).