2021 Genetic and Evolutionary Computation Conference (GECCO'21)
Lille, France, July 10-14, 2021
http://iao.hfuu.edu.cn/aaboh21
The Analysing Algorithmic Behaviour of Optimisation Heuristics Workshop (AABOH), part of the 2021 Genetic and Evolutionary Computation Conference (GECCO'21), invited the submission of original and unpublished research papers. The AABOH Call for Papers (CfP) is available for download in PDF format and as a plain-text file.
Optimisation and machine learning tools are among the most widely used tools in a modern world of omnipresent computing devices, yet their internal dynamics have not been analysed in detail. This scarcity of knowledge about the inner workings of heuristic methods is largely attributed to the complexity of the underlying processes, which cannot be subjected to a complete theoretical analysis. It is, however, also partially due to superficial experimental set-ups and, consequently, superficial interpretations of numerical results. Indeed, researchers and practitioners typically look only at the final result produced by these methods, while the vast amount of information collected over the run(s) is wasted. In light of these considerations, it is becoming increasingly evident that such information can be useful and that design principles should be defined that allow for online or offline analysis of the processes taking place in the population and of their dynamics.
Hence, with this workshop, we call for both theoretical and empirical achievements that identify the desired features of optimisation and machine learning algorithms, quantify the importance of such features, spot the presence of intrinsic structural biases and other undesired algorithmic flaws, and study transitions in algorithmic behaviour in terms of convergence, anytime behaviour, performance, robustness, and so on. The goal is to gather the most recent advances that fill the aforementioned knowledge gap and to disseminate the current state of the art within the research community.
Topics of Interest
We encourage submissions exploiting carefully designed experiments or data-heavy approaches that can help analyse primary algorithmic behaviours and model the internal dynamics that cause them. As an indication, some (but not all) relevant topics of interest are listed below:
- global search vs. local search,
- exploration vs. exploitation,
- time and space complexity,
- premature convergence and stagnation,
- structural bias,
- genotypic or phenotypic diversity,
- robustness of the produced solution,
- secondary benchmarking,
- anytime performance.
All accepted papers of this workshop will be included in the Proceedings of the Genetic and Evolutionary Computation Conference (GECCO'21) Companion Volume.
Program
The AABOH workshop took place on Sunday, July 11, 2021 in two sessions from 13:30-15:20 and 16:00-17:50.
- Session 1: 13:30-15:20
- 13:32-13:50: Lucija Planinić, Marko Durasevic, Luca Mariot, Domagoj Jakobovic, Stjepan Picek, and Carlos Coello Coello. On the Genotype Compression and Expansion for Evolutionary Algorithms in the Continuous Domain. doi:10.1145/3449726.3463169.
This paper investigates the influence of genotype size on evolutionary algorithms' performance. We consider genotype compression (where the genotype is smaller than the phenotype) and expansion (where the genotype is larger than the phenotype) and define different strategies to reconstruct the original variables of the phenotype from both the compressed and expanded genotypes. We test our approach with several evolutionary algorithms over three sets of optimization problems: COCO benchmark functions, modeling of Physical Unclonable Functions, and neural network weight optimization. Our results show that genotype expansion works significantly better than compression and, in many scenarios, outperforms the original genotype encoding. This could be attributed to the change in the genotype-phenotype mapping introduced with the expansion methods: this modification beneficially transforms the domain landscape and eases the traversal of the search space. (A simplified sketch of such an expansion mapping is given after the program listing below.)
- 13:50-14:08: Rick Boks, Anna Kononova, and Hao Wang. Quantifying the Impact of Boundary Constraint Handling Methods on Differential Evolution. doi:10.1145/3449726.3463214.
Constraint handling is one of the most influential aspects of applying metaheuristics to real-world applications, and can hamper the search progress if treated improperly. In this work, we focus on a particular case - the box constraints, for which many boundary constraint handling methods (BCHMs) have been proposed. We call for the necessity of studying the impact of BCHMs on metaheuristics' performance and behavior, which has seemingly received little attention in the field. We target quantifying such impacts through systematic benchmarking by investigating 28 major variants of Differential Evolution (DE) taken from the modular DE framework (by combining different mutation and crossover operators) and 13 commonly applied BCHMs, resulting in 28 × 13 = 364 algorithm instances after pairing DE variants with BCHMs. After executing the algorithm instances on the well-known BBOB/COCO problem set, we analyze the best-reached objective function value (performance-wise) and the percentage of repaired solutions (behavioral) using statistical ranking methods for each combination of mutation, crossover, and BBOB function group. Our results clearly show that the choice of BCHM substantially affects the empirical performance as well as the number of generated infeasible solutions, which allows us to provide general guidelines for selecting an appropriate BCHM for a given scenario. (An illustrative sketch of typical box-constraint repair rules is given after the program listing below.)
- 14:08-14:26: Tobias van Driessel and Dirk Thierens. Benchmark Generator for TD Mk Landscapes. doi:10.1145/3449726.3463177.
We introduce a publicly available benchmark generator for Tree Decomposition (TD) Mk Landscapes. TD Mk Landscapes were introduced by Whitley et al. to get rid of unnecessary restrictions of Adjacent NK Landscapes while still allowing for the calculation of the global optimum in polynomial time. This makes TD Mk Landscapes more lenient while still being as convenient as Adjacent NK Landscapes. Together, these properties make it very suitable for benchmarking blackbox algorithms. Whitley et al., however, introduced a construction algorithm that only constructs Adjacent NK Landscapes. Recently, Thierens et al. introduced an algorithm, CliqueTreeMk, to construct any TD Mk Landscape and find its optimum. In this work, we introduce CliqueTreeMk in more detail, implement it for public use, and show some results for LT-GOMEA on an example TD Mk Landscape problem. The results show that deceptive trap problems with higher overlap do not necessarily decrease performance and effectiveness for LT-GOMEA.
- 14:26-14:44: Helena Stegherr and Michael Heider. Design of Large-Scale Metaheuristic Component Studies. doi:10.1145/3449726.3463168.
Metaheuristics employ a variety of different components using a wide array of operators to execute their search. This determines their intensification, diversification and all other behavioural features and is thus critical for success on different optimisation problems. Choosing the right components with the right operators remains a difficult task. In this paper we propose a design of experiments that should be used for extensive component studies. We demonstrate the applicability of this design by exploring the differences in operator-specific performance in two closely related metaheuristic frameworks — the well-known (μ+λ)-Evolution Strategy and the strongly metaphor-focussed Invasive Weed Optimisation — where operators show varying degrees of similarity in different components. This experiment shows that similarity of operators does not comprehensively account for similarity in performance. Presumably small changes of an operator can influence the algorithmic behaviour more than the utilisation of a completely different operator in another component. Even when employed in different combinations, these influences remain strong. This emphasises the need for a more detailed analysis of the specific effects of components and their respective operators on the search process.
- 14:44-15:02: Bas van Stein, Fabio Caraffini, and Anna Kononova. Emergence of Structural Bias in Differential Evolution. doi:10.1145/3449726.3463223.
Heuristic optimisation algorithms are in high demand due to the overwhelming number of complex optimisation problems that need to be solved. The complexity of these problems is well beyond the boundaries of applicability of exact optimisation algorithms, and they therefore require modern heuristics to find feasible solutions quickly. These heuristics and their effects are almost always evaluated and explained on particular problem instances. In previous works, it has been shown that many such algorithms exhibit structural bias, by either being attracted to a certain region of the search space or by consistently avoiding regions of the search space, on a special test function designed to ensure uniform 'exploration' of the domain. In this paper, we analyse the emergence of such structural bias for Differential Evolution (DE) configurations and, specifically, the effect of different mutation, crossover and correction strategies. We also analyse the emergence of the structural bias during the run-time of each algorithm. We conclude with recommendations on which configurations should be avoided in order to run DE without such bias. (A sketch of the random-objective test protocol used to detect structural bias is given after the program listing below.)
- 15:02-15:20: Diederick Vermetten, Anna Kononova, Fabio Caraffini, Hao Wang, and Thomas Bäck. Is there Anisotropy in Structural Bias? doi:10.1145/3449726.3463218.
Structural Bias (SB) is an important type of algorithmic deficiency within iterative optimisation heuristics. However, methods for detecting structural bias have not yet fully matured, and recent studies have uncovered many interesting questions. One of these is the question of how structural bias can be related to anisotropy. Intuitively, an algorithm that is not isotropic would be considered structurally biased. However, there have been cases where algorithms appear to only show SB in some dimensions. As such, we investigate whether these algorithms actually exhibit anisotropy, and how this impacts the detection of SB. We find that anisotropy is very rare, and even in cases where it is present, there are clear tests for SB which do not rely on any assumptions of isotropy, so we can safely expand the suite of SB tests to encompass these kinds of deficiencies not found by the original tests. We propose several additional testing procedures for SB detection and aim to motivate further research into the creation of a robust portfolio of tests. This is crucial since no single test will be able to work effectively with all types of SB we identify.
- Session 2: 16:00-17:50
- 16:00-16:27: Introduction to the AABOH Workshop (Speaker: Anna Kononova)
- 16:27-16:55: Workshop Keynote: On the Bias of Point Distributions on the Pareto front in Indicator-Based Multiobjective Optimization by Michael Emmerich.
In a posteriori multiobjective optimization ('Pareto Optimization') by means of Indicator-Based Evolutionary Algorithms (IBEAs), indicators are used to assess the quality of Pareto-front approximations. Examples are the Inverted Generational Distance, R2, the Riesz s-energy, and the Hypervolume Indicator. A problem that arises when using these indicators is that they induce a bias in the density of points that is often unintended and depends on the shape of the Pareto-optimal set. An important mathematical concept here is the so-called mu-optimal distribution, i.e. the point set that optimizes an indicator for a given Pareto front. The talk summarizes what is understood so far about such biases of indicator-optimal distributions, with results mainly confined to low dimensions, and highlights which challenges remain. The bias might also be introduced by the search behaviour of the algorithm itself (search dynamics, drift effects). This latter topic has been less studied in multiobjective optimization but might also play an important role in understanding results, in particular for multimodal landscapes. The talk discusses both mathematical and practical results related to these questions and highlights interesting open problems. (A small worked example of the hypervolume indicator in two dimensions is given after the program listing below.)
- 16:55-17:22: Workshop Keynote: Patterns of Population Dynamics in Differential Evolution: Merging Theoretical and Empirical Results by Daniela Zaharie.
The dynamics of the population of individuals generated by Differential Evolution (DE) strategies is highly influenced by the use of the specific difference-based mutation, the main source of variation being the population diversity. The direction and magnitude of the search are mainly influenced by the distribution of the current population and by the control parameters (e.g. scale factor, crossover rate), whose interplay can lead to different behavioural patterns, including some undesired ones: premature convergence (too exploitative) and slow convergence (too explorative). Quantifying the exploration/exploitation ratio and finding the right balance that leads to an effective search are difficult tasks. However, statistical characteristics of the population (e.g. the expected population variance or covariance matrix) can be used to predict, at least partially, the population dynamics and to adaptively direct the search so as to avoid undesired behaviours. For instance, premature convergence can be avoided by not systematically sampling values for the control parameters from so-called critical regions. Unfortunately, most of the theoretical results on DE population dynamics rely on simplifying assumptions, so using them to guide the search process might require adjustments based on information collected during the search. Such an approach will be illustrated in a context where the DE-specific dynamics are altered by repair rules used to satisfy bounding-box constraints. (A sketch that tracks the population variance over a DE run is given after the program listing below.)
- 17:22-17:50: Workshop Keynote: Applications of Bayesian Optimization in Natural Language Processing by Lingpeng Kong.
When applying machine learning to problems in natural language processing (NLP), there are many choices to make about how to represent input texts. These choices can have a big effect on performance, but they are often uninteresting to researchers or practitioners who simply need a module that performs well. In this talk, we will present the use of sequential model-based optimization over this space of choices and show that it makes standard linear models competitive with more sophisticated, expensive state-of-the-art methods based on latent variables or neural networks on various topic classification and sentiment analysis problems. We will also discuss the recent development of the large-scale pretraining/fine-tuning scheme and the influence of hyper-parameter tuning under this scheme. (A sketch of sequential model-based optimisation over representation choices is given after the program listing below.)
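The short sketches that follow illustrate, in simplified and hedged form, some of the mechanisms discussed in the talks above; none of them reproduce the authors' actual implementations. First, relating to the talk on genotype compression and expansion, a minimal Python sketch of one plausible decoding scheme in which each phenotype variable is reconstructed as the mean of k consecutive genotype variables (the factor k, the averaging rule, and the function names are illustrative assumptions, not the strategies evaluated in the paper):

```python
import numpy as np

def expand_decode(genotype: np.ndarray, k: int) -> np.ndarray:
    """Decode an expanded genotype: each phenotype variable is the mean of
    k consecutive genotype variables (illustrative expansion mapping)."""
    assert len(genotype) % k == 0, "genotype length must be a multiple of k"
    return genotype.reshape(-1, k).mean(axis=1)

def compress_decode(genotype: np.ndarray, k: int) -> np.ndarray:
    """Decode a compressed genotype: each genotype variable is repeated k
    times to rebuild the phenotype (illustrative compression mapping)."""
    return np.repeat(genotype, k)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sphere = lambda x: float(np.sum(x ** 2))   # objective defined on the phenotype
    genotype = rng.uniform(-5, 5, size=30)     # expanded genotype with k = 3
    phenotype = expand_decode(genotype, k=3)   # 10 phenotype variables
    print("f(phenotype) =", sphere(phenotype))
```

An evolutionary algorithm would operate on the 30-dimensional genotype while the objective function only ever sees the 10-dimensional phenotype; the decoding step is what changes the induced landscape.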
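For the talk on boundary constraint handling in Differential Evolution, a hedged sketch of three generic box-constraint repair rules (saturation, mirroring, and re-initialisation). These are common textbook variants rather than necessarily the thirteen BCHMs benchmarked in the paper, and all function names and values are illustrative:

```python
import numpy as np

def saturate(x, lb, ub):
    """Clamp out-of-bounds components onto the nearest boundary."""
    return np.clip(x, lb, ub)

def mirror(x, lb, ub):
    """Reflect out-of-bounds components back into the box."""
    y = x.copy()
    low, high = y < lb, y > ub
    y[low] = 2 * lb[low] - y[low]
    y[high] = 2 * ub[high] - y[high]
    return np.clip(y, lb, ub)   # guard against violations larger than the box width

def reinitialise(x, lb, ub, rng):
    """Resample out-of-bounds components uniformly inside the box."""
    y = x.copy()
    bad = (y < lb) | (y > ub)
    y[bad] = rng.uniform(lb[bad], ub[bad])
    return y

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    lb, ub = np.full(5, -5.0), np.full(5, 5.0)
    trial = np.array([-7.2, 0.3, 5.9, 4.1, -5.0])   # DE trial vector with two violations
    print("saturate:    ", saturate(trial, lb, ub))
    print("mirror:      ", mirror(trial, lb, ub))
    print("reinitialise:", reinitialise(trial, lb, ub, rng))
```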
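For the two talks on structural bias, the sketch below follows the general detection protocol they build on: run an optimiser on a test function whose values are independent uniform random numbers, so that any non-uniformity in the locations of the returned optima signals structural bias. SciPy's differential_evolution is used purely as a stand-in optimiser, and the run count, budget, and significance threshold are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import kstest

def f0(_x):
    """Objective whose value is independent of x: a fresh uniform random number.
    On such a function, an unbiased optimiser should return final positions that
    are uniformly distributed over the domain."""
    return np.random.uniform()

def structural_bias_test(dim=5, runs=50, seed=0):
    rng = np.random.default_rng(seed)
    finals = np.empty((runs, dim))
    for r in range(runs):
        res = differential_evolution(f0, bounds=[(0.0, 1.0)] * dim,
                                     maxiter=50, popsize=10, tol=0, polish=False,
                                     seed=int(rng.integers(1 << 31)))
        finals[r] = res.x
    # Per-dimension Kolmogorov-Smirnov test of the final positions against U(0, 1).
    return [kstest(finals[:, d], "uniform").pvalue for d in range(dim)]

if __name__ == "__main__":
    pvalues = structural_bias_test()
    print("per-dimension KS p-values:", np.round(pvalues, 3))
    print("dimensions flagged at the 1% level:",
          [d for d, p in enumerate(pvalues) if p < 0.01])
```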
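For the first keynote, a small worked example of one of the indicators it mentions: the two-dimensional hypervolume (for minimisation) dominated by a Pareto-front approximation with respect to a reference point. This is a textbook sweep formulation; the example front and reference point are made up for illustration:

```python
def hypervolume_2d(points, ref):
    """Area dominated by a set of 2-D objective vectors (minimisation),
    measured with respect to the reference point `ref`."""
    # Keep only points that strictly dominate the reference point, sorted by f1.
    pts = sorted(p for p in points if p[0] < ref[0] and p[1] < ref[1])
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:                        # skip dominated points
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

if __name__ == "__main__":
    front = [(0.1, 0.9), (0.4, 0.5), (0.8, 0.2)]
    print(hypervolume_2d(front, ref=(1.0, 1.0)))   # approximately 0.39
```

How a fixed number of points should be spread along the front to maximise this area is exactly the kind of mu-optimal-distribution question the keynote addresses.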
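For the second keynote, a sketch that tracks the empirical per-coordinate population variance of a plain DE/rand/1/bin run, i.e. the quantity whose expected evolution (as a function of the scale factor F, the crossover rate CR, and the population size) underlies the theoretical results mentioned in the abstract. The implementation and parameter values are illustrative, and the variance factor quoted in the comments is reproduced from memory and should be treated as an assumption rather than an exact statement of the theory:

```python
import numpy as np

def de_variance_trace(dim=10, NP=30, F=0.5, CR=0.9, gens=100, seed=2):
    """Run a plain DE/rand/1/bin on the sphere function and record the mean
    per-coordinate population variance after every generation."""
    rng = np.random.default_rng(seed)
    f = lambda x: float(np.sum(x ** 2))
    pop = rng.uniform(-5, 5, size=(NP, dim))
    fit = np.array([f(ind) for ind in pop])
    trace = [pop.var(axis=0).mean()]
    for _ in range(gens):
        for i in range(NP):
            r1, r2, r3 = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True          # force at least one mutant component
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial <= fit[i]:                    # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
        trace.append(pop.var(axis=0).mean())
    return np.array(trace)

if __name__ == "__main__":
    F, CR, NP = 0.5, 0.9, 30
    trace = de_variance_trace(NP=NP, F=F, CR=CR)
    # Expected variance factor of the trial population (before selection) for
    # DE/rand/1/bin, quoted from memory as an assumption:
    # c = 2*F^2*p - 2*p/NP + p^2/NP + 1, with p ~ CR.  Selection typically
    # shrinks the variance further, so the empirical ratios are usually below c.
    c = 2 * F**2 * CR - 2 * CR / NP + CR**2 / NP + 1
    print("assumed pre-selection variance factor c =", round(c, 3))
    print("empirical generation-to-generation variance ratios (first 5):",
          np.round(trace[1:6] / trace[:5], 3))
```

Tracking such a trace over the run is a simple example of using information collected during the search, rather than only the final result, to characterise algorithm behaviour.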
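Finally, for the third keynote, a sketch of what sequential model-based optimisation over text-representation choices can look like, here using scikit-optimize's gp_minimize to tune a toy bag-of-words pipeline. The search space, toy corpus, and library choice are illustrative assumptions and not the speaker's actual setup:

```python
import numpy as np
from skopt import gp_minimize
from skopt.space import Categorical, Integer, Real
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Tiny toy corpus standing in for a real topic/sentiment data set.
texts = ["great movie, loved it", "terrible plot and acting", "wonderful and moving",
         "boring, fell asleep", "an absolute delight", "worst film of the year",
         "really enjoyable", "awful, do not watch"] * 5
labels = np.array([1, 0, 1, 0, 1, 0, 1, 0] * 5)

# Representation and model choices to be tuned by the surrogate model.
space = [Categorical(["word", "char_wb"], name="analyzer"),
         Integer(1, 3, name="ngram_max"),
         Real(1e-3, 1e2, prior="log-uniform", name="C")]

def objective(params):
    analyzer, ngram_max, C = params
    model = make_pipeline(
        TfidfVectorizer(analyzer=analyzer, ngram_range=(1, int(ngram_max))),
        LogisticRegression(C=float(C), max_iter=1000))
    # Minimise the negative cross-validated accuracy.
    return -cross_val_score(model, texts, labels, cv=4).mean()

if __name__ == "__main__":
    result = gp_minimize(objective, space, n_calls=20, random_state=0)
    print("best cross-validated accuracy:", -result.fun)
    print("best configuration [analyzer, ngram_max, C]:", result.x)
```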
Important Dates
- Paper Submission Opening: 11 February 2021
- Paper Submission Deadline: 12 April 2021
- Notification of Acceptance: 26 April 2021
- Camera-Ready Copy Due: 3 May 2021
- Author Registration: 3 May 2021
- Conference Presentation: 11 July 2021
If you have questions or suggestions, please contact Dr. Kononova.
Instructions for Authors
Our workshop invites the submission of papers of at most 8 pages (excluding references), which should present original work that meets the high-quality standards of GECCO. Accepted papers appear in the ACM digital library as part of the Companion Proceedings of GECCO.
Each paper submitted to our workshop will be rigorously evaluated in a double-blind review process. Review criteria include the significance of the work, technical soundness, novelty, clarity, writing quality, relevance and, if applicable, sufficiency of information to permit replication.
Each accepted paper needs to have at least one author registered by the author registration deadline. By submitting a paper, you agree to register for and present at the conference if the paper is accepted.
Papers must be submitted via the online submission system: https://ssl.linklings.net/conferences/gecco/.
Please refer to https://gecco-2021.sigevo.org/Paper-Submission-Instructions for more detailed instructions.
Chairs
- Anna V. Kononova, LIACS, Leiden University, Leiden, The Netherlands
- Hao Wang, LIACS, Leiden University, Leiden, The Netherlands
- Thomas Weise, Institute of Applied Optimization, Hefei University, Hefei, China
- Carola Doerr, CNRS and Sorbonne University, Paris, France
- Thomas Bäck, LIACS, Leiden University, Leiden, The Netherlands
- Fabio Caraffini, De Montfort University, Leicester, UK
- Johann Dreo, Pasteur Institute and CNRS, Computational Biology Department, USR 3756, Systems Biology Group, Paris, France
International Program Committee
- Michael Affenzeller, University of Applied Sciences Upper Austria, Hagenberg, Austria
- Andreas Beham, FH Upper Austria, Hagenberg, Austria
- Arina Buzdalova, ITMO University, St. Petersburg, Russia
- Maxim Buzdalov, ITMO University, St. Petersburg, Russia
- Tome Eftimov, Jožef Stefan Institute, Ljubljana, Slovenia
- Michael Emmerich, LIACS, Leiden University, The Netherlands
- Peter Korošec, Jožef Stefan Institute, Ljubljana, Slovenia
- Timo Kötzing, Hasso Plattner Institute, Potsdam, Germany
- Martin Krejca, Sorbonne Université, LIP6, Paris, France
- Per Kristian Lehre, University of Birmingham, Birmingham, U.K.
- Johannes Lengler, ETH Zürich, Zürich, Switzerland
- Bin Li, University of Science and Technology of China, Hefei, China
- Xinlu Li, Institute of Applied Optimization, Hefei University, Hefei, China
- Katherine Malan, University of South Africa
- Stefan Menzel, Honda Research, Offenbach/Main, Germany
- Laurent Meunier, FAIR Paris & Université Paris-Dauphine, France
- Mario Andrés Muñoz, The University of Melbourne, Australia
- Pietro S. Oliveto, University of Sheffield, UK
- Patryk Orzechowski, University of Pennsylvania, Philadelphia, PA, USA
- Mike Preuss, LIACS, Leiden University, Leiden, The Netherlands
- Jonathan E. Rowe, The University of Birmingham, The Alan Turing Institute, UK
- Jörg Stork, TH Köln, Köln, Germany
- Dirk Sudholt, University of Passau, Passau, Germany
- Dirk Thierens, Utrecht University, Utrecht, The Netherlands
- Diederick Vermetten, LIACS, Leiden University, Leiden, The Netherlands
- Stefan Wagner, FH Upper Austria, Hagenberg, Austria
- Handing Wang, Xidian University, Xi'an, China
- Carsten Witt, Technical University of Denmark, Denmark
- Rongwang Yin, Institute of Applied Optimization, Hefei University, Hefei, China
- Daniela Zaharie, West University of Timișoara, Timișoara, Romania
- Christine Zarges, Aberystwyth University, Aberystwyth, Wales, United Kingdom
- Aleš Zamuda, University of Maribor, Slovenia
- Xingyi Zhang, Anhui University, Hefei, China
Chair Biographies
Dr. Anna V. Kononova is currently an Assistant Professor at the Leiden Institute of Advanced Computer Science. She received her MSc degree in Applied Mathematics from Yaroslavl State University (Russia) in 2004 and her PhD degree in Computer Science from the University of Leeds (UK) in 2010. After five years of postdoctoral experience at Eindhoven University of Technology (the Netherlands) and Heriot-Watt University (UK), Anna spent a number of years working as a mathematician in industry. Her current research interests include the analysis of optimization algorithms.
Dr. Hao Wang obtained his PhD (cum laude, promotor: Prof. Thomas Bäck) from Leiden University in 2018. He is currently an assistant professor of computer science at Leiden University. Previously, he had a research stay at Sorbonne University, France (supervised by Carola Doerr). He received the Best Paper Award at PPSN 2016 and was a best paper award finalist at IEEE SMC 2017. His research interests are in the analysis and improvement of efficient global optimization for mixed-continuous search spaces, evolution strategies, Bayesian optimization, and benchmarking.
Prof. Dr. Thomas Weise obtained his MSc in Computer Science in 2005 from Chemnitz University of Technology and his PhD from the University of Kassel in 2009. He then joined the University of Science and Technology of China (USTC) as a postdoctoral researcher and subsequently became an Associate Professor at the USTC-Birmingham Joint Research Institute in Intelligent Computation and Its Applications (UBRI) at USTC. In 2016, he joined Hefei University as a Full Professor to found the Institute of Applied Optimization (IAO) of the School of Artificial Intelligence and Big Data. Prof. Weise has more than a decade of experience as a full-time researcher in China, having contributed significantly to both fundamental and applied research. He has published more than 90 scientific papers in international peer-reviewed journals and conferences. His book "Global Optimization Algorithms – Theory and Application" has been cited more than 900 times. He is a member of the editorial board of the Applied Soft Computing journal and has acted as a reviewer, editor, or program committee member at 80 different venues.
Dr. Carola Doerr, formerly Winzen, is a permanent CNRS researcher at Sorbonne University in Paris, France. Carola's main research activities are in the mathematical analysis of randomized algorithms, with a strong focus on evolutionary algorithms and other sampling-based optimization heuristics. Carola has received several awards for her work on evolutionary computation, among them the Otto Hahn Medal of the Max Planck Society and four best paper awards at GECCO. She is an associate editor of ACM Transactions on Evolutionary Learning and Optimization, an editorial board member of the Evolutionary Computation journal, and an advisory board member of the Springer Natural Computing Book Series. She was program chair of PPSN 2020, FOGA 2019, and the theory tracks of GECCO 2015 and 2017. Carola was guest editor of two special issues in Algorithmica. She is also vice chair of the EU-funded COST Action 15140 on "Improving Applicability of Nature-Inspired Optimisation by Joining Theory and Practice (ImAppNIO)". She is a founding and coordinating member of the Benchmarking Network, an initiative created to consolidate and stimulate activities on benchmarking sampling-based optimization heuristics; she has organized several workshops on benchmarking and was co-organizer of the Open Optimization Competition 2020.
Prof. Dr. Thomas Bäck has been professor of Computer Science at the Leiden Institute of Advanced Computer Science, Leiden University, The Netherlands, since 2002. He received his PhD in Computer Science from Dortmund University, Germany, in 1994, and was leader of the Center for Applied Systems Analysis at the Informatik Centrum Dortmund until 2000. Until 2009, Thomas was also CTO of NuTech Solutions, Inc. (Charlotte, NC), where he gained ample experience in solving real-world problems in optimization and data analytics by working with global enterprises in the automotive and other industry sectors. Thomas received the IEEE Computational Intelligence Society (CIS) Evolutionary Computation Pioneer Award for his contributions in synthesizing evolutionary computation (2015), was elected as a Fellow of the International Society for Genetic and Evolutionary Computation (ISGEC) for fundamental contributions to the field (2003), and received the best dissertation award from the "Gesellschaft für Informatik" in 1995. Thomas has more than 300 publications, as well as two books on evolutionary algorithms: Evolutionary Algorithms in Theory and Practice (1996) and Contemporary Evolution Strategies (2013). He is co-editor of the Handbook of Evolutionary Computation and the Handbook of Natural Computing.
Dr. Fabio Caraffini is currently an Associate Professor in Computer Science and Mathematics at De Montfort University (UK) and a Fellow of the Higher Education Academy (UK). He received his BSc in Electronics Engineering and his MSc in Telecommunications Engineering from the University of Perugia (Italy) in 2008 and 2011, respectively. Fabio holds a PhD in Computer Science (De Montfort University, UK, 2014) and a PhD in Computational Mathematics (University of Jyväskylä, Finland, 2016). His research interests include theoretical and applied computational intelligence with a strong emphasis on metaheuristics for optimization.
Dr. Johann Dreo works at the Pasteur Institute and CNRS, in the Systems Biology Group of the Computational Biology Department (USR 3756). His scientific interests are in optimization, search heuristics, artificial intelligence, machine learning, algorithm design and engineering, automated planning, and differential geometry. He has more than 13 years of experience in applying randomized optimization heuristics in practice.
Hosting Event
The 2021 Genetic and Evolutionary Computation Conference (GECCO'21), Lille, France, July 10-14, 2021.
The Genetic and Evolutionary Computation Conference (GECCO) has presented the latest high-quality results in genetic and evolutionary computation since 1999. Topics include: genetic algorithms, genetic programming, ant colony optimization and swarm intelligence, complex systems (artificial life, robotics, evolvable hardware, generative and developmental systems, artificial immune systems), digital entertainment technologies and arts, evolutionary combinatorial optimization and metaheuristics, evolutionary machine learning, evolutionary multiobjective optimization, evolutionary numerical optimization, real-world applications, search-based software engineering, theory, and more.
Current Related Events
Past Related Events
- Special Issue on Benchmarking of Computational Intelligence Algorithms in the Applied Soft Computing Journal (2018-2020, Completed)
- 2020 Good Benchmarking Practices for Evolutionary Computation (BENCHMARK@PPSN'20) Workshop
- 2020 Good Benchmarking Practices for Evolutionary Computation (BENCHMARK@GECCO'20) Workshop
- 2019 Black Box Discrete Optimization Benchmarking (BB-DOB'19) Workshop
- 2019 Special Session on Benchmarking of Evolutionary Algorithms for Discrete Optimization (BEADO'19)
- 2018 Black-Box Discrete Optimization Benchmarking (BB-DOB@PPSN'18) Workshop
- 2018 Black-Box Discrete Optimization Benchmarking (BB-DOB@GECCO'18) Workshop
- 2018 International Workshop on Benchmarking of Computational Intelligence Algorithms (BOCIA'18)