
Our friends from Nevergrad and IOHprofiler are organizing the Open Optimization Competition 2020 and welcome contributions.

Nevergrad is an open-source platform for derivative-free optimization developed by Facebook Artificial Intelligence Research, Paris, France. It contains a wide range of optimization algorithm implementations and test cases, supports multi-objective optimization, and handles constraints. It automatically updates the results of all experiments merged into the code base, so users do not need their own computational power to participate and obtain results.

IOHprofiler is a tool for benchmarking iterative optimization heuristics such as local search variants, evolutionary algorithms, model-based algorithms, and other sequential optimization techniques. It has two components: the IOHexperimenter for running empirical evaluations and the IOHanalyzer for the statistical analysis and visualization of the experiment data. It is mainly developed by teams at Leiden University in the Netherlands, Sorbonne University and CNRS in Paris, France, and Tel-Hai College in Israel.
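To give a flavor of the kind of iterative optimization heuristic these platforms benchmark, here is a minimal (1+1) evolution strategy with a 1/5th-success-rule step-size adaptation, written in plain Python. This is an illustrative sketch only; the function name and parameters are our own and are not part of the Nevergrad or IOHprofiler APIs.

```python
import random

def one_plus_one_es(f, x0, budget=200, sigma=0.5, seed=0):
    """Minimal (1+1) evolution strategy for minimizing f.

    f: objective function taking a list of floats,
    x0: starting point, budget: number of evaluations,
    sigma: initial step size, adapted via a 1/5th success rule.
    """
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(budget):
        # sample one Gaussian offspring around the current point
        y = [xi + rng.gauss(0.0, sigma) for xi in x]
        fy = f(y)
        if fy <= fx:
            # success: accept the offspring and widen the search
            x, fx = y, fy
            sigma *= 1.5
        else:
            # failure: shrink the step size
            sigma *= 1.5 ** -0.25
    return x, fx

# usage: minimize the 3-dimensional sphere function
sphere = lambda v: sum(t * t for t in v)
best, value = one_plus_one_es(sphere, [1.0, 1.0, 1.0])
```

Benchmarking tools such as IOHexperimenter wrap heuristics of exactly this shape, logging each of the `budget` function evaluations to compare convergence behavior across algorithms and problems.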

This competition is part of the wider aim to build open-source, user-friendly, and community-driven platforms for comparing different optimization techniques. The key principles are reproducibility, open source, and ease of access. While some first steps towards such platforms have been taken, the tools can greatly benefit from the contributions of the various communities for whom they are built. Hence, the goal of this competition is to solicit such contributions from the community.

1. Areas of Interest

In the competition, all kinds of submissions are welcome, including, but not limited to:

  • Performance: Improvements for any of the following performance tracks (classic competition, with existing baselines that are meant to be outperformed)
    • one-shot optimization,
    • low budget optimization,
    • multi-objective optimization,
    • discrete optimization,
    • structured optimization,
    • constrained optimization.
    We explicitly allow portfolio-based approaches, such as landscape-aware algorithm selection techniques.
  • New ideas & others: contributions to the experimental routines, including
    • suggestions for performance measures and statistics,
    • good and new benchmarking problems (structured optimization, new classes of problems, real world problems),
    • modular algorithm frameworks,
    • visualization of data,
    • criteria in benchmarking, e.g. robustness criteria over large families of problems,
    • cross-validation issues in optimization benchmarking,
    • reproducibility,
    • statistics,
  • Software contributions, including
    • distribution over clusters or grids,
    • parallelization,
    • software contribution in general,
    • mathematically justified improvements.
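One of the performance tracks above, one-shot optimization, deserves a word of explanation: all candidate points are chosen up front, each is evaluated exactly once, and the best is returned, with no adaptation between evaluations. A minimal sketch in plain Python (the function name and parameters are illustrative assumptions, not part of either tool's API):

```python
import random

def one_shot_minimize(f, dim, budget=100, low=-1.0, high=1.0, seed=0):
    """One-shot optimization: sample all candidates in advance,
    evaluate each exactly once, and return the best point found."""
    rng = random.Random(seed)
    best_x, best_fx = None, float("inf")
    for _ in range(budget):
        # uniform random sampling; real one-shot methods often use
        # quasi-random (low-discrepancy) point sets instead
        x = [rng.uniform(low, high) for _ in range(dim)]
        fx = f(x)
        if fx < best_fx:
            best_x, best_fx = x, fx
    return best_x, best_fx

# usage: best of 100 uniform samples on the 2-dimensional sphere
best, value = one_shot_minimize(lambda v: sum(t * t for t in v), dim=2)
```

Improving on such baselines, e.g. through better sampling designs, is exactly the kind of contribution the one-shot track solicits.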

2. Awards

Up to 12,000 euros in awards, to be used for traveling to PPSN or GECCO 2020 or 2021, will be distributed over several winners. In addition, a limited number of registration fee waivers are available for PPSN 2020. The awards are split into two tracks:

  • Performance: making algorithms better or designing better algorithms; and
  • Contributions: everything people can think of, which makes the platform better for users and for science.

3. Submission Process

All submissions are made as pull requests directly to the repository of either one of the two tools.

All supporting material should be uploaded together with the pull request. Links to arXiv papers etc. are possible and welcome, but by no means mandatory. Keep in mind that a good description of your contribution increases the chance that jury members will understand and value it. All pull requests not yet merged on December 1, 2019 and opened before June 1, 2020 are eligible for the competition. The key principles are:

  • Open source: no matter how good the results are, if they cannot be reproduced or the code cannot be checked, the submission is out of scope.
  • Reproducibility: if the code cannot be run easily, it is likewise out of scope.

4. Contact and Competition Page

In case of questions, please do not hesitate to contact the organizers of the competition. Please send all inquiries to Carola and Olivier, who will coordinate your request. The competition page is https://github.com/facebookresearch/nevergrad/blob/master/docs/opencompetition2020.md.

5. Organizers

  • Thomas Bäck (Leiden University, The Netherlands),
  • Carola Doerr (CNRS, Sorbonne Université, Paris, France),
  • Antoine Moreau (Université Clermont Auvergne, France),
  • Jeremy Rapin (Facebook Artificial Intelligence Research, Paris, France),
  • Baptiste Roziere (Facebook Artificial Intelligence Research, Paris, France),
  • Ofer M. Shir (Tel-Hai College and Migal Institute, Israel), and
  • Olivier Teytaud (Facebook Artificial Intelligence Research, Paris, France).

6. Award Committee Members

  • Enrique Alba (University of Málaga, Spain),
  • Maxim Buzdalov (ITMO University, Russia),
  • Josu Ceberio (University of the Basque Country, Spain),
  • Benjamin Doerr (Ecole Polytechnique, France),
  • Tobias Glasmachers (Ruhr-Universität Bochum, Germany),
  • Manuel Lopez-Ibanez (University of Manchester, UK),
  • Katherine Mary Malan (University of South Africa),
  • Luís Paquete (University of Coimbra, Portugal),
  • Jan van Rijn (Leiden University, The Netherlands),
  • Marc Schoenauer (Inria Saclay, France) and
  • Thomas Weise (Hefei University, China).