Yesterday, we published a new course titled "Metaheuristic Optimization". On the course website, you can find all slides and example algorithm implementations, and you can download the complete course material as a single tar.xz archive (which can be opened under both Windows and Linux) from here. As the third course our institute has published in full, after the courses on "Object-Oriented Programming with Java" and "Distributed Computing", this marks another milestone in our pursuit of teaching excellence.

The new course aims to give a comprehensive and yet in-depth overview of optimization from the perspective of metaheuristics, i.e., approximate algorithms that can find good solutions for computationally hard problems within a short time. The course is designed to require little background knowledge. Many of the presented algorithms can be implemented live in Java by the instructor within a few minutes during the course. This shows that the algorithms we discuss are not scary and can be mastered even with basic programming knowledge. It also closes the gap between research and practice – after all, we are the Institute of Applied Optimization.
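
To give a flavor of how compact such a live implementation can be, here is a minimal sketch of a simple hill climber on bit strings, using the classical OneMax function as a stand-in objective. The class and method names, parameter values, and the choice of OneMax are illustrative assumptions and not taken from the course material.

    import java.util.Random;

    /** A minimal (1+1) hill climber on bit strings; for illustration only. */
    public class HillClimberSketch {

      /** Example objective: count the 'true' bits (to be maximized). */
      static int oneMax(boolean[] x) {
        int sum = 0;
        for (boolean bit : x) {
          if (bit) {
            sum++;
          }
        }
        return sum;
      }

      public static void main(String[] args) {
        final int n = 100;           // bit-string length (hypothetical)
        final int maxSteps = 10_000; // budget of objective function evaluations
        final Random random = new Random(42L);

        // start with a random solution
        final boolean[] current = new boolean[n];
        for (int i = 0; i < n; i++) {
          current[i] = random.nextBoolean();
        }
        int currentValue = oneMax(current);

        // flip one random bit per step; keep the change unless it worsens the solution
        for (int step = 0; step < maxSteps; step++) {
          final int flip = random.nextInt(n);
          current[flip] = !current[flip];
          final int newValue = oneMax(current);
          if (newValue >= currentValue) {
            currentValue = newValue;        // accept equal or better solutions
          } else {
            current[flip] = !current[flip]; // undo the worsening move
          }
        }
        System.out.println("best objective value found: " + currentValue);
      }
    }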

In the course, we discuss a broad spectrum of optimization methods, ranging from local search algorithms such as hill climbers, Simulated Annealing, and Tabu Search to global search methods such as Evolutionary Algorithms, Evolution Strategies, Genetic Programming, Differential Evolution, Particle Swarm Optimization, Ant Colony Optimization, and Estimation of Distribution Algorithms. We also discuss several phenomena that make problems difficult for these algorithms, general concepts such as multi-objective and constrained optimization, and several example applications. All in all, this course aims to give the student the knowledge to recognize an optimization problem when she sees it, the ability to choose the right algorithm for the right problem, and the practical experience to implement and apply that algorithm in a short time.
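
One example from the local-search part: the core of Simulated Annealing is its acceptance rule, which sometimes accepts worse solutions in order to escape local optima. The sketch below shows this rule for a minimization problem together with an exponential cooling schedule; the parameter values and the schedule are illustrative assumptions, not the exact settings discussed in the slides.

    import java.util.Random;

    /** Sketch of the Simulated Annealing acceptance rule (minimization); illustration only. */
    public class AnnealingAcceptanceSketch {

      /**
       * Accept a candidate of quality fNew when the current solution has quality fCurrent
       * and the temperature is t: improvements are always accepted, worse candidates with
       * probability exp(-(fNew - fCurrent) / t), the classical Metropolis criterion.
       */
      static boolean accept(double fCurrent, double fNew, double t, Random random) {
        if (fNew <= fCurrent) {
          return true;
        }
        return random.nextDouble() < Math.exp((fCurrent - fNew) / t);
      }

      /** Example exponential cooling schedule: temperature(step) = t0 * epsilon^step. */
      static double temperature(double t0, double epsilon, long step) {
        return t0 * Math.pow(epsilon, step);
      }

      public static void main(String[] args) {
        final Random random = new Random(123L);
        final double t0 = 10.0, epsilon = 0.98; // hypothetical parameter values
        final double fCurrent = 100.0;          // quality of an imaginary current solution
        final double fNew = 101.0;              // an imaginary, slightly worse candidate
        for (long step = 0; step <= 200; step += 50) {
          final double t = temperature(t0, epsilon, step);
          System.out.printf("step %d, temperature %.4f, accept worse candidate? %b%n",
              step, t, accept(fCurrent, fNew, t, random));
        }
      }
    }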

Benchmarking, i.e., the empirical comparison of algorithm performance, is usually the only feasible way to find out which algorithm is good for a given problem. Benchmarking consists of two steps: First, the algorithms are applied to the benchmark problems and data is collected. Second, the collected data is evaluated. There is little guidance for the first step and a lack of tool support for the second. Researchers investigating new problems therefore need to implement both the data collection and the evaluation themselves. In our new paper "From Standardized Data Formats to Standardized Tools for Optimization Algorithm Benchmarking," we make the case for defining standard directory structures and file formats for the performance data and metadata of experiments with optimization algorithms. Such formats must be easy to read, easy to write, and easy to incorporate into existing setups. If commonly accepted formats existed and researchers actually used them, more general tools could emerge. There would be a real incentive for everyone who builds an evaluation tool to support the common format right away and then, maybe, to publish their tool for others to use. Researchers would then no longer need to implement their own evaluation programs. We derive suitable formats by analyzing what existing tools do and what information they need. We then present a general tool, our optimizationBenchmarking.org framework, including an open-source library for reading and writing data in our format. Since our framework obtains its data from a general file format, it can assess the performance of arbitrary algorithms implemented in arbitrary programming languages on arbitrary single-objective optimization problems.
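
The paper itself defines the actual directory structure and file format. Purely to illustrate the kind of data involved, the hypothetical sketch below writes one log point per line (consumed function evaluations and best objective value so far) into a plain text file under a per-algorithm, per-instance directory; the layout, column order, and file names here are assumptions for illustration, not the format specified in the paper.

    import java.io.IOException;
    import java.io.PrintWriter;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    /** Hypothetical sketch: write a simple per-run performance log as a text file. */
    public class LogWriterSketch {

      public static void main(String[] args) throws IOException {
        // hypothetical layout: results/<algorithm>/<instance>/run_<id>.txt
        final Path dir = Paths.get("results", "myAlgorithm", "myInstance");
        Files.createDirectories(dir);

        try (PrintWriter out = new PrintWriter(
            Files.newBufferedWriter(dir.resolve("run_01.txt")))) {
          out.println("# function evaluations\tbest objective value"); // header comment
          // in a real experiment, these log points would be recorded during the run
          final long[] evaluations = {1L, 10L, 100L, 1_000L};
          final double[] bestValues = {5000.0, 3200.0, 2100.0, 1985.0};
          for (int i = 0; i < evaluations.length; i++) {
            out.println(evaluations[i] + "\t" + bestValues[i]);
          }
        }
      }
    }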

We have two accepted papers at the Genetic and Evolutionary Computation Conference (GECCO'17), which will take place from July 15 to 19, 2017 in Berlin, Germany:

  • Weichen Liu, Thomas Weise, Yuezhong Wu, and Qi Qi. Combining Two Local Searches with Crossover: An Efficient Hybrid Algorithm for the Traveling Salesman Problem. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO'17), July 15-19, 2017, Berlin, Germany. New York, NY, USA: ACM Press, pages 298-305, ISBN: 978-1-4503-4920-8.
    doi:10.1145/3071178.3071201 / paper / slides
  • Qi Qi, Thomas Weise, and Bin Li. Modeling Optimization Algorithm Runtime Behavior and its Applications. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO'17) Companion, July 15-19, 2017, Berlin, Germany. New York, NY, USA: ACM Press, pages 115-116, ISBN: 978-1-4503-4939-0.
    doi:10.1145/3067695.3076042 / paper / poster

GECCO is the prime event for research in Evolutionary Computation and is organized yearly by SIGEVO, the ACM Special Interest Group on Genetic and Evolutionary Computation. We are looking forward to meeting you at the conference ^_^

The Institute of Applied Optimization welcomes Ms. Bing Zhou [周冰], who has joined us today as administrative secretary [行政秘书]. She previously studied English here at Hefei University [合肥学院]. She has taken part very successfully in English-language competitions and has gained experience through several voluntary activities. She will not only support us as a translator and secretary, but also actively contribute to the administration and management of the institute.

We are looking forward to working together.

Today, we released the new version 0.8.9 of our optimizationBenchmarking.org software framework for automating significant parts of the research work in the fields of optimization, machine learning, and operations research. This new version comes much closer to our goal of reducing the workload of researchers and practitioners during algorithm development so that they can spend more time on thinking.

Besides all the functionality offered by the previous releases, it introduces a new process for obtaining high-level conclusions about problem hardness and algorithm behaviors. This process takes the raw data from experiments together with meta-information about algorithm setups and problem instances as input. It applies a sequence of machine learning techniques, namely curve fitting, clustering, and classification, to find which features make a problem instance hard and which algorithm setup parameters cause which algorithm behavior. We have just submitted an article about this new process for review.
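
As a toy illustration of the curve-fitting step only (the actual models, fitting algorithms, clustering, and classification used by the framework are described in the submitted article), the sketch below fits a simple two-parameter decay model to hypothetical (runtime, best quality) log points via a coarse grid search; all data values and parameter ranges are made up.

    /** Toy sketch: fit quality(t) = a * exp(-b * t) to log points via a coarse grid search. */
    public class CurveFitSketch {

      public static void main(String[] args) {
        // hypothetical log points of one run: consumed FEs vs. best objective value so far
        final double[] time = {1, 10, 100, 1_000, 10_000};
        final double[] quality = {900, 500, 220, 90, 40};

        double bestA = 0, bestB = 0, bestError = Double.POSITIVE_INFINITY;
        // coarse grid search over the two model parameters; a real fitter would be smarter
        for (double a = 100; a <= 2_000; a += 10) {
          for (double b = 1e-4; b <= 1e-2; b += 1e-4) {
            double error = 0;
            for (int i = 0; i < time.length; i++) {
              final double diff = (a * Math.exp(-b * time[i])) - quality[i];
              error += diff * diff; // sum of squared residuals
            }
            if (error < bestError) {
              bestError = error;
              bestA = a;
              bestB = b;
            }
          }
        }
        System.out.printf("fitted model: %.0f * exp(-%.5f * t), squared error %.1f%n",
            bestA, bestB, bestError);
      }
    }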

Our software provides a set of very general tools for algorithm performance analysis (e.g., plotting runtime/quality and ECDF charts) as well as our new process. Since it takes data in the form of text files, it can analyze the results of any optimization algorithm implemented in any programming language and applied to any optimization problem. It produces human-readable reports either as LaTeX/PDF documents or as XHTML.
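
For readers not familiar with ECDF charts: the empirical cumulative distribution function here gives, for each point in time (e.g., consumed function evaluations), the fraction of runs that have already reached a chosen target quality. A minimal sketch of this computation on hypothetical data could look as follows; the data values and variable names are made up for illustration.

    import java.util.Arrays;

    /** Minimal sketch: compute ECDF points, i.e., the fraction of runs hitting a target by time t. */
    public class EcdfSketch {

      public static void main(String[] args) {
        // hypothetical data: for each run, the time at which it first reached the target
        // quality; Long.MAX_VALUE marks runs that never reached it within their budget
        final long[] hitTimes = {120L, 450L, 90L, Long.MAX_VALUE, 300L, 75L};
        final long[] timeAxis = {50L, 100L, 200L, 400L, 800L};

        final long[] sorted = hitTimes.clone();
        Arrays.sort(sorted);

        for (final long t : timeAxis) {
          int successes = 0;
          for (final long hit : sorted) {
            if (hit <= t) {
              successes++;
            } else {
              break; // sorted, so later entries cannot be <= t either
            }
          }
          System.out.printf("t=%d: ECDF=%.2f%n", t, successes / (double) hitTimes.length);
        }
      }
    }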

The software provides a user-friendly, web-based GUI which runs either on your local machine or on a server in your lab. The software comes in three flavors:

  1. as a Java executable, which requires that several tools are installed (Java, R with several packages, and a LaTeX installation),
  2. as a Docker image, which only requires an installation of Docker. It can be started directly under Linux, Windows, and Mac OS with the single command docker run -t -i -p 9999:8080/tcp optimizationbenchmarking/evaluator-gui and is then used by browsing to http://localhost:9999 (on first start, the image is downloaded automatically), and
  3. as a command-line program without GUI for integration into other software environments (with the same installation requirements as the GUI flavor).
