Whenever we face a situation with multiple choices, we want to pick the best one. This is true in our daily lives, but also in many scenarios in industry, management, planning, design, engineering, medical services, and logistics. Actually, any question asking for a superlative (fastest, cheapest, strongest, most valuable, ...) is an optimization problem. In this course, we want to discuss the metaheuristic way of solving these problems.

Metaheuristics are an approach to solving hard problems. A problem is hard if finding the best possible solution for it cannot always be done within feasible time. More scientifically speaking: the worst-case runtime of the best known exact algorithms for hard problems grows exponentially with the number of decision variables, which can easily lead to billions of years for larger problem instances.
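A rough back-of-the-envelope calculation illustrates this. The small Java sketch below (not part of the course material) assumes a hypothetical machine that can check 10^9 candidate solutions per second and estimates how long exhaustively enumerating all 2^n assignments of n binary decision variables would take:

```java
/** Back-of-the-envelope estimate: brute force over n binary decision variables. */
public class BruteForceTime {
  public static void main(String[] args) {
    final double evalsPerSecond = 1e9;                // assumption: 10^9 evaluations per second
    final double secondsPerYear = 365.25 * 24 * 3600; // seconds in one year
    for (int n : new int[] {30, 50, 70, 100}) {
      double candidates = Math.pow(2, n);             // 2^n possible solutions
      double years = candidates / evalsPerSecond / secondsPerYear;
      System.out.printf("n = %3d: %.2e candidate solutions, about %.2e years%n",
          n, candidates, years);
    }
  }
}
```

Already at n = 100, this yields on the order of 10^13 years, vastly more than the age of the universe.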

So how can we solve such problems?

Well, finding one solution to a problem is almost always very easy and can be done extremely fast; finding the best possible solution is what takes very long (see also here). Optimization algorithms bridge this gap: they trade solution quality for runtime by finding very good (but not necessarily optimal) solutions within feasible time. We explore state-of-the-art optimization methods, ranging from local searches through evolutionary computation methods and memetic algorithms to estimation of distribution algorithms. We learn that these algorithms are actually easy to understand and to program: the teacher implements many of them live in Java during the lecture, right after describing their basic principles.
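To give a concrete impression of how compact such algorithms can be, here is a minimal sketch of a hill climber (the local search from unit 5) on bit strings. It is only an illustration, not the course's own implementation, and uses the simple OneMax objective (counting the 1-bits, to be maximized) as a stand-in for a real problem:

```java
import java.util.Random;

/** Minimal hill climber on bit strings; OneMax serves as a placeholder objective. */
public class HillClimber {

  /** Placeholder objective: the number of 1-bits in x (to be maximized). */
  static int objective(boolean[] x) {
    int sum = 0;
    for (boolean b : x) { if (b) { sum++; } }
    return sum;
  }

  public static void main(String[] args) {
    Random rnd = new Random();
    int n = 50;             // number of decision variables (bits)
    int maxSteps = 10_000;  // computational budget

    boolean[] current = new boolean[n];            // start from a random solution
    for (int i = 0; i < n; i++) { current[i] = rnd.nextBoolean(); }
    int currentValue = objective(current);

    for (int step = 0; step < maxSteps; step++) {
      int i = rnd.nextInt(n);                      // neighbor: flip one randomly chosen bit
      current[i] = !current[i];
      int newValue = objective(current);
      if (newValue >= currentValue) {
        currentValue = newValue;                   // accept the neighbor if it is not worse
      } else {
        current[i] = !current[i];                  // otherwise undo the flip
      }
    }

    System.out.println("Best objective value found: " + currentValue + " of " + n);
  }
}
```

The same basic loop (create a modified copy of a solution, evaluate it, keep it if it is good enough) reappears, with different acceptance rules, in simulated annealing, tabu search, and the population-based methods covered later in the course.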

After the course, the students will have a solid practical understanding of optimization. They will be able to recognize problems where “traditional” techniques fail (e.g., run far too long) and will know how to find good solutions for them within feasible time. This lecture also improves the students’ ability to write programs and shows them that formally specified algorithms can often be translated to code in an easy, non-scary way.

Prerequisites: Java Programming (e.g., learned in our course Object-Oriented Programming with Java)

Goals: We will

  1. learn about the basic principles of metaheuristic optimization,
  2. learn about several of the most prominent families of algorithms in the field,
  3. implement many of these algorithms, and thus
  4. become able to quickly design optimization software prototypes for specific applications.

Teacher: Prof. Dr. Thomas Weise

Course Material

As course material, a comprehensive set of slides and examples is provided. Each course unit targets one self-contained topic, builds only on previously introduced topics, and provides examples and algorithm implementations in Java.

The complete source code of all example algorithm implementations and example problems can be found in this tar.xz archive (sources.tar.xz).

A tar.xz archive with all the teaching material (slides, examples) of this course can be found here.

  1. Introduction
  2. The Structure of Optimization
  3. Exhaustively Enumerating All Solutions
  4. Random Sampling from the Space of Solutions
  5. Hill Climbing: A Local Search
  6. Random Walks
  7. Simulated Annealing
  8. Tabu Search and Iterated Local Search
  9. Comparing Optimization Algorithms
  10. Genetic (and Evolutionary) Algorithms
  11. Difficulties in Optimization
  12. Evolution Strategies
  13. Differential Evolution
  14. Genetic Programming
  15. Multi-Objective Optimization
  16. Constraint Handling
  17. Particle Swarm Optimization
  18. Ant Colony Optimization
  19. Estimation of Distribution Algorithms
  20. Memetic Algorithms
  21. Representations in Optimization and an Example from Logistics
  22. More Application Examples
  23. Linear Programming

To complement this course, you can read the following additional posts:

  1. r/CompIntellCourses/: a subreddit with educational material on Computational Intelligence, managed by the University Curricula Committee of the IEEE Computational Intelligence Society (IEEE CIS), which also lists our course here.
  2. What is Optimization?
  3. Why research in Computational Intelligence should be less nature-inspired.
  4. Why you should use Evolutionary Algorithms to solve your optimization problems (and why not).
  5. Are Black-Box Global Search Methods, such as Metaheuristics like Evolutionary Algorithms, better than Local Search?
  6. Algorithm Synthesis: Deep Learning and Genetic Programming
  7. Intelligent Production and Logistics: The Viewpoint of Optimization
  8. It should be noted that optimization algorithms are one of the key enabling technologies for currently trending concepts such as Made in China 2025 [中国制造2025] and Industry 4.0, as discussed here.
  9. The in.west Project