Whenever we face a situation with multiple choices, we want to pick the best one. This is true in our daily life, but also in many scenarios in industry, management, planning, design, engineering, medical services, and logistics. Actually, any question asking for a superlative (fastest, cheapest, strongest, most valuable, ...) is an optimization problem. In this course, we discuss the metaheuristic way of solving these problems.
Metaheuristics are an approach for solving hard problems. A problem is hard if finding the best possible solution for it cannot always be done within feasible time. More scientifically speaking: the runtime of the best known exact algorithms for hard problems grows, in the worst case, exponentially with the number of decision variables, which can easily lead to billions of years for larger instances.
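To make this concrete, here is a small back-of-the-envelope calculation (a sketch; the rate of one billion candidate evaluations per second is an assumed, optimistic machine speed): even at that speed, exhaustively checking all 2^n bit strings of length n quickly becomes hopeless.

```java
public class ExponentialGrowth {
  public static void main(String[] args) {
    final double evalsPerSecond = 1e9; // assumption: 10^9 evaluations per second
    final double secondsPerYear = 365.25 * 24 * 3600;
    // for n binary decision variables, there are 2^n candidate solutions
    for (int n : new int[] {20, 40, 60, 80, 100}) {
      double candidates = Math.pow(2, n);
      double years = candidates / evalsPerSecond / secondsPerYear;
      System.out.printf("n=%3d: 2^n = %.2e candidates, ~%.2e years%n",
          n, candidates, years);
    }
  }
}
```

Already at n = 100, the enumeration would take on the order of 10^13 years, far longer than the age of the universe.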
So how can we solve such problems?
Well, finding some solution to a problem is almost always easy and can be done extremely fast; finding the best possible solution is what takes very long, as we discussed above. Optimization algorithms bridge this gap: they trade solution quality for runtime by finding very good (but not necessarily optimal) solutions within feasible time. We explore state-of-the-art optimization methods ranging from local searches over evolutionary computation methods and memetic algorithms to estimation of distribution algorithms. We learn that these algorithms are actually easy to understand and to program: many of them are implemented live in Java by the teacher during the lecture, after their basic principle has been described.
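As a taste of how simple such algorithms can be, here is a minimal hill climber (a local search) on a toy bit-string problem, maximizing the number of 1-bits. This is an illustrative sketch only, not code from the course material; the problem, step budget, and random seed are arbitrary choices made for this example.

```java
import java.util.Random;

public class HillClimber {
  // toy objective (to be maximized): count the 1-bits in the string
  static int objective(boolean[] x) {
    int sum = 0;
    for (boolean b : x) if (b) sum++;
    return sum;
  }

  public static void main(String[] args) {
    Random rnd = new Random(123);           // fixed seed for reproducibility
    boolean[] best = new boolean[32];       // start from a random bit string
    for (int i = 0; i < best.length; i++) best[i] = rnd.nextBoolean();
    int bestF = objective(best);

    // fixed step budget: we trade solution quality for bounded runtime
    for (int step = 0; step < 10_000; step++) {
      boolean[] next = best.clone();
      int i = rnd.nextInt(next.length);
      next[i] = !next[i];                   // move: flip one random bit
      int nextF = objective(next);
      if (nextF >= bestF) {                 // accept the move if it is not worse
        best = next;
        bestF = nextF;
      }
    }
    System.out.println("best objective value: " + bestF + " of " + best.length);
  }
}
```

The whole algorithm is a handful of lines: keep a current solution, modify a copy of it, and accept the copy if it is not worse. Most of the methods in this course refine exactly this loop.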
After the course, the students will have a solid practical understanding of optimization. They will be able to recognize problems where “traditional” techniques fail (run too long) and know how to find good solutions for them within feasible time. This lecture also improves the students’ ability to program and shows them that formally specified algorithms can often be translated to code in an easy, non-scary way.
Goals: We will
- learn about the basic principles of metaheuristic optimization,
- learn about several of the most prominent families of algorithms in the field,
- implement many of these algorithms, and thus
- become able to quickly design optimization software prototypes for specific applications.
Teacher: Prof. Dr. Thomas Weise
As course material, a comprehensive set of slides and examples is provided. Each course unit covers one self-contained topic, builds only on previously introduced topics, and provides examples and algorithm implementations in Java.
- The Structure of Optimization
- Exhaustively Enumerating All Solutions
- Random Sampling from the Space of Solutions
- Hill Climbing: A Local Search
- Random Walks
- Simulated Annealing
- Tabu Search and Iterated Local Search
- Comparing Optimization Algorithms
- Genetic (and Evolutionary) Algorithms
- Difficulties in Optimization
- Evolution Strategies
- Differential Evolution
- Genetic Programming
- Multi-Objective Optimization
- Constraint Handling
- Particle Swarm Optimization
- Ant Colony Optimization
- Estimation of Distribution Algorithms
- Memetic Algorithms
- Representations in Optimization and an Example from Logistics
- More Application Examples
To complement this course, you can read the following additional posts:
- What is Optimization?
- Why research in Computational Intelligence should be less nature-inspired.
- Why you should use Evolutionary Algorithms to solve your optimization problems (and why not).
- Are Black-Box Global Search Methods, such as Metaheuristics like Evolutionary Algorithms, better than Local Search?
- Algorithm Synthesis: Deep Learning and Genetic Programming
- Intelligent Production and Logistics: The Viewpoint of Optimization
- The in.west Project