The fourth Black-Box Optimization Benchmarking workshop took place in Amsterdam on July 06, 2013, as part of GECCO 2013.
Benchmarking optimization algorithms is crucial for assessing the performance of optimizers quantitatively and for understanding the strengths and weaknesses of each algorithm, and it is an indispensable step in evaluating new algorithm designs. However, this task turns out to be tedious and difficult to realize even in the single-objective case—at least if one is willing to accomplish it in a scientifically decent and rigorous way. The BBOB 2013 workshop for real-parameter optimization, a follow-up of the previous editions in 2009, 2010, and 2012, takes care of most of these tedious tasks for its participants:
For this new edition, we provide essentially the same test suite as in previous years, for which the data of all previously benchmarked algorithms are made available on this web page, and we welcome all high-quality contributions on benchmarking and comparing general numerical optimizers for black-box problems. In addition, we would like to place further emphasis on expensive optimization and to collect data from optimizers especially tailored to problems where only a small budget of function evaluations is affordable.
Source code for running experiments in different languages (C, Matlab, Java, R, Python) and for postprocessing the data will be provided as in previous years. What remains for the participants is to allocate CPU time, run the black-box real-parameter optimizer(s) of their interest in different dimensions a few hundred times, and finally start the post-processing procedure. Two testbeds are provided, for which we distinguish between an expensive optimization scenario (with a focus on the first 100·D function evaluations, where D is the problem dimension) and a general scenario compatible with the comparison plots of the previous years.
The participants can freely choose any or all of these scenarios. This new edition entirely bans different parameter settings for different test functions and encourages analyses that study the impact of changes to parameter settings.
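The experiment loop that the framework automates can be sketched as follows. This is a pure-Python illustration, not the actual BBOB interface: the `sphere` test function, the restart-free loop, the search domain [-5, 5]^D, and the target precision are all assumptions chosen for simplicity; the budget of 100·D evaluations matches the expensive scenario described above.

```python
import random

def sphere(x):
    """Toy objective standing in for a BBOB test function (assumption)."""
    return sum(xi * xi for xi in x)

def random_search(fun, dim, budget, target=1e-8):
    """Minimal baseline optimizer: uniform random search in [-5, 5]^dim
    with a fixed evaluation budget, stopping early if the (assumed)
    target precision is reached."""
    best_x, best_f = None, float("inf")
    for _ in range(budget):
        x = [random.uniform(-5.0, 5.0) for _ in range(dim)]
        f = fun(x)
        if f < best_f:
            best_x, best_f = x, f
        if best_f < target:
            break
    return best_x, best_f

# Expensive scenario: a budget of 100 * D evaluations per run,
# repeated over several dimensions as in a benchmarking campaign.
for dim in (2, 5, 10):
    _, best_f = random_search(sphere, dim, budget=100 * dim)
    print(dim, best_f)
```

A real BBOB experiment additionally loops over test functions and instances and logs every evaluation for the post-processing; the provided source code handles that bookkeeping.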
During the workshop, algorithms and results will be presented by the participants. An overall analysis and comparison will be accomplished by the organizers, and the overall process will be critically reviewed. A plenary discussion on future improvements will, among other topics, address the question of how the testbed should evolve.
The source code of the test functions is available in Matlab, C, Java, R, and Python. Downloads will be made available at BBOB 2013 downloads.
We encourage any submission concerned with black-box optimization benchmarking of continuous optimizers, for example benchmarking new or not-so-new algorithms on the BBOB-2013 testbed (algorithms that have not been tested in BBOB-2009, BBOB-2010, or BBOB-2012), analyzing the data obtained in BBOB-2009/BBOB-2010/BBOB-2012, or… This year, we also focus on benchmarking optimization algorithms for expensive optimization, which especially invites benchmarking surrogate-assisted algorithms (e.g., based on kriging or support vector machines).
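The surrogate-assisted idea mentioned above can be sketched in a few lines. This is only an illustration under simplifying assumptions: a separable quadratic least-squares model stands in for kriging or a support-vector surrogate, `expensive_f` is a hypothetical stand-in for an expensive black-box objective, and the candidate-screening rule is the simplest possible one (no uncertainty handling).

```python
import numpy as np

def expensive_f(x):
    """Stand-in for an expensive black-box objective (assumption)."""
    return float(np.sum((x - 0.3) ** 2))

def quadratic_features(X):
    """Features 1, x_i, x_i^2 of a separable quadratic surrogate model."""
    return np.hstack([np.ones((len(X), 1)), X, X ** 2])

rng = np.random.default_rng(0)
dim, n_init, n_iter = 2, 10, 15

# 1) Spend part of the budget on an initial design, evaluated
#    with the true (expensive) function.
X = rng.uniform(-5.0, 5.0, (n_init, dim))
y = np.array([expensive_f(x) for x in X])

for _ in range(n_iter):
    # 2) Fit the surrogate to all evaluations so far by least squares.
    coef, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)
    # 3) Screen many candidates cheaply on the surrogate ...
    cand = rng.uniform(-5.0, 5.0, (1000, dim))
    pred = quadratic_features(cand) @ coef
    x_new = cand[np.argmin(pred)]
    # 4) ... and spend one true evaluation on the most promising one.
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_f(x_new))

print(y.min())
```

The point of the expensive scenario is exactly this trade: many cheap surrogate evaluations buy a better use of the few true evaluations, which is why only the first 100·D evaluations are considered.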
Several templates that comply with the ACM guidelines will be provided:
Experimental setup documents, code for the benchmark functions (Matlab, C, Java, R, Python), and code for the post-processing (Python) are provided at the download page. To be notified about the release of the code, subscribe to the feed http://coco.gforge.inria.fr/feed.php and/or subscribe to the announcement list by sending an email to bbob _at_ lri.fr
In order to prepare linking an optimizer to the BBOB framework code and to run first tests, the currently available downloads can be used. The code for the final experiment and the post-processing obeys the very same interface and will be released soon (see above).
You can subscribe to (or unsubscribe from) our discussion mailing list by following this link: http://lists.lri.fr/cgi-bin/mailman/listinfo/bbob-discuss
To receive announcements about the workshop, send an email to the BBOB team at bbob_at_lri.fr with the subject “register to BBOB announcement list”.
Anne Auger, Bernd Bischl, Dimo Brockhoff, Nikolaus Hansen, Olaf Mersmann, Petr Pošík, Heike Trautmann