TY - GEN
T1 - An MaOEA/Local Search Hybrid Based on a Fast, Stochastic BFGS Using Achievement Scalarizing Search Directions
AU - de Souza, Regina C.L.C.
AU - Vargas, Denis E.C.
AU - Wanner, Elizabeth
AU - Knowles, Joshua
PY - 2025/2/28
Y1 - 2025/2/28
N2 - We consider the problem of multiobjective and many-objective optimization in the unconstrained, continuous-variable setting. Can modern EAs designed for this setting (such as NSGA-III), which arguably have proven performance, be improved by incorporating local search, and can this be achieved in a general way that does not require excessive parameter tuning? Optimization in this setting is usually found to become increasingly challenging as the number of objectives increases (although some works suggest the contrary), and this is believed to be due to the weakness of the selection pressure available from Pareto comparisons, challenges in maintaining diversity, and/or, in decomposition-based methods, the number of search "directions" that must be managed. To investigate our problem, we propose integrating a many-objective evolutionary algorithm (MaOEA) with local-search techniques based on derivative-free BFGS-like algorithms. This is done in two slightly different ways, both using achievement scalarizing functions. Our results on well-known benchmark functions suggest that a significant improvement is possible with reasonable assumptions about how to choose the base MaOEA parameters and a principled, general approach to choosing the remaining parameters of the hybrid algorithm. Our findings underline the effectiveness of hybrid methods and suggest that powerful algorithms from mathematical programming can be used even without gradients.
KW - Achievement Scalarizing Functions
KW - BFGS
KW - Local Search
KW - Multiobjective and Many-Objective Problems
UR - http://www.scopus.com/inward/record.url?scp=105000335620&partnerID=8YFLogxK
UR - https://link.springer.com/chapter/10.1007/978-981-96-3506-1_2
U2 - 10.1007/978-981-96-3506-1_2
DO - 10.1007/978-981-96-3506-1_2
M3 - Conference publication
AN - SCOPUS:105000335620
SN - 9789819635054
VL - 15512
T3 - Lecture Notes in Computer Science (LNCS)
SP - 17
EP - 30
BT - Evolutionary Multi-Criterion Optimization
A2 - Singh, Hemant
A2 - Ray, Tapabrata
A2 - Knowles, Joshua
A2 - Li, Xiaodong
A2 - Branke, Juergen
A2 - Wang, Bing
A2 - Oyama, Akira
PB - Springer
T2 - 13th International Conference on Evolutionary Multi-Criterion Optimization, EMO 2025
Y2 - 4 March 2025 through 7 March 2025
ER -