Final report for Google Summer of Code 2024 for multi-objective optimization

IWNMWE/MOO-GSoC-24


Organization: mlpack

Project: Adding NSGA-3 and AGE-MOEA Multi Objective Optimizers

Mentors: Marcus Edel and Omar Shrit

Here are the outcomes achieved by the optimizers and test suites I developed this summer.

[Result plots: NSGA-3 on DTLZ1, AGE-MOEA on Fleming, AGE-MOEA on ZDT3; MOEA/D on MAF1, MOEA/D on MAF3, AGE-MOEA on ZDT1]

Project Overview

Multi-objective optimization is crucial across various fields as it enables the balancing of conflicting goals, such as performance versus cost in engineering or profit versus impact in economics. This approach leads to more comprehensive and effective solutions, better suited to complex real-world challenges.

This project aimed to enhance ensmallen's capabilities in multi-objective optimization by implementing optimizers such as AGE-MOEA and NSGA-3, as well as several problems from the MAF benchmark test suite.
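For reference, the sketch below shows how a multi-objective optimizer is typically driven in ensmallen, using one of the ZDT test problems. It assumes the merged AGEMOEA class follows the same interface as ensmallen's existing NSGA2 optimizer (a default constructor, Optimize() over a tuple of objective functors, and a ParetoFront() accessor), so the exact class names and defaults should be treated as illustrative rather than authoritative.

```cpp
#include <ensmallen.hpp>

int main()
{
  // ZDT1: a standard bi-objective test problem from ensmallen's test suite.
  ens::test::ZDT1<> zdt1;

  // AGEMOEA with default parameters (population size, number of generations,
  // etc. can be passed to the constructor instead); assumed to mirror NSGA2's API.
  ens::AGEMOEA optimizer;

  // Initial decision variables and the tuple of objective functors.
  arma::mat coordinates = zdt1.GetInitialPoint();
  auto objectives = zdt1.GetObjectives();

  // Run the optimizer; `coordinates` is updated to one of the solutions.
  optimizer.Optimize(objectives, coordinates);

  // The approximated Pareto front found by the optimizer.
  arma::cube front = optimizer.ParetoFront();
  front.print("Approximated Pareto front (objective values):");

  return 0;
}
```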

Project Objectives

  • Implement the AGE-MOEA optimizer (Annibale Panichella).
  • Implement the NSGA-3 optimizer (Deb et al., 2013).
  • Implement the MAF benchmark test suite (Cheng et al., 2017).
  • Implement the Inverted Generational Distance (IGD) metric.
  • Conduct thorough testing to ensure the reliability and effectiveness of the implemented algorithms.
  • Provide comprehensive documentation for the newly added optimizers and test suite.

Current State of the Project

The AGE-MOEA optimizer and the IGD performance indicator have been fully implemented and merged into the repository, along with their documentation and tests.
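For context, IGD scores an approximated front A against a reference Pareto front Z by averaging, over the reference points, the distance to the nearest solution in A. The standard definition is below; the merged implementation is expected to follow this form, up to the exact distance and averaging conventions:

$$\mathrm{IGD}(A, Z) = \frac{1}{|Z|} \sum_{z \in Z} \min_{a \in A} d(z, a)$$

where d(z, a) is usually the Euclidean distance in objective space; lower values mean the approximated front covers the reference front more closely.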

The MAF benchmark suite has been fully implemented and tested, and some of the MAF problems have also been added as tests for the optimizers. The PR containing the MAF changes has been approved and is ready to merge.

Simulated Binary Crossover (SBX) has been implemented as part of the AGE-MOEA implementation PR, and hyperplane normalization has been implemented as a separate class in the NSGA-3 implementation PR.
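To show what SBX actually does, here is the textbook single-variable update from Deb and Agrawal's simulated binary crossover; this is a generic sketch for illustration, not the exact code in the PR, and SimulatedBinaryCrossover is a hypothetical helper rather than an ensmallen API.

```cpp
#include <cmath>
#include <random>
#include <utility>

// Textbook SBX on one decision variable: two parent values produce two
// children whose spread around the parents is controlled by the
// distribution index eta (larger eta keeps children closer to the parents).
std::pair<double, double> SimulatedBinaryCrossover(const double parent1,
                                                   const double parent2,
                                                   const double eta,
                                                   std::mt19937& rng)
{
  std::uniform_real_distribution<double> unif(0.0, 1.0);
  const double u = unif(rng);

  // Spread factor beta, chosen so that the offspring distribution mimics
  // single-point crossover on binary-encoded variables.
  const double beta = (u <= 0.5) ?
      std::pow(2.0 * u, 1.0 / (eta + 1.0)) :
      std::pow(1.0 / (2.0 * (1.0 - u)), 1.0 / (eta + 1.0));

  const double child1 = 0.5 * ((1.0 + beta) * parent1 + (1.0 - beta) * parent2);
  const double child2 = 0.5 * ((1.0 - beta) * parent1 + (1.0 + beta) * parent2);

  return { child1, child2 };
}
```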

NSGA-3 has been implemented but still needs testing and further investigation into its slow convergence to the true Pareto front.

Here is the list of PRs (both open & closed) that I created during GSoC:

Pull Request Title | Status
Adding AGEMOEA and IGD #399 | 🟪 Merged
AGEMOEA for Convex Fronts #407 | 🟪 Merged
Adding the MAF Benchmark problems #402 | 🟪 Merged
Adding NSGA3 #406 | 🟩 Open

Challenges

There were several challenges faced throughout all stages of this project:

Debugging Complexity: Debugging these optimizers turned out to be the biggest challenge of the whole project. With many moving parts in each optimizer, thorough code walkthroughs and testing were required to track down errors or discrepancies in the output.

Validating output against reference implementations: Each algorithm allows several variants of some of its components, and the results produced by other libraries depended on which variant they had implemented.

Future Enhancements and Direction

Looking ahead, there are a few areas that could further enhance the project:

Improve the runtime of the optimizers: There are various areas in which the runtime of the multi-objective optimizers can be improved, leading to better usability (the first step would be to benchmark all the algorithms on a common set of problems).

Abstracting the offspring generation and selection procedures: This would allow easy testing of custom or alternative offspring generation and selection procedures.

Acknowledgements

I would like to thank the mlpack community, especially my mentors Marcus Edel and Omar Shrit. Their expertise and insight helped shape my understanding, and the regular code reviews helped me stick to the timeline.
