
Conversation

@SebastianM-C (Contributor) commented May 17, 2025

This package directly uses the C interface in Ipopt.jl.
The implementation is based on OptimizationMOI, but it also adds Ipopt-specific elements, such as the callback handling.

This PR is an initial draft and needs more work.

The advantages of this approach over OptimizationMOI are:

  • correct, solver-specific callback handling
  • support for the common maxiters, maxtime, and reltol arguments (see the usage sketch below)
  • slightly lower loading times, since MOI does not need to be loaded

I copied over some of the code for handling symbolic systems, since I'm not yet sure I understand how it works or why we essentially re-implement SymbolicUtils' codegen.

I think that some of the functionality of OptimizationMOI might be common enough that it could be moved to an upstream package to avoid code duplication.

I tried to give credit to the authors of the functionality that I copied over from OptimizationMOI by including them as co-authors on the commit. Let me know if that's okay.
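
As a rough illustration of the points above, here is a minimal usage sketch (hedged: it assumes the exported solver type is named `IpoptOptimizer` and the standard Optimization.jl callback signature; the exact names may differ in the final version):

```julia
# Hypothetical usage sketch; `IpoptOptimizer` and the exact keyword handling are
# assumptions based on the PR description, not a verbatim excerpt from this PR.
using Optimization, OptimizationIpopt
using ForwardDiff  # Ipopt needs gradients/Hessians, so use an AD backend that provides them

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])

# Standard Optimization.jl callback: return `true` to stop the solve early.
callback = (state, loss) -> begin
    @info "Ipopt iteration" loss
    return false
end

sol = solve(prob, IpoptOptimizer();
    maxiters = 1000, maxtime = 60.0, reltol = 1e-8, callback = callback)
```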

Checklist

  • Appropriate tests were added
  • Any code changes were done in a way that does not break public API
  • All documentation related to code changes was updated
  • The new code follows the contributor guidelines, in particular the SciML Style Guide and COLPRAC.
  • Any new documentation only uses public API


SebastianM-C and others added 13 commits August 22, 2025 11:48
This package directly uses the C interface in Ipopt.jl.
The implementation is based on OptimizationMOI,
but it also adds Ipopt-specific elements, such as the
callback handling.

Co-authored-by: Vaibhav Dixit <[email protected]>
Co-authored-by: Valentin Kaisermayer <[email protected]>
Co-authored-by: Fredrik Bagge Carlson <[email protected]>
Co-authored-by: Oscar Dowson <[email protected]>
…mples

- Add tests inspired by Ipopt C++ examples (recursive NLP, MyNLP, Luksan-Vlcek problems)
- Add tests for various optimization problem types:
  * Optimal control problems
  * Portfolio optimization
  * Geometric programming
  * Parameter estimation/curve fitting
  * Network flow problems
  * Robust optimization
- Add tests for advanced Ipopt features:
  * Custom tolerances and convergence criteria
  * Different linear solvers and scaling options
  * Barrier parameter (mu) strategies
  * Fixed variable handling
  * Derivative testing
- Add tests for different Hessian approximation methods (BFGS, SR1)
- Test warm start capabilities
- Add stress tests for high-dimensional and highly nonlinear problems
- Update OptimizationIpopt to support passing arbitrary Ipopt options via kwargs
- Use correct wrapper parameter names (verbose, maxiters) instead of Ipopt internals
- Add documentation for the test suite

This significantly improves test coverage and ensures the wrapper properly handles
various problem types and Ipopt-specific features.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
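
To illustrate the option handling mentioned in the commit message, a hedged sketch follows (the wrapper-level names `maxiters` and `verbose` come from the commit message; `hessian_approximation` and `mu_strategy` are standard Ipopt option names, and the pass-through behavior shown here is an assumption about the intended API):

```julia
# Hypothetical sketch of option passing; assumes unrecognized keyword arguments
# are forwarded to Ipopt verbatim, as the commit message above describes.
using Optimization, OptimizationIpopt, ForwardDiff

optf = OptimizationFunction((u, p) -> sum(abs2, u .- p), Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(3), ones(3))

sol = solve(prob, IpoptOptimizer();
    maxiters = 200,                            # wrapper-level name, not Ipopt's max_iter
    verbose = false,                           # wrapper-level name, not Ipopt's print_level
    hessian_approximation = "limited-memory",  # Ipopt option, forwarded as-is
    mu_strategy = "adaptive")                  # Ipopt option, forwarded as-is
```
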
@SebastianM-C changed the title [WIP] add initial version of OptimizationIpopt → Add initial version of OptimizationIpopt Aug 22, 2025
@SebastianM-C marked this pull request as ready for review August 22, 2025 12:02
@SebastianM-C (Contributor Author)

@ChrisRackauckas the tests are passing now: https://github.com/SciML/Optimization.jl/actions/runs/17166591928/job/48708247846?pr=915

I think this should be ready; let me know what you think.

return inds
end

function eval_hessian_lagrangian(cache::IpoptCache{T},
Member

this seems like the wrong spot to go into this detail here... but can do it for now

Comment on lines +39 to +40
u, z_L, z_U = zeros(n), zeros(n), zeros(n)
g, lambda = zeros(m), zeros(m)
Member

would be good to make these cached in the future

Contributor Author

If we cache them, though, we would not be able to record the evolution. From what I've seen in other optimizers, the internal metrics are not cached, and you can record the evolution instead of needing a specialized callback. I'll have to check what the performance impact is, but I think that if you want to log these, you'd have to allocate new vectors anyway.

Member

Yes, but you don't need to always allocate: it can just modify in-place, and then allocate only if someone wants to log.
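
A minimal sketch of the pattern being suggested here (purely illustrative; `IterateBuffers`, `load_iterate!`, and `record!` are hypothetical names, not code from this PR): keep preallocated buffers, overwrite them in-place each iteration, and only pay for allocation when a callback actually wants to record the trajectory.

```julia
# Hypothetical sketch of the in-place pattern discussed above.
struct IterateBuffers{T}
    u::Vector{T}
    z_L::Vector{T}
    z_U::Vector{T}
    g::Vector{T}
    lambda::Vector{T}
end

IterateBuffers{T}(n, m) where {T} =
    IterateBuffers(zeros(T, n), zeros(T, n), zeros(T, n), zeros(T, m), zeros(T, m))

# Each iteration overwrites the same buffers instead of allocating fresh vectors...
function load_iterate!(buf::IterateBuffers, u, z_L, z_U, g, lambda)
    copyto!(buf.u, u); copyto!(buf.z_L, z_L); copyto!(buf.z_U, z_U)
    copyto!(buf.g, g); copyto!(buf.lambda, lambda)
    return buf
end

# ...and only a logging callback pays for allocation, by copying on demand.
record!(history, buf::IterateBuffers) = push!(history, deepcopy(buf))
```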

@ChrisRackauckas merged commit 8f8e4ff into SciML:master Aug 25, 2025
49 of 71 checks passed