🚀 Add NonLinearProgram Support to DiffOpt.jl #260

Open
wants to merge 21 commits into
base: master
Choose a base branch
from
Open
Show file tree
Hide file tree
Changes from all commits
Commits
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,7 +1,7 @@
name = "DiffOpt"
uuid = "930fe3bc-9c6b-11ea-2d94-6184641e85e7"
authors = ["Akshay Sharma", "Mathieu Besançon", "Joaquim Dias Garcia", "Benoît Legat", "Oscar Dowson"]
version = "0.4.2"
version = "0.5.0"

[deps]
BlockDiagonals = "0a1fb500-61f7-11e9-3c65-f5ef3456f9f0"
8 changes: 4 additions & 4 deletions docs/src/index.md
@@ -1,10 +1,10 @@
# DiffOpt.jl

-[DiffOpt.jl](https://github.com/jump-dev/DiffOpt.jl) is a package for differentiating convex optimization program ([JuMP.jl](https://github.com/jump-dev/JuMP.jl) or [MathOptInterface.jl](https://github.com/jump-dev/MathOptInterface.jl) models) with respect to program parameters. Note that this package does not contain any solver.
+[DiffOpt.jl](https://github.com/jump-dev/DiffOpt.jl) is a package for differentiating convex and non-convex optimization programs ([JuMP.jl](https://github.com/jump-dev/JuMP.jl) or [MathOptInterface.jl](https://github.com/jump-dev/MathOptInterface.jl) models) with respect to program parameters. Note that this package does not contain any solver.
This package has two major backends, available via the `reverse_differentiate!` and `forward_differentiate!` methods, to differentiate models (quadratic or conic) with optimal solutions.

!!! note
-    Currently supports *linear programs* (LP), *convex quadratic programs* (QP) and *convex conic programs* (SDP, SOCP, exponential cone constraints only).
+    Currently supports *linear programs* (LP), *convex quadratic programs* (QP), *convex conic programs* (SDP, SOCP, exponential cone constraints only), and *general nonlinear programs* (NLP).


## Installation
@@ -16,8 +16,8 @@ DiffOpt can be installed through the Julia package manager:

## Why are Differentiable optimization problems important?

-Differentiable optimization is a promising field of convex optimization and has many potential applications in game theory, control theory and machine learning (specifically deep learning - refer [this video](https://www.youtube.com/watch?v=NrcaNnEXkT8) for more).
-Recent work has shown how to differentiate specific subclasses of convex optimization problems. But several applications remain unexplored (refer section 8 of this [really good thesis](https://github.com/bamos/thesis)). With the help of automatic differentiation, differentiable optimization can have a significant impact on creating end-to-end differentiable systems to model neural networks, stochastic processes, or a game.
+Differentiable optimization is a promising field of constrained optimization and has many potential applications in game theory, control theory and machine learning (specifically deep learning - refer to [this video](https://www.youtube.com/watch?v=NrcaNnEXkT8) for more).
+Recent work has shown how to differentiate specific subclasses of constrained optimization problems. But several applications remain unexplored (refer to section 8 of this [really good thesis](https://github.com/bamos/thesis)). With the help of automatic differentiation, differentiable optimization can have a significant impact on creating end-to-end differentiable systems to model neural networks, stochastic processes, or a game.


## Contributing
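A minimal sketch of the `reverse_differentiate!` entry point described above, assuming a small quadratic program with HiGHS as the inner solver (the solver and the model are illustrative only, not part of this diff):

```julia
using JuMP, DiffOpt, HiGHS
import MathOptInterface as MOI

# Wrap an ordinary QP solver in a differentiable optimizer.
model = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
@variable(model, x[1:2])
@constraint(model, cons, sum(x) >= 1)
@objective(model, Min, x[1]^2 + x[2]^2)
optimize!(model)

# Reverse mode: seed d(loss)/dx on the solution and pull the gradient
# back onto the problem data (here, the objective function).
MOI.set.(model, DiffOpt.ReverseVariablePrimal(), x, ones(2))
DiffOpt.reverse_differentiate!(model)
grad_obj = MOI.get(model, DiffOpt.ReverseObjectiveFunction())
```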
13 changes: 8 additions & 5 deletions docs/src/manual.md
@@ -1,9 +1,5 @@
# Manual

-!!! note
-    As of now, this package only works for optimization models that can be written either in convex conic form or convex quadratic form.


## Supported objectives & constraints - scheme 1

For `QPTH`/`OPTNET` style backend, the package supports following `Function-in-Set` constraints:
@@ -16,6 +12,12 @@ For `QPTH`/`OPTNET` style backend, the package supports following `Function-in-S
| `ScalarAffineFunction` | `GreaterThan` |
| `ScalarAffineFunction` | `LessThan` |
| `ScalarAffineFunction` | `EqualTo` |
+| `ScalarQuadraticFunction` | `GreaterThan` |
+| `ScalarQuadraticFunction` | `LessThan` |
+| `ScalarQuadraticFunction` | `EqualTo` |
+| `ScalarNonlinearFunction` | `GreaterThan` |
+| `ScalarNonlinearFunction` | `LessThan` |
+| `ScalarNonlinearFunction` | `EqualTo` |

and the following objective types:

@@ -24,6 +26,7 @@ and the following objective types:
| `VariableIndex` |
| `ScalarAffineFunction` |
| `ScalarQuadraticFunction` |
+| `ScalarNonlinearFunction` |


## Supported objectives & constraints - scheme 2
@@ -71,7 +74,7 @@ DiffOpt requires taking projections and finding projection gradients of vectors
## Conic problem formulation

!!! note
-    As of now, the package is using `SCS` geometric form for affine expressions in cones.
+    As of now, when defining a conic or convex quadratic problem, the package uses the `SCS` geometric form for affine expressions in cones.

Consider a convex conic optimization problem in its primal (P) and dual (D) forms:
```math
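A minimal model that exercises the newly added `ScalarNonlinearFunction` rows in the tables above, assuming Ipopt as the inner solver (the particular constraints and objective are illustrative only):

```julia
using JuMP, DiffOpt, Ipopt

model = Model(() -> DiffOpt.diff_optimizer(Ipopt.Optimizer))
@variable(model, x >= 0.1)
@variable(model, y >= 0.1)

# ScalarNonlinearFunction-in-LessThan and ScalarNonlinearFunction-in-EqualTo:
@constraint(model, exp(x) + y^2 <= 4)
@constraint(model, log(x) + y == 1)

# ScalarNonlinearFunction objective:
@objective(model, Min, sin(x) + x * y)
optimize!(model)
```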
2 changes: 1 addition & 1 deletion docs/src/reference.md
@@ -4,5 +4,5 @@
```

```@autodocs
-Modules = [DiffOpt, DiffOpt.QuadraticProgram, DiffOpt.ConicProgram]
+Modules = [DiffOpt, DiffOpt.QuadraticProgram, DiffOpt.ConicProgram, DiffOpt.NonLinearProgram]
```
35 changes: 35 additions & 0 deletions docs/src/usage.md
@@ -56,3 +56,38 @@ MOI.set(model, DiffOpt.ForwardObjectiveFunction(), ones(2) ⋅ x)
DiffOpt.forward_differentiate!(model)
grad_x = MOI.get.(model, DiffOpt.ForwardVariablePrimal(), x)
```

3. To differentiate a general nonlinear program, we can use the `forward_differentiate!` method: perturbations in the objective function and constraints enter through perturbations in the problem's parameters. For example, consider the following nonlinear program:
```julia
model = Model(() -> DiffOpt.diff_optimizer(Ipopt.Optimizer))
@variable(model, p ∈ MOI.Parameter(0.1))
@variable(model, x >= p)
@variable(model, y >= 0)
@objective(model, Min, x^2 + y^2)
@constraint(model, con, x + y >= 1)

# Solve
JuMP.optimize!(model)

# Set the parameter perturbation
MOI.set(model, DiffOpt.ForwardParameter(), p, 0.2)

# Forward differentiate
DiffOpt.forward_differentiate!(model)

# Retrieve sensitivities
dx = MOI.get(model, DiffOpt.ForwardVariablePrimal(), x)
dy = MOI.get(model, DiffOpt.ForwardVariablePrimal(), y)
```
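A quick sanity check on these sensitivities (an illustrative sketch; `set_parameter_value` is standard JuMP, while `ForwardParameter` is the attribute introduced in this PR): with `p = 0.1` the bound `x >= p` is inactive at the optimum `x = y = 0.5`, so `dx` and `dy` should be approximately zero; for `0.5 < p < 1` the bound is active, the optimum is `x* = p`, `y* = 1 - p`, and the perturbation of `0.2` should give `dx ≈ 0.2` and `dy ≈ -0.2`:

```julia
# At p = 0.1 the bound x >= p is inactive (optimum x = y = 0.5), so the
# sensitivities with respect to p vanish.
@assert isapprox(dx, 0.0; atol = 1e-6)
@assert isapprox(dy, 0.0; atol = 1e-6)

# Re-solve with the bound active: for 0.5 < p < 1, x* = p and y* = 1 - p.
set_parameter_value(p, 0.7)
JuMP.optimize!(model)
MOI.set(model, DiffOpt.ForwardParameter(), p, 0.2)
DiffOpt.forward_differentiate!(model)
dx = MOI.get(model, DiffOpt.ForwardVariablePrimal(), x)  # ≈ 0.2
dy = MOI.get(model, DiffOpt.ForwardVariablePrimal(), y)  # ≈ -0.2
```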

Alternatively, we can use the `reverse_differentiate!` method:
```julia
# Set the primal perturbation (the reverse-mode seed)
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)

# Reverse differentiation
DiffOpt.reverse_differentiate!(model)

# Retrieve the reverse sensitivity, i.e. the seed pulled back to the parameter
dp = MOI.get(model, DiffOpt.ReverseParameter(), p)
```
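The reverse result follows the same logic as the forward-mode check above: seeding `1.0` on `x` asks for `d(x*)/dp`, which is zero while the bound `x >= p` is inactive and one once it is active (a sketch under the same assumptions as before):

```julia
# At p = 0.1 the bound x >= p is inactive, so the seed on x pulls back to zero.
@assert isapprox(dp, 0.0; atol = 1e-6)

# With the bound active (e.g. after set_parameter_value(p, 0.7) and
# re-solving), x* = p, so the same seed gives dp ≈ 1.0.
```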
2 changes: 2 additions & 0 deletions src/DiffOpt.jl
Original file line number Diff line number Diff line change
@@ -25,6 +25,7 @@ include("bridges.jl")

include("QuadraticProgram/QuadraticProgram.jl")
include("ConicProgram/ConicProgram.jl")
include("NonLinearProgram/NonLinearProgram.jl")

"""
add_all_model_constructors(model)
@@ -35,6 +36,7 @@ Add all constructors of [`AbstractModel`](@ref) defined in this package to
function add_all_model_constructors(model)
    add_model_constructor(model, QuadraticProgram.Model)
    add_model_constructor(model, ConicProgram.Model)
+    add_model_constructor(model, NonLinearProgram.Model)
    return
end
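A sketch of how this registration is exercised, assuming that `DiffOpt.diff_optimizer` calls `add_all_model_constructors` on the optimizer it constructs (the registered backends are then tried until one accepts the loaded problem):

```julia
import DiffOpt, Ipopt

# All three backends are registered on the wrapped optimizer; a model with a
# nonlinear objective or nonlinear constraints is routed to
# NonLinearProgram.Model, while QPs and conic problems keep their backends.
optimizer = DiffOpt.diff_optimizer(Ipopt.Optimizer)
```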
