
Objective Sensitivity #282


Open · wants to merge 14 commits into master

Conversation

andrewrosemberg (Collaborator) commented Feb 22, 2025:

Pull Request Summary

Title: Objective Sensitivity

Description: This pull request adds support for computing the sensitivity of the objective value with respect to parameters in forward mode, along with the equivalent reverse-mode computation, in the DiffOpt.jl package.

  • Additions: 335 lines
  • Deletions: 35 lines
  • Changed Files: 7
  • Commits: 11
  • Comments: 1
  • Review Comments: 4
  • State: Open
  • Mergeable: Yes (unstable)

Usage Example
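For context, here is a minimal model the snippets below could run against. This setup is an assumption for illustration, not part of the PR (the solver choice, variable names, and constraint are hypothetical, and the exact constructor for a parameter-aware differentiable model may differ across DiffOpt versions):

using JuMP, DiffOpt, Ipopt

# Hypothetical setup (not from this PR): a differentiable model with two
# parameters, one on the RHS of the constraint and one on the LHS.
model = Model(() -> DiffOpt.diff_optimizer(Ipopt.Optimizer))
set_silent(model)
@variable(model, x)
@variable(model, p in Parameter(1.0))    # appears on the RHS of `cons`
@variable(model, p_c in Parameter(1.0))  # appears on the LHS as a coefficient
@constraint(model, cons, p_c * x >= 3 * p)
@objective(model, Min, 2 * x)
optimize!(model)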

# Always a good practice to clear previously set sensitivities
DiffOpt.empty_input_sensitivities!(model)

MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(p), Parameter(3.0))
MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(p_c), Parameter(3.0))
DiffOpt.forward_differentiate!(model)

MOI.get(model, DiffOpt.ForwardObjectiveSensitivity())
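The value returned here is the directional derivative of the optimal objective value along the parameter perturbations set above (a step of 3.0 in both `p` and `p_c`).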

In reverse mode, we can compute the parameter perturbation corresponding to a given objective perturbation:

# Always a good practice to clear previously set sensitivities
DiffOpt.empty_input_sensitivities!(model)

MOI.set(
    model,
    DiffOpt.ReverseObjectiveSensitivity(),
    0.1,
)

DiffOpt.reverse_differentiate!(model)

MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(p))


codecov bot commented Feb 22, 2025

Codecov Report

Attention: Patch coverage is 88.88889% with 8 lines in your changes missing coverage. Please review.

Project coverage is 89.18%. Comparing base (93f058d) to head (a15f6f6).

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| src/NonLinearProgram/NonLinearProgram.jl | 83.33% | 6 Missing ⚠️ |
| src/diff_opt.jl | 85.71% | 1 Missing ⚠️ |
| src/moi_wrapper.jl | 88.88% | 1 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #282      +/-   ##
==========================================
+ Coverage   89.08%   89.18%   +0.10%     
==========================================
  Files          15       15              
  Lines        1969     2006      +37     
==========================================
+ Hits         1754     1789      +35     
- Misses        215      217       +2     


andrewrosemberg changed the title from Implement dual of parameter anywhere to [WIP] Objective Sensitivity on Feb 25, 2025
andrewrosemberg changed the title from [WIP] Objective Sensitivity to Objective Sensitivity on Feb 27, 2025
@@ -199,6 +215,21 @@ struct ForwardConstraintDual <: MOI.AbstractConstraintAttribute end

MOI.is_set_by_optimize(::ForwardConstraintDual) = true
andrewrosemberg (Collaborator, Author):

codecov thinks this line is untested, but it is used internally in the tests.

"""
struct ForwardObjectiveSensitivity <: MOI.AbstractModelAttribute end

MOI.is_set_by_optimize(::ForwardObjectiveSensitivity) = true
andrewrosemberg (Collaborator, Author):

ditto, codecov mistake

@frapac left a comment:

this PR looks good to me, overall!



Using Lagrandian duality we could already calculate the objective sensitivity with respect to parameters that appear in the RHS of the constraints (e.g, `cons` in this case for parameter `p`).

typo: Lagrandian -> Lagrangian


I would also mention explicitly that the objective sensitivity w.r.t. a parameter change in the RHS of the constraints is given by the optimal multiplier
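For reference, a sketch of the identity being suggested: for \( \min_x f(x) \) subject to \( g(x) \le b \) with optimal multiplier \( \lambda^\star \) on the constraint, Lagrangian duality gives \( \frac{\partial f^\star}{\partial b} = -\lambda^\star \) (the sign depends on the convention used for the Lagrangian).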



On the other hand, if the parameter appears in the LHS of the constraints, we can calculate the objective sensitivity with respect to the parameter using: the sensitivities of the variables with respect to the parameter, \( \frac{\partial x}{\partial p} \), and the gradient of the objective with respect to the variables \( \frac{\partial f}{\partial x} \):

I would suggest to mention that this is a consequence of the chain-rule
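For reference, the chain-rule identity in question, in the notation above: \( \frac{\mathrm{d} f}{\mathrm{d} p} = \left( \frac{\partial f}{\partial x} \right)^{\top} \frac{\partial x}{\partial p} \), with an extra \( \frac{\partial f}{\partial p} \) term if the parameter also appears directly in the objective.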

MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(p))

It is important to note that the (reverse) parameter perturbation given an objective perturbation is essentially equivalent to the perturbation with respect to the solution (since one can be calculated from the other). Therefore, one cannot set both the objective sensitivity (`DiffOpt.ReverseObjectiveSensitivity`) and the solution sensitivity (e.g. `DiffOpt.ReverseVariablePrimal`) at the same time.

what happens if we set the reverse parameter perturbation both for the objective and for the solution? Does DiffOpt return an error, or does it keep whichever was set last?

andrewrosemberg (Collaborator, Author):

Nice catch, it returns an error, I will add a note here

MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)
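# (assumed context from the surrounding test: DiffOpt.ReverseObjectiveSensitivity()
#  was also set on this model above, so two conflicting reverse seeds are present)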

# Compute derivatives
@test_throws ErrorException DiffOpt.reverse_differentiate!(model)

nice!
