Objective Sensitivity #282
Conversation
Codecov Report
Attention: Patch coverage is …

Additional details and impacted files:

```
@@            Coverage Diff            @@
##           master     #282     +/-   ##
=========================================
+ Coverage   89.08%   89.18%   +0.10%
=========================================
  Files          15       15
  Lines        1969     2006      +37
=========================================
+ Hits         1754     1789      +35
- Misses        215      217       +2
```
```
@@ -199,6 +215,21 @@ struct ForwardConstraintDual <: MOI.AbstractConstraintAttribute end

MOI.is_set_by_optimize(::ForwardConstraintDual) = true
```
Codecov thinks this line is untested, but it is used internally in the tests.
""" | ||
struct ForwardObjectiveSensitivity <: MOI.AbstractModelAttribute end | ||
|
||
MOI.is_set_by_optimize(::ForwardObjectiveSensitivity) = true |
Ditto, a Codecov mistake.
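For context, a minimal sketch of how the new forward attribute might be used end to end. The model setup and the `DiffOpt.ForwardConstraintSet` seeding call are assumptions patterned on DiffOpt's parameter API and on the reverse-mode snippet quoted later in this conversation; only `ForwardObjectiveSensitivity` itself comes from this diff.

```
# Sketch only: everything except ForwardObjectiveSensitivity is assumed,
# patterned on DiffOpt's existing parameter API.
using JuMP, DiffOpt, HiGHS

model = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
@variable(model, x)
@variable(model, p in Parameter(3.0))
@constraint(model, cons, x >= 2p)   # p enters the RHS
@objective(model, Min, 2x)
optimize!(model)

# Seed dp = 1 and push it forward through the solution map
MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(p), Parameter(1.0))
DiffOpt.forward_differentiate!(model)

# New in this PR: read the objective sensitivity df/dp
df_dp = MOI.get(model, DiffOpt.ForwardObjectiveSensitivity())
```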
this PR looks good to me, overall!
Using Lagrandian duality we could already calculate the objective sensitivity with respect to parameters that appear in the RHS of the constraints (e.g, `cons` in this case for parameter `p`). |
typo: Lagrandian -> Lagrangian
I would also mention explicitly that the objective sensitivity w.r.t. a parameter change in the RHS of the constraints is given by the optimal multiplier
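For reference, a sketch of the duality fact being invoked: if \( \lambda^{\star} \) is the optimal multiplier of the constraint whose RHS the parameter \( p \) enters, then, up to the solver's sign convention,

\[
\frac{\partial f}{\partial p} = \lambda^{\star}.
\]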
On the other hand, if the parameter appears in the LHS of the constraints, we can calculate the objective sensitivity with respect to the parameter using: the sensitivities of the variables with respect to the parameter, \( \frac{\partial x}{\partial p} \), and the gradient of the objective with respect to the variables \( \frac{\partial f}{\partial x} \): |
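The formula the colon introduces is presumably the chain-rule composition of the two quantities just listed (a reconstruction, not the PR's rendered text):

\[
\frac{\partial f}{\partial p} = \frac{\partial f}{\partial x} \, \frac{\partial x}{\partial p}.
\]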
I would suggest mentioning that this is a consequence of the chain rule.
```
MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(p))
```
It is important to note that the (reverse) parameter perturbation given an objective perturbation carries the same information as the perturbation with respect to the solution, since one can be calculated from the other. Therefore, one cannot set both the objective sensitivity (`DiffOpt.ReverseObjectiveSensitivity`) and the solution sensitivity (e.g. `DiffOpt.ReverseVariablePrimal`) at the same time.
What happens if we set the reverse parameter perturbation both for the objective and for the solution? Does DiffOpt return an error, or does it keep whichever was set last?
Nice catch! It returns an error; I will add a note here.
```
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)

# Compute derivatives: this throws because the objective sensitivity
# (DiffOpt.ReverseObjectiveSensitivity) is presumably also set earlier
# in this test, and the two cannot be combined.
@test_throws ErrorException DiffOpt.reverse_differentiate!(model)
```
nice!
Pull Request Summary

Title: Objective Sensitivity

Description: This pull request implements functionality for obtaining the objective sensitivity with respect to parameters, and its reverse-mode equivalent, in the DiffOpt.jl package.

Usage Example

In the backward mode, we can calculate the parameter perturbation with respect to the objective perturbation:
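A sketch assembled from the snippets quoted in the conversation above; the exact value signature of `ReverseObjectiveSensitivity` is an assumption, and `model` and `p` are assumed to be set up as in the forward sketch earlier.

```
# Seed the objective perturbation df (value signature is an assumption)
MOI.set(model, DiffOpt.ReverseObjectiveSensitivity(), 1.0)

# Pull the objective perturbation back to the parameters
DiffOpt.reverse_differentiate!(model)

# Read the implied parameter perturbation dp
dp = MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(p))
```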