Conversation
Need to add a couple more tests before the PR is ready to be reviewed. Will be done in 1-2 days.
Co-authored-by: Bethany Nicholson <blnicho@users.noreply.github.com>
Codecov Report
❌ Patch coverage is
Additional details and impacted files

@@            Coverage Diff             @@
##             main    #3860      +/-   ##
==========================================
+ Coverage   89.47%   89.69%   +0.21%
==========================================
  Files         908      909       +1
  Lines      106738   106917     +179
==========================================
+ Hits        95509    95902     +393
+ Misses      11229    11015     -214
jsiirola left a comment
This is really good. I don't think that I found anything that is incorrect, but there are lots of opportunities to make the code cleaner / more efficient.
One general note: you create a lot of intermediate data structures as sets (usually ComponentSet), but you never test for membership in the set, and the sets don't look like they are being used for deduplication (only iteration later). In that case it is significantly more efficient to make them lists rather than hashed (set or dict) data structures.
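As a pure-Python illustration of this point (plain `list` and `set` stand in for Pyomo's ComponentSet, which is likewise a hashed container), building a collection that is only ever iterated is cheaper as a list:

```python
# Illustration (plain Python, not Pyomo): when a collection is built once
# and then only iterated, a list avoids the per-insert hashing cost of a
# set and also preserves deterministic iteration order.
import timeit

items = list(range(5000))

def build_list():
    out = []
    for x in items:
        out.append(x)  # no hashing; keeps insertion order
    return out

def build_set():
    out = set()
    for x in items:
        out.add(x)  # pays a hash per insert; only worth it for membership tests
    return out

# Timings vary by machine, but the list build is typically faster.
list_t = timeit.timeit(build_list, number=100)
set_t = timeit.timeit(build_set, number=100)
print(f"list: {list_t:.4f}s  set: {set_t:.4f}s")
```
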
The following is a suggestion (that I thought would make this a lot simpler / easier to follow), but it could probably be argued into deferring for later:
In addition to the notes below, I wonder if the transformation could be significantly simpler / easier to follow if there was a bigger refactor? Right now the transformation:

- walks the whole model and collects the variables from all constraints (and objectives), and does a fair amount of preliminary error checking (e.g., verifying the completeness of the parametrize_wrt set)
- walks the model a second time collecting the constraints, and sorts them into internal data structures (equality / inequality constraint lists, with bound information)
- loops through the variables and builds bound expressions
- walks the model and collects objectives to create the Lagrangean
- loops through the constraint lists and adds them to the Lagrangean
- loops through the variable bounds and adds them to the Lagrangean
- computes the derivative of the Lagrangean and adds the stationarity constraints
- loops through the inequality constraint list and defines the complementary slackness constraints
- loops through the variable bounds and defines the complementary slackness conditions

This is a lot of repeated work (and requires making a lot of somewhat unintuitive internal data structures). I wonder if you can avoid most of these data structures and the complexity by basically reversing the order you do things:
- initialize:
  - lagrangean = 0
  - var_set = ComponentSet()
  - create the block to hold the KKT system (but don't add it to the model)
  - 2 empty mappings (component -> dual variable(s); dual variable -> component)
- collect the active Objectives from the model. This should be a single-element list (if not, error).
  - collect the variables from the objective into the var_set
  - add the expression to the Lagrangean
- collect the active ConstraintDatas from the model; for each one:
  - collect the variables from the constraint into the var_set
  - determine if it is an equality constraint; if yes:
    - get the multiplier, add the term (gamma * (body - ub)) to the Lagrangean
    - add the gamma -> con and con -> gamma to the mappings
  - if no:
    - convert con to lower, body, upper
    - if lower is not None:
      - get a multiplier, add the term (alpha1 * (lower - body)) to the Lagrangean
      - add the dual / expression complementarity condition to the model (you can use the same ComplementarityList you were using before)
      - add the alpha1 -> con to the mapping
    - if upper is not None:
      - get a multiplier, add the term (alpha2 * (body - upper)) to the Lagrangean
      - add the dual / expression complementarity condition to the model
      - add the alpha2 -> con to the mapping
    - add the con -> (alpha1, alpha2) to the mappings
- Now that we have all the variables, do the error checking on parametrize_wrt
- Loop through the variables:
  - get multipliers and add terms to the Lagrangean and define complementary slackness (as with inequality constraints above)
- Now we have a full Lagrangean and the correct var_set: take the derivative and define the stationarity constraints.
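As a concrete illustration of that single-pass flow, here is a minimal pure-Python sketch (plain tuples and strings stand in for Pyomo constraints, multipliers, and expressions; all names are hypothetical):

```python
# Single-pass sketch: classify each constraint once and grow the Lagrangean
# and the complementarity list as we go, instead of walking the model
# several times. A constraint is a (name, lower, body, upper) tuple and
# expressions are plain strings.
def build_lagrangean(objective, constraints):
    lagrangean = [objective]   # terms of the Lagrangean expression
    con_to_mult = {}           # constraint name -> multiplier name(s)
    complementarity = []       # (multiplier, inequality expression) pairs

    for name, lower, body, upper in constraints:
        if lower is not None and lower == upper:     # equality constraint
            gamma = f"gamma[{name}]"
            lagrangean.append(f"{gamma}*({body} - {upper})")
            con_to_mult[name] = gamma
            continue
        mults = []
        if lower is not None:                        # lower-bound inequality
            alpha = f"alpha_lb[{name}]"
            lagrangean.append(f"{alpha}*({lower} - {body})")
            complementarity.append((alpha, f"{lower} - {body}"))
            mults.append(alpha)
        if upper is not None:                        # upper-bound inequality
            alpha = f"alpha_ub[{name}]"
            lagrangean.append(f"{alpha}*({body} - {upper})")
            complementarity.append((alpha, f"{body} - {upper}"))
            mults.append(alpha)
        # ranged constraints map to a tuple of multipliers
        con_to_mult[name] = tuple(mults) if len(mults) > 1 else mults[0]

    return " + ".join(lagrangean), con_to_mult, complementarity
```

The point is control flow, not the data types: every constraint is visited exactly once, and all the bookkeeping (multiplier mappings, complementarity pairs) falls out of that one pass.
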
Last note (that should NOT be part of this PR, but we should consider for the future): As the model is likely to go out through the NL writer, we could save on subsequent computation time by the judicious use of Expression objects. This could involve rewriting the primal constraints or making the NL writer smarter about identifying repeated expressions.
pyomo/core/plugins/transform/kkt.py (Outdated)

    for con in model.component_data_objects(
        Constraint, descend_into=True, active=True
    ):
        if con.has_lb() and con.has_ub() and (con.lower == con.upper):
This is a bit of a nuanced thing, but Pyomo stores the "raw" relational expression from the user in the Constraint. Asking for lower, or body, or upper forces it to "normalize" the constraint with a call to to_bounded_expression. It would be significantly more efficient to only call that once with something like:

    lower, body, upper = con.to_bounded_expression()

and then has_lb is equivalent to lower is not None.
Testing for equality is also tricky: con.equality is probably your best bet. It will return True for EqualityExpressions (i.e., constraints created with ==), and for RangedExpressions where lower is upper. That is a bit restrictive, but the problem with testing with == is that if the bounds are "not potentially variable" expressions (i.e., expressions of mutable Params), then == will create a new equality expression that will raise an exception in your "if" (evaluating a non-constant expression in a boolean context).
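That failure mode can be reproduced outside Pyomo with a small class whose `==` builds an expression node instead of evaluating it, mirroring the behavior of comparing two non-constant expressions:

```python
# Mimic a symbolic expression type: `==` returns a new expression node
# rather than a bool, and using that node in a boolean context raises --
# just like comparing two non-constant Pyomo expressions in an `if`.
class Expr:
    def __init__(self, name):
        self.name = name

    def __eq__(self, other):
        # Builds an expression instead of evaluating the comparison
        return Expr(f"({self.name} == {other.name})")

    def __bool__(self):
        raise ValueError(
            f"Cannot evaluate non-constant expression {self.name} as a bool"
        )

lower, upper = Expr("p_lo"), Expr("p_hi")
try:
    if lower == upper:  # `==` returns an Expr; `if` then calls __bool__
        pass
except ValueError as exc:
    caught = str(exc)
```
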
In the end, I would suggest logic something like:

    lower, body, upper = con.to_bounded_expression()
    if con.equality:
        equality_cons.append(con)
        info.equality_con_to_expr[con] = upper - body
    else:
        if lower is not None:
            inequality_cons.append((con, "lb"))
            info.inequality_con_to_expr.setdefault(con, {})["lb"] = lower - body
        if upper is not None:
            inequality_cons.append((con, "ub"))
            info.inequality_con_to_expr.setdefault(con, {})["ub"] = body - upper
        if lower is not None and upper is not None:
            # we want to keep track of the ranged constraints because the
            # mapping between multipliers and ranged constraints will be a
            # tuple (to indicate the bound as well) instead of simply the
            # model object
            info.ranged_constraints.add(con)

(You could, of course, save a redundant comparison to None by putting the check for ranged inside one of the previous (lower / upper is not None) tests. But that might be less clear to readers...)
Thanks for the feedback. I have tried to save a redundant comparison to None and added a comment to make it a bit more clear for users:
    lower, body, upper = con.to_bounded_expression()
    if con.equality:
        equality_cons.append(con)
        info.equality_con_to_expr[con] = upper - body
    else:
        if lower is not None:
            inequality_cons.append((con, "lb"))
            info.inequality_con_to_expr.setdefault(con, {})["lb"] = lower - body
        if upper is not None:
            inequality_cons.append((con, "ub"))
            info.inequality_con_to_expr.setdefault(con, {})["ub"] = body - upper
            # lower is not None and upper is not None -> ranged constraint
            if lower is not None:
                # we want to keep track of the ranged constraints because the
                # mapping between multipliers and ranged constraints will be a
                # tuple (to indicate the bound as well) instead of simply the
                # model object
                info.ranged_constraints.add(con)
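Both versions lean on `dict.setdefault` to grow the nested con -> {"lb"/"ub": expr} mapping; as a standalone illustration of that pattern (plain strings stand in for constraints and expressions):

```python
# dict.setdefault returns the existing inner dict when the key is present,
# otherwise it inserts and returns a fresh one -- so the nested mapping can
# grow without a separate membership check before each write.
inequality_con_to_expr = {}

def record(con, bound, expr):
    # bound is "lb" or "ub"; repeated calls for the same con reuse the
    # same inner dict, so a ranged constraint ends up with both keys.
    inequality_con_to_expr.setdefault(con, {})[bound] = expr

record("c1", "lb", "lower - body")
record("c1", "ub", "body - upper")  # second write reuses c1's inner dict
record("c2", "ub", "body - upper")
```
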
pyomo/core/plugins/transform/kkt.py (Outdated)

        f"The KKT multiplier: {multiplier_var.name}, does not exist on {model.name}."
    )

    def get_multiplier_from_constraint(self, model, constraint=None, variable=None):
I wonder if a different / simpler API might be preferable? I am wondering about something like:

    def get_multipliers(self, model, component) -> VarData | tuple[VarData, VarData]:

That is, if component is an equality or (simple) inequality, then return the multiplier. If it is a ranged inequality or a bounded variable, then return a tuple of the multipliers for the lower and upper bounds (for the case of variables, one [or I suppose both] could be None).
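A minimal pure-Python sketch of that return convention (strings stand in for VarData objects; the mapping argument and the error message are hypothetical, not part of the PR's API):

```python
# Sketch of the suggested get_multipliers convention: equality and
# one-sided inequality constraints map to a single multiplier, while
# ranged constraints and bounded variables map to a (lower, upper)
# tuple whose entries may be None when a bound is absent.
from typing import Optional, Tuple, Union

VarData = str  # stand-in for a Pyomo variable object

MultiplierInfo = Union[VarData, Tuple[Optional[VarData], Optional[VarData]]]

def get_multipliers(mapping: dict, component: str) -> MultiplierInfo:
    try:
        return mapping[component]
    except KeyError:
        raise KeyError(
            f"No KKT multiplier is associated with component {component!r}"
        ) from None

mapping = {
    "eq_con": "gamma[eq_con]",                                    # equality
    "range_con": ("alpha_lb[range_con]", "alpha_ub[range_con]"),  # ranged
    "x": (None, "alpha_ub[x]"),                 # variable, upper bound only
}
```

Callers then branch on `isinstance(result, tuple)` to tell the two cases apart, which keeps the public API to a single method.
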
I have simplified the API as per your suggestions.
Co-authored-by: John Siirola <jsiirola@users.noreply.github.com>
Thank you for these comments and suggestions. There is definitely a lot of repeated work, and the transformation can be simplified. Now that I've addressed the other issues, I will open a PR to implement this ASAP.
…'vars_in_kkt_cons'.
Co-authored-by: John Siirola <jsiirola@users.noreply.github.com>
…mo into kkt-transform need to fix merge conflicts.
Fixes # .
Summary/Motivation:
This PR creates a Pyomo transformation for reformulating models using the KKT conditions.
Changes proposed in this PR:
Legal Acknowledgement
By contributing to this software project, I have read the contribution guide and agree to the following terms and conditions for my contribution: