🚀 Feature
It would be nice if Theseus supported bound-constrained non-linear least-squares optimisation. For feature parity with scipy.optimize.least_squares on this topic, this would mean:
implementing the dogbox variant of Powell's dogleg optimiser
augmenting the trust region approach to provide a Trust Region Reflective (trf) method
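For reference, this is how the two methods are selected in scipy.optimize.least_squares today (a minimal, self-contained sketch; the fitting problem itself is illustrative):

```python
import numpy as np
from scipy.optimize import least_squares

# Residuals of a simple exponential fit y = a * exp(b * t). The true
# parameters (a=2.0, b=-1.5) are chosen so that b lies outside the box,
# forcing the bound-constrained solver to clip the solution at a bound.
def residuals(p, t, y):
    return p[0] * np.exp(p[1] * t) - y

t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)

# method="trf" is the Trust Region Reflective solver; method="dogbox"
# selects the rectangular-trust-region dogleg variant instead.
res = least_squares(residuals, x0=[1.0, -0.5],
                    bounds=([0.5, -1.0], [3.0, 0.0]),
                    args=(t, y), method="trf")
# res.x satisfies 0.5 <= a <= 3.0 and -1.0 <= b <= 0.0
```

Offering the same `bounds=(lb, ub)` interface on Theseus' existing optimisers would close this gap.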
Motivation
Theseus already provides Trust Region and Dogleg optimisers. Hopefully these can serve as a strong basis for extensions to dogbox and trf.
Pitch
Having support for various constrained optimisation approaches in Theseus would be fantastic. Bound constraints probably cover a large proportion of user needs in this space.
Alternatives
Let the user:
Add soft constraints through adding terms to the objective function
Add hard unreachable bounds through reparameterisation with sigmoid / softplus functions
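The second alternative can be sketched as follows (a minimal example using scipy; `to_bounded` and the toy problem are illustrative names, not a Theseus API):

```python
import numpy as np
from scipy.optimize import least_squares

def to_bounded(z, lo, hi):
    # Sigmoid map from the unconstrained real line onto the open
    # interval (lo, hi); the bounds themselves are unreachable.
    return lo + (hi - lo) / (1.0 + np.exp(-z))

# Toy fit of y = a * t with a constrained to (0, 1): optimise over the
# unconstrained variable z and recover a = to_bounded(z, 0, 1) afterwards.
t = np.linspace(0.0, 1.0, 10)
y = 0.7 * t

def residuals(z):
    a = to_bounded(z[0], 0.0, 1.0)
    return a * t - y

res = least_squares(residuals, x0=[0.0])  # plain unconstrained solve
a_hat = to_bounded(res.x[0], 0.0, 1.0)    # strictly inside (0, 1)
```

This works with any unconstrained optimiser but distorts the problem's conditioning near the bounds, which is part of the motivation for native dogbox/trf support.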
Additional context
A previous feature request related to this can be found here #484. I'm not sure why it was closed.
An experimental PR for equality constraints can be found here: #457.
dogbox reference:
C. Voglis and I. E. Lagaris, “A Rectangular Trust Region Dogleg Approach for Unconstrained and Bound Constrained Nonlinear Optimization”, WSEAS International Conference on Applied Mathematics, Corfu, Greece, 2004. https://www.cs.uoi.gr/~lagaris/papers/PREPRINTS/dogbox.pdf
trf reference:
M. A. Branch, T. F. Coleman, and Y. Li, “A Subspace, Interior, and Conjugate Gradient Method for Large-Scale Bound-Constrained Minimization Problems,” SIAM Journal on Scientific Computing, Vol. 21, Number 1, pp 1-23, 1999. https://doi.org/10.1137/S1064827595289108
An alternative to dogbox and trf, mentioned in Ceres Solver, is to use a projected LM approach with line search, as detailed in:
C. Kanzow, N. Yamashita and M. Fukushima, Levenberg-Marquardt methods with strong local convergence properties for solving nonlinear equations with convex constraints, Journal of Computational and Applied Mathematics, 172(2):375-397, 2004. https://doi.org/10.1016/j.cam.2004.02.013