Merge pull request #772 from SciML/Vaibhavdixit02-patch-1
Change the order of args in `cons_vjp` and `cons_jvp`
Vaibhavdixit02 authored Aug 29, 2024
2 parents 252625c + 0c8bb2f commit 0df7b81
Showing 3 changed files with 14 additions and 9 deletions.
12 changes: 6 additions & 6 deletions src/scimlfunctions.jl
@@ -1823,16 +1823,16 @@ function described in [Callback Functions](https://docs.sciml.ai/Optimization/st
then `hess(H,u,p,args...)` or `H=hess(u,p,args...)` should be used.
- `hv(Hv,u,v,p)` or `Hv=hv(u,v,p)`: the Hessian-vector product ``(d^2 f / du^2) v``. If `f` takes additional arguments
then `hv(Hv,u,v,p,args...)` or `Hv=hv(u,v,p, args...)` should be used.
-- `cons(res,x,p)` or `res=cons(x,p)` : the constraints function, should mutate the passed `res` array
+- `cons(res,u,p)` or `res=cons(u,p)` : the constraints function, should mutate the passed `res` array
with value of the `i`th constraint, evaluated at the current values of variables
inside the optimization routine. This takes just the function evaluations
and the equality or inequality assertion is applied by the solver based on the constraint
bounds passed as `lcons` and `ucons` to [`OptimizationProblem`](@ref), in case of equality
constraints `lcons` and `ucons` should be passed equal values.
-- `cons_j(J,x,p)` or `J=cons_j(x,p)`: the Jacobian of the constraints.
-- `cons_jvp(Jv,v,x,p)` or `Jv=cons_jvp(v,x,p)`: the Jacobian-vector product of the constraints.
-- `cons_vjp(Jv,v,x,p)` or `Jv=cons_vjp(v,x,p)`: the vector-Jacobian product of the constraints.
-- `cons_h(H,x,p)` or `H=cons_h(x,p)`: the Hessian of the constraints, provided as
+- `cons_j(J,u,p)` or `J=cons_j(u,p)`: the Jacobian of the constraints.
+- `cons_jvp(Jv,u,v,p)` or `Jv=cons_jvp(u,v,p)`: the Jacobian-vector product of the constraints.
+- `cons_vjp(Jv,u,v,p)` or `Jv=cons_vjp(u,v,p)`: the vector-Jacobian product of the constraints.
+- `cons_h(H,u,p)` or `H=cons_h(u,p)`: the Hessian of the constraints, provided as
an array of Hessians with `res[i]` being the Hessian with respect to the `i`th output on `cons`.
- `hess_prototype`: a prototype matrix matching the type that matches the Hessian. For example,
if the Hessian is tridiagonal, then an appropriately sized `Hessian` matrix can be used
Expand All @@ -1845,7 +1845,7 @@ function described in [Callback Functions](https://docs.sciml.ai/Optimization/st
This is defined as an array of matrices, where `hess[i]` is the Hessian w.r.t. the `i`th output.
For example, if the Hessian is sparse, then `hess` is a `Vector{SparseMatrixCSC}`.
The default is `nothing`, which means a dense constraint Hessian.
-- `lag_h(res,x,sigma,mu,p)` or `res=lag_h(x,sigma,mu,p)`: the Hessian of the Lagrangian,
+- `lag_h(res,u,sigma,mu,p)` or `res=lag_h(u,sigma,mu,p)`: the Hessian of the Lagrangian,
where `sigma` is a multiplier of the cost function and `mu` are the Lagrange multipliers
multiplying the constraints. This can be provided instead of `hess` and `cons_h`
to solvers that directly use the Hessian of the Lagrangian.
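To make the new argument order concrete, here is a minimal, hypothetical sketch of in-place constraint callbacks written against the convention documented above. The constraints, their hand-written derivatives, and the keyword usage at the end are illustrative only and are not part of this commit.

```julia
# Two hypothetical constraints on u ∈ R²: c1 = u1^2 + u2^2 and c2 = u1 * u2.
cons(res, u, p) = (res[1] = u[1]^2 + u[2]^2; res[2] = u[1] * u[2]; nothing)

# Constraint Jacobian J[i, j] = ∂c_i/∂u_j.
function cons_j(J, u, p)
    J[1, 1] = 2u[1]; J[1, 2] = 2u[2]
    J[2, 1] = u[2];  J[2, 2] = u[1]
    return nothing
end

# Jacobian-vector product Jv = J(u) * v, with the state `u` now passed before `v`.
function cons_jvp(Jv, u, v, p)
    Jv[1] = 2u[1] * v[1] + 2u[2] * v[2]
    Jv[2] = u[2] * v[1] + u[1] * v[2]
    return nothing
end

# Vector-Jacobian product Jv = J(u)' * v, using the same `(Jv, u, v, p)` ordering.
function cons_vjp(Jv, u, v, p)
    Jv[1] = 2u[1] * v[1] + u[2] * v[2]
    Jv[2] = 2u[2] * v[1] + u[1] * v[2]
    return nothing
end
```

These would be attached through the matching keyword arguments, e.g. `OptimizationFunction(f; cons = cons, cons_j = cons_j, cons_jvp = cons_jvp, cons_vjp = cons_vjp)`, with the constraint bounds supplied as `lcons`/`ucons` on the `OptimizationProblem`.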
7 changes: 5 additions & 2 deletions src/solutions/ode_solutions.jl
@@ -605,11 +605,14 @@ function sensitivity_solution(sol::ODESolution, u, t)
return @set sol.interp = interp
end

-struct LazyInterpolationException <: Exception
+struct LazyInterpolationException <: Exception
var::Symbol
end

-Base.showerror(io::IO, e::LazyInterpolationException) = print(io, "The algorithm", e.var, " uses lazy interpolation, which is incompatible with `strip_solution`.")
+function Base.showerror(io::IO, e::LazyInterpolationException)
+    print(io, "The algorithm", e.var,
+        " uses lazy interpolation, which is incompatible with `strip_solution`.")
+end

function strip_solution(sol::ODESolution)
if has_lazy_interpolation(sol.alg)
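As a rough sketch of how this exception is meant to surface: the stand-in algorithm type `FakeLazyAlg` and the `_sketch` helpers below are purely illustrative, while the real guard lives in `strip_solution` and `has_lazy_interpolation`, and `LazyInterpolationException` is the struct shown in the hunk above.

```julia
# Illustrative stand-in for an algorithm that relies on lazy interpolation.
struct FakeLazyAlg end
has_lazy_interpolation_sketch(::Any) = false
has_lazy_interpolation_sketch(::FakeLazyAlg) = true

function strip_solution_sketch(alg)
    if has_lazy_interpolation_sketch(alg)
        # Store the algorithm name as a Symbol so showerror can splice it into the message.
        throw(LazyInterpolationException(Symbol(typeof(alg))))
    end
    return nothing  # the real function would drop the interpolant here
end

# strip_solution_sketch(FakeLazyAlg()) throws a LazyInterpolationException,
# and showerror then prints the lazy-interpolation message defined above.
```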
4 changes: 3 additions & 1 deletion src/solutions/optimization_solutions.jl
@@ -56,7 +56,7 @@ Representation of the solution to a non-linear optimization defined by an Optimi
- `retcode`: the return code from the solver. Used to determine whether the solver solved
successfully or whether it exited due to an error. For more details, see
[the return code documentation](https://docs.sciml.ai/SciMLBase/stable/interfaces/Solutions/#retcodes).
-- `original`: if the solver is wrapped from an alternative solver ecosystem, such as
+- `original`: if the solver is wrapped from an external solver, e.g.
Optim.jl, then this is the original return from said solver library.
- `stats`: statistics of the solver, such as the number of function evaluations required.
"""
@@ -211,6 +211,8 @@ Base.@propagate_inbounds function Base.getproperty(x::AbstractOptimizationSoluti
Base.depwarn("`sol.minimizer` is deprecated. Use `sol.u` instead.",
"sol.minimizer")
return getfield(x, :u)
+    elseif s === :x
+        return getfield(x, :u)
elseif s === :minimum
Base.depwarn("`sol.minimum` is deprecated. Use `sol.objective` instead.",
"sol.minimum")
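For context, a small self-contained sketch of the `getproperty` forwarding pattern this hunk extends, using a hypothetical `ToySolution` type rather than the actual `AbstractOptimizationSolution`:

```julia
# Hypothetical solution type: `u` and `objective` are real fields,
# while `x` and `minimizer` are virtual properties resolved in getproperty.
struct ToySolution{U, O}
    u::U
    objective::O
end

function Base.getproperty(sol::ToySolution, s::Symbol)
    if s === :x
        return getfield(sol, :u)          # alias without a deprecation warning
    elseif s === :minimizer
        Base.depwarn("`sol.minimizer` is deprecated. Use `sol.u` instead.", :minimizer)
        return getfield(sol, :u)
    end
    return getfield(sol, s)               # fall back to the real fields
end

sol = ToySolution([1.0, 2.0], 0.5)
sol.x == sol.u  # true: `x` simply forwards to the stored `u`
```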
