[FEATURE] Options for vector-valued functions #363
Hi there,

The quoted admonition is removed in the development version of the package, which will be released relatively soon. The current version uses its own nonlinear expression system, but the new version adopts the nonlinear modelling interface provided by JuMP.

In the longer term, adding array-valued nonlinear expression support is on the JuMP development roadmap (see jump-dev/MathOptInterface.jl#2402 and https://jump.dev/JuMP.jl/stable/developers/roadmap/#Development-roadmap). Once this is supported by JuMP, InfiniteOpt will automatically inherit this ability as well.

For your model, I strongly suspect it will be much more performant to define the ODE model directly in InfiniteOpt (especially with the experimental InfiniteExaModels backend) and not use MTK. However, you can also try the aforementioned workaround of embedding the MTK function as a user-defined function, in which case I suggest trying it with the development version of InfiniteOpt.
I have tried this workaround without much success.

Ditching MTK is not an option for me in this case, for two reasons:
I have spent the day implementing a primitive trajectory-optimization package that performs the direct-collocation transcription, and have obtained reasonable performance, except for computing the Hessian of the Lagrangian, which is exactly where JuMP is struggling as well. Sparse forward-over-reverse AD does best, but not quite well enough. I have tried MadDiff.jl and FastDifferentiation.jl before, and they are blazingly fast for the operations they support, but I've found that support to be too limited to be workable for generic MTK models. I will give the InfiniteExaModels package a quick spin, though, to see if it improves upon the JuMP AD in this case :)
Thanks for the clarification on your use case. This certainly provides a compelling argument for the proposed array-valued AD system in JuMP. Coming up with an efficient approach to compute the Hessian for these types of problems is certainly a challenge, and one of the reasons the JuMP milestone is taking some time.

Out of curiosity, did the memoize approach fail using the master branch? If so, can you provide some more details on the problem that caused the failure? This would be helpful info to consider going forward with the planned JuMP developments.

Depending on the types of operations and the length of the traced expressions, InfiniteExaModels may or may not work well. It is really intended for models that define the ODEs/PDEs directly in InfiniteOpt. Perhaps writing a bridge from MTK directly to an InfiniteModel would be better than generating the evaluation function, but of course this would require some development. I have considered such a bridge in the past, but I don't know MTK that well and I have limited development time these days.
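For reference, the memoize approach discussed here follows the pattern from the JuMP documentation for user-defined operators with vector outputs: each scalar output is registered as its own operator, and a cache ensures the underlying function is evaluated once per point. A minimal sketch (the function `f` below is a hypothetical two-output stand-in, not the MTK-generated dynamics from this thread; `Ipopt` is just an example solver):

```julia
using JuMP, Ipopt

# Wrap a vector-valued function so each scalar output can be registered
# as its own operator; cache the last evaluation point so `foo` runs once
# per point instead of once per output. Separate caches are kept for
# Float64 calls and dual-number calls made during differentiation.
function memoize(foo::Function, n_outputs::Int)
    last_x, last_f = nothing, nothing
    last_dx, last_dfdx = nothing, nothing
    function foo_i(i, x::T...) where {T<:Real}
        if T == Float64
            if x !== last_x
                last_x, last_f = x, foo(x...)
            end
            return last_f[i]::T
        else
            if x !== last_dx
                last_dx, last_dfdx = x, foo(x...)
            end
            return last_dfdx[i]::T
        end
    end
    return [(x...) -> foo_i(i, x...) for i in 1:n_outputs]
end

# Hypothetical two-output function standing in for the real dynamics:
f(x, y) = [x^2 + y^2, (x - 1)^2 + y^2]
memoized_f = memoize(f, 2)

model = Model(Ipopt.Optimizer)
@variable(model, x[1:2] >= 0)
@operator(model, op_f1, 2, memoized_f[1])
@operator(model, op_f2, 2, memoized_f[2])
@objective(model, Min, op_f1(x...))
@constraint(model, op_f2(x...) <= 2)
```

The caveat raised in this thread applies: each output is still a scalar operator from JuMP's point of view, so derivative sparsity across outputs is not exploited.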
No, it was with the latest release. I think the problem was related to the vast number of methods that were registered by the
Actually it does. Now the
Hello! I am in a very similar boat to @baggepinnen: I have an MTK model sprinkled with some vector-in, vector-out functions that would probably work best behind an

@baggepinnen, did you have any luck with using

Thanks to you both!
I did not manage to make use of the
Hello there 👋 I'm writing regarding the following admonition in the docs :)
I am solving optimal-control problems where the dynamics are encoded in the form of a ModelingToolkit model. ModelingToolkit can generate very efficient code for the right-hand side of the dynamics, i.e., I can obtain an executable function $f$ in
where $x$ and $u$ are potentially high-dimensional (length(x) == 10 in the particular example I'm working on right now).
Tracing through $f$ with JuMP variables (infinite variables) "works", with plenty of hacks, but the result is very slow (the traced expression for $\dot x$ is too large to even print to the REPL without crashing Julia, while evaluating the code generated by MTK takes only 1 µs). The issue is that the code emitted by MTK contains tons of common subexpressions, inlined solutions of linear systems, etc. ($f$ is the dynamics of a high-dimensional multibody system); this is all handled very efficiently in $f$, where memory for solving the linear systems is manually allocated and freed, making $f$ free of GC allocations.
To avoid explosive expression growth when tracing through $f$ with JuMP variables, I replace the linear-system solves `x = A\b` that appear in $f$, i.e., I introduce temporary variables `x` and equality constraints `A*x == b`. As I said, this works, but memory requirements are sky high, and I believe it should be possible to improve the performance by at least 1000x over what this gives me.

Coming back to the admonition, I have been thinking about different ways to work around the limitations on vector-valued nonlinear functions, but haven't found any alternative that works for me. What approach did you have in mind with
?
I am willing to jump through quite a few hoops to make this efficient if required :) Thanks for your time!
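For concreteness, the lifting trick described above (replacing a symbolic solve `x = A\b` with new decision variables and linear equality constraints) can be sketched in JuMP as follows. Here `A` and `b` are hypothetical constant data purely for illustration; in the actual use case they would be expressions of the decision variables:

```julia
using JuMP, LinearAlgebra

model = Model()
n = 3
A = Matrix(3.0I, n, n) .+ 0.1  # hypothetical well-conditioned system matrix
b = ones(n)                     # hypothetical right-hand side

# Instead of forming x = A \ b symbolically (which inlines the whole
# solve and blows up the traced expression), introduce x as decision
# variables and impose the solve as equality constraints:
@variable(model, x[1:n])
@constraint(model, lifted_solve, A * x .== b)
```

The solver then enforces `A*x == b` at the solution, at the cost of `n` extra variables and constraints per lifted solve, which is the memory/size trade-off described above.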