Support for AbstractDifferentiation.jl / DifferentiationInterface.jl #37
AbstractDifferentiation.jl is an interface that makes it easier to call various autodiff backends with the same syntax. I think it would be nice to add bindings as an extension to FastDifferentiation.jl. I can even give it a shot if you agree.

Comments
I'm not going to say no to volunteer labor! I looked at the docs for AbstractDifferentiation.jl. It shouldn't be hard to implement, but making it efficient will be tricky because this interface seems to be designed for code interpreters, whereas FD is fundamentally a compiler. A naive implementation would have FD generating and factoring the derivative graph and compiling the runtime generated function at each call to jacobian, hessian, etc. This would be ludicrously slow. Also, there is a single global IdDict expression cache which is not thread safe, so you can't multithread calls to jacobian, hessian, etc. This is a documented limitation of FD (it's the first sentence on the Examples doc page), but maybe it needs to be more prominent. Someone using FD only through the interface might never see that warning.
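For illustration, here is a minimal sketch of what that naive per-call wrapper could look like. The wrapper name is hypothetical, and the example assumes FD's documented make_variables / jacobian / make_function calls:

```julia
using FastDifferentiation

# Hypothetical naive wrapper: rebuilds, factors, and recompiles the symbolic
# Jacobian on every single call, which is exactly the pattern that would be
# ludicrously slow.
function naive_jacobian(f, x::AbstractVector)
    vars = make_variables(:x, length(x))         # fresh symbolic variables every call
    symbolic_jac = jacobian(f(vars), vars)       # graph construction + factoring every call
    jac_exe = make_function(symbolic_jac, vars)  # code generation + compilation every call
    return jac_exe(x)
end

naive_jacobian(v -> [v[1]^2 + v[2], v[1] * v[2]], [1.0, 2.0])
```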
My assumption in writing FD was the typical workflow would be something like:
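(A minimal sketch of that workflow, using FD's documented @variables / jacobian / make_function calls; the specific function is just an example.)

```julia
using FastDifferentiation

# 1. Build the symbolic expression graph once.
@variables x y
f = [x^2 + y, sin(x * y)]

# 2. Differentiate and compile once; this is the expensive step.
symbolic_jac = jacobian(f, [x, y])
jac_exe = make_function(symbolic_jac, [x, y])

# 3. Call the compiled derivative many times, e.g. inside an optimization loop.
jac_exe([1.0, 2.0])
jac_exe([0.5, -1.0])
```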
This will only mesh well with the interface if the generated functions are cached, but IdDicts are not thread safe in Julia. There is ThreadedDicts.jl, but I'm not sure what its performance would be like (probably slow), and I'm also not sure it supports IdDicts. If you can make the caching efficient for multithreaded code then I'm for it. Without caching I'd rather not support the interface; FD would be unusably slow when accessed through it. I believe Enzyme has this same problem: they can't be recompiling derivative functions every time they are called, so there must be a cache. Maybe they figured out how to make this efficient and we could mimic them. I've pinged Billy Moses to see if they've got a solution they can share. I believe the pushforward is equivalent to the Jv function and the pullback to J'v (see the sketch below). Give it a shot and don't hesitate to ping me if you have questions.
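A minimal sketch of that Jv / J'v correspondence, assuming FD's jacobian_times_v and jacobian_transpose_v helpers behave as documented; the example function is made up:

```julia
using FastDifferentiation

@variables x1 x2
f = [x1^2 * x2, sin(x1 * x2)]

# Pushforward, i.e. the Jacobian-vector product Jv.
jv, v = jacobian_times_v(f, [x1, x2])       # symbolic Jv plus the new v variables
jv_exe = make_function(jv, [[x1, x2]; v])
jv_exe([1.0, 2.0, 1.0, 0.0])                # J at (1, 2) times the direction [1, 0]

# Pullback, i.e. the transposed product J'v.
jtv, r = jacobian_transpose_v(f, [x1, x2])  # symbolic J'v plus the new v variables
jtv_exe = make_function(jtv, [[x1, x2]; r])
jtv_exe([1.0, 2.0, 1.0, 0.0])               # J' at (1, 2) times the cotangent [1, 0]
```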
Okay, that makes plenty of sense. It looks a bit high-risk, low-reward for now.
Should we close this issue then? Or should I move it to Discussions, where it would be easier to find when we decide to do this in the future? By the way, thank you for fixing the docstrings. That was a lot of work.
A discussion sounds good! I have high hopes for AbstractDifferentiation, but it needs more love anyway.
And you're welcome for the docstrings. I was mostly scratching an itch ^^
@brianguenter I'm happy to announce that FastDifferentiation.jl is now supported by my new package https://github.com/gdalle/DifferentiationInterface.jl. You can check out the implementation at https://github.com/gdalle/DifferentiationInterface.jl/blob/main/ext/DifferentiationInterfaceFastDifferentiationExt/allocating.jl. There is still some performance to be gained, but most of it is there.
In particular, the package is designed around a two-step "prepare, then differentiate" paradigm. In this setup, it makes a lot of sense to generate the runtime functions during "prepare" and use them during "differentiate", so that's what I did.
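A minimal usage sketch of that paradigm, assuming a recent DifferentiationInterface release (the argument order of prepare_jacobian / jacobian has shifted across versions, and AutoFastDifferentiation is the ADTypes backend struct):

```julia
using DifferentiationInterface
import FastDifferentiation  # loads DI's FastDifferentiation extension

f(x) = [x[1]^2 + x[2], sin(x[1] * x[2])]
x = [1.0, 2.0]
backend = AutoFastDifferentiation()  # ADTypes backend, re-exported by DI

# "prepare": FD builds, factors, and compiles the symbolic Jacobian once.
prep = prepare_jacobian(f, backend, x)

# "differentiate": each call reuses the compiled runtime-generated function.
J = jacobian(f, prep, backend, x)
```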
That's great, always happy when somebody uses my code.
Not only that, but I think it might make it easier for more people to use your code, because now it is as simple as switching the backend in DifferentiationInterface. Side note: did you change your username?
I have separate GitHub accounts for personal and professional use. Accidentally responded to your message while I was logged in on my personal account.
I think we can close this issue, since FastDifferentiation.jl can now be used through DifferentiationInterface.jl.