updated readme
brianguenter committed Aug 1, 2024
1 parent 08cd599 commit e60aefa
Showing 1 changed file with 2 additions and 7 deletions.
README.md (9 changes: 2 additions & 7 deletions)
@@ -10,7 +10,7 @@ Unlike forward and reverse mode automatic differentiation **FD** automatically g

For f:ℝⁿ->ℝᵐ with n, m large, **FD** may have better performance than conventional AD algorithms because the **FD** algorithm finds expressions shared between partials and computes them only once. In some cases **FD** derivatives can be as efficient as manually coded derivatives (see the Lagrangian dynamics example in the [D*](https://www.microsoft.com/en-us/research/publication/the-d-symbolic-differentiation-algorithm/) paper or the Benchmarks section of the documentation for another example).

- **FD** may take much less time to compute symbolic derivatives than Symbolics.jl even in the ℝ¹->ℝ¹ case[^b]. The executables generated by **FD** may also be much faster (see the documentation for more details).
+ **FD** may take much less time to compute symbolic derivatives than Symbolics.jl even in the ℝ¹->ℝ¹ case. The executables generated by **FD** may also be much faster (see the documentation for more details).
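For readers new to **FD**, here is a minimal sketch of the workflow those timing comparisons refer to: build symbolic variables, take a symbolic derivative, then compile an executable. It assumes the `@variables`, `jacobian`, and `make_function` entry points described in the **FD** documentation; the example function itself is arbitrary.

```julia
using FastDifferentiation

@variables x y                        # symbolic FD variables

f = [x^2 * y, cos(x * y)]             # an arbitrary ℝ² -> ℝ² example function
jac = jacobian(f, [x, y])             # symbolic 2×2 Jacobian; shared subexpressions are factored out
jac_exe = make_function(jac, [x, y])  # compile the symbolic Jacobian into a Julia function

jac_exe([1.0, 2.0])                   # evaluate the derivative numerically at (x, y) = (1.0, 2.0)
```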

You should consider using FastDifferentiation when you need:
* a fast executable for evaluating the derivative of a function, when the overhead of preprocessing/compilation time is swamped by evaluation time.
@@ -59,8 +59,6 @@ See the documentation for more information on the capabilities and limitations o

If you use **FD** in your work, please share the functions you differentiate with me; I'll add them to the benchmarks. The more functions available to test, the easier it is for others to determine whether **FD** will help with their problem.

- This is **beta** software being modified on a daily basis. Expect bugs and frequent, possibly breaking changes, over the next month or so. Documentation is frequently updated so check the latest docs before filing an issue. Your problem may have been fixed and documented.

## FAQ

**Q**: Does **FD** support complex numbers?
@@ -70,7 +68,7 @@ This is **beta** software being modified on a daily basis. Expect bugs and frequ
**A**: **FD** stores and evaluates the common subexpressions in your function just once. But the print function recursively descends through all expressions in the directed acyclic graph representing your function, including nodes that have already been visited, so the printout can be exponentially larger than the internal **FD** representation.
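A hypothetical way to see this effect is to build an expression that reuses itself at every level: the DAG stays small, but the fully expanded form that printing walks through roughly doubles in size per level.

```julia
using FastDifferentiation

@variables x

# Each step reuses the previous expression twice, so the DAG grows by a few
# nodes per step while the fully expanded (printed) tree roughly doubles.
f = foldl((e, _) -> sin(e) + cos(e), 1:20; init = x)
```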

**Q**: How about matrix and tensor expressions?
- **A**: Evaluation of an **FD** expression returns a graph, not a number. If you multiply a matrix of **FD** variables times a vector of **FD** variables the matrix vector multiplication loop is effectively unrolled into scalar expressions. Matrix operations on large matrices will generate large executables and long preprocessing time. **FD** functions with up 10⁵ operations should still have reasonable preprocessing/compilation times (approximately 1 minute on a modern laptop) and good run time performance.
+ **A**: If you multiply a matrix of **FD** variables by a vector of **FD** variables, the matrix-vector multiplication loop is effectively unrolled into scalar expressions. Matrix operations on large matrices will generate large executables and long preprocessing times. **FD** functions with up to 10⁵ operations should still have reasonable preprocessing/compilation times (approximately 1 minute on a modern laptop) and good run-time performance.
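To make the unrolling concrete, here is a small sketch; the variable names and sizes are purely illustrative.

```julia
using FastDifferentiation

@variables a11 a12 a21 a22 x1 x2   # illustrative names for a 2×2 matrix and a length-2 vector

A = [a11 a12; a21 a22]
v = [x1, x2]

# Ordinary Julia matrix-vector multiplication: the loop runs while the expression
# is being built, so the result is a vector of scalar FD expressions,
# roughly [a11*x1 + a12*x2, a21*x1 + a22*x2].
y = A * v
```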

**Q**: Does **FD** support conditionals?
**A**: **FD** does not yet support conditionals that involve the variables you are differentiating with respect to. You can do this:
@@ -94,7 +92,6 @@ f (generic function with 2 methods)
julia> f(x,y)
ERROR: MethodError: no method matching isless(::FastDifferentiation.Node{Symbol, 0}, ::FastDifferentiation.Node{Symbol, 0})
```
- This is actively being worked on. I hope to have experimental support for conditionals soon.
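For contrast, a conditional whose predicate involves only ordinary Julia values (not **FD** variables) is resolved while the expression is being built, so it works. A hypothetical sketch:

```julia
using FastDifferentiation

@variables x

# The branch is chosen on a plain Bool before FD ever has to compare Nodes,
# so only the selected branch ends up in the symbolic expression.
g(a, use_cos::Bool) = use_cos ? cos(a) : sin(a)

g(x, true)    # builds cos(x)
g(x, false)   # builds sin(x)
```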

# Release Notes
<details>
@@ -143,7 +140,5 @@ This argument is only active if the `in_place` argument is true.
</details>


- [^b]: I am working with the SciML team to see if it is possible to integrate **FD** differentiation directly into Symbolics.jl.

[^a]: See the [D* ](https://www.microsoft.com/en-us/research/publication/the-d-symbolic-differentiation-algorithm/) paper for an explanation of derivative graph factorization.
