
Commit ed74ca0

Update list (#29)
1 parent 89e625a commit ed74ca0

File tree: 1 file changed (+11 -7 lines changed)


index.md (+11 -7)
@@ -46,8 +46,9 @@ It is worth investigating each package yourself to really understand its ins and
 - [JuliaDiff/ReverseDiff.jl](https://github.com/JuliaDiff/ReverseDiff.jl): Operator overloading AD backend
 - [FluxML/Zygote.jl](https://github.com/FluxML/Zygote.jl): Source transformation AD backend
 - [EnzymeAD/Enzyme.jl](https://github.com/EnzymeAD/Enzyme.jl): LLVM-level source transformation AD backend
+- [FluxML/Tracker.jl](https://github.com/FluxML/Tracker.jl): Operator overloading AD backend
+- [compintell/Tapir.jl](https://github.com/compintell/Tapir.jl): Source transformation AD backend (experimental)
 - [dfdx/Yota.jl](https://github.com/dfdx/Yota.jl): Source transformation AD backend
-- [FluxML/Tracker.jl](https://github.com/FluxML/Tracker.jl): Operator overloading AD backend (mostly deprecated in favor of Zygote.jl)
 
 ### Forward mode automatic differentiation
 
@@ -72,6 +73,11 @@ It is worth investigating each package yourself to really understand its ins and
 - [JuliaDiff/TaylorDiff.jl](https://github.com/JuliaDiff/TaylorDiff.jl): Higher order directional derivatives (experimental)
 - [JuliaDiff/Diffractor.jl](https://github.com/JuliaDiff/Diffractor.jl): Source transformation AD backend (experimental)
 
+### Interfaces
+
+- [gdalle/DifferentiationInterface.jl](https://github.com/gdalle/DifferentiationInterface.jl): Generic interface for first- and second-order differentiation with any AD backend on 1-argument functions (`f(x) = y` or `f!(y, x)`).
+- [JuliaDiff/AbstractDifferentiation.jl](https://github.com/JuliaDiff/AbstractDifferentiation.jl): Generic interface for first- and second-order differentiation with a subset of AD backends on functions with more than one argument (will soon wrap DifferentiationInterface.jl).
+
 ### Rulesets
 
 These packages define derivatives for basic functions, and enable users to do the same:
@@ -85,14 +91,11 @@ These packages define derivatives for basic functions, and enable users to do the same:
 - [EnzymeAD/EnzymeRules.jl](https://enzymead.github.io/Enzyme.jl/stable/generated/custom_rule/): Rule definition API for Enzyme.jl
 - [FluxML/ZygoteRules.jl](https://github.com/FluxML/ZygoteRules.jl): Some rules used by Zygote.jl (mostly deprecated in favor of ChainRules.jl).
 
-### Interface
-
-- [AbstractDifferentiation.jl](https://github.com/JuliaDiff/AbstractDifferentiation.jl): Backend-agnostic interface for algorithms that rely on derivatives, gradients, Jacobians, Hessians, etc.
-
-### Exotic
+### Sparsity
 
 - [JuliaDiff/SparseDiffTools.jl](https://github.com/JuliaDiff/SparseDiffTools.jl): Exploit sparsity to speed up FiniteDiff.jl and ForwardDiff.jl, as well as other algorithms.
-- [gaurav-arya/StochasticAD.jl](https://github.com/gaurav-arya/StochasticAD.jl): Differentiation of functions with stochastic behavior (experimental)
+- [adrhill/SparseConnectivityTracer.jl](https://github.com/adrhill/SparseConnectivityTracer.jl): Sparsity pattern detection for Jacobians and Hessians.
+- [gdalle/SparseMatrixColorings.jl](https://github.com/gdalle/SparseMatrixColorings.jl): Efficient coloring and decompression algorithms for sparse Jacobians and Hessians.
 
 ### Differentiating through more stuff
 
@@ -102,6 +105,7 @@ Some complex algorithms are not natively differentiable, which is why derivatives must be added by hand:
 - [gdalle/ImplicitDifferentiation.jl](https://github.com/gdalle/ImplicitDifferentiation.jl): For generic algorithms specified by output conditions, thanks to the implicit function theorem
 - [jump-dev/DiffOpt.jl](https://github.com/jump-dev/DiffOpt.jl): For convex optimization problems
 - [axelparmentier/InferOpt.jl](https://github.com/axelparmentier/InferOpt.jl): For combinatorial optimization problems
+- [gaurav-arya/StochasticAD.jl](https://github.com/gaurav-arya/StochasticAD.jl): Differentiation of functions with stochastic behavior (experimental)
 
 ### Inactive packages
 