It is worth investigating each package yourself to really understand its ins and outs.

### Reverse mode automatic differentiation
- [JuliaDiff/ReverseDiff.jl](https://github.com/JuliaDiff/ReverseDiff.jl): Operator overloading AD backend
- [FluxML/Zygote.jl](https://github.com/FluxML/Zygote.jl): Source transformation AD backend
- [EnzymeAD/Enzyme.jl](https://github.com/EnzymeAD/Enzyme.jl): LLVM-level source transformation AD backend
- [FluxML/Tracker.jl](https://github.com/FluxML/Tracker.jl): Operator overloading AD backend
- [compintell/Tapir.jl](https://github.com/compintell/Tapir.jl): Source transformation AD backend (experimental)
- [dfdx/Yota.jl](https://github.com/dfdx/Yota.jl): Source transformation AD backend
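To make the reverse-mode entries above concrete, here is a minimal sketch using Zygote.jl (assuming the package is installed; `f` is a made-up toy function, not from any of these packages):

```julia
using Zygote  # reverse-mode AD via source transformation

# Toy scalar-valued function of a vector: sum of squares
f(x) = sum(abs2, x)

x = [1.0, 2.0, 3.0]
(g,) = Zygote.gradient(f, x)  # gradient of sum(abs2, x) is 2x
# g == [2.0, 4.0, 6.0]
```

Several of the backends listed here expose a similar `gradient(f, x)` entry point and differ mainly in how they trace or transform `f`; others (notably Enzyme.jl) use a different calling convention, so check each package's documentation.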
### Forward mode automatic differentiation
- [JuliaDiff/TaylorDiff.jl](https://github.com/JuliaDiff/TaylorDiff.jl): Higher order directional derivatives (experimental)
- [JuliaDiff/Diffractor.jl](https://github.com/JuliaDiff/Diffractor.jl): Source transformation AD backend (experimental)

### Interfaces

- [gdalle/DifferentiationInterface.jl](https://github.com/gdalle/DifferentiationInterface.jl): Generic interface for first- and second-order differentiation with any AD backend on 1-argument functions (`f(x) = y` or `f!(y, x)`).
- [JuliaDiff/AbstractDifferentiation.jl](https://github.com/JuliaDiff/AbstractDifferentiation.jl): Generic interface for first- and second-order differentiation with a subset of AD backends on functions with more than one argument (will soon wrap DifferentiationInterface.jl).
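As a rough sketch of what such an interface buys you, the following uses DifferentiationInterface.jl with ForwardDiff.jl as the backend (package names from this list; the exact API may differ between versions):

```julia
using DifferentiationInterface  # generic first- and second-order API
import ForwardDiff              # the backend package must be loaded separately

f(x) = sum(abs2, x)             # 1-argument function, as the interface requires

backend = AutoForwardDiff()     # backend selector (from ADTypes.jl)
x = [1.0, 2.0, 3.0]

gradient(f, backend, x)         # swap `backend` to switch AD packages
```

The point of the design is that `f` and the calling code stay unchanged when you swap `AutoForwardDiff()` for another loaded backend.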
### Rulesets
These packages define derivatives for basic functions, and enable users to do the same:
- [EnzymeAD/EnzymeRules.jl](https://enzymead.github.io/Enzyme.jl/stable/generated/custom_rule/): Rule definition API for Enzyme.jl
- [FluxML/ZygoteRules.jl](https://github.com/FluxML/ZygoteRules.jl): Some rules used by Zygote.jl (mostly deprecated in favor of ChainRules.jl)
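The kind of rule these packages let users define can be sketched with the `rrule` mechanism of ChainRulesCore.jl, the rule-definition half of the ChainRules.jl ecosystem (`mysquare` is a hypothetical example function):

```julia
using ChainRulesCore

mysquare(x::Real) = x^2  # hypothetical function we want a custom rule for

# A reverse rule returns the primal output plus a pullback closure
function ChainRulesCore.rrule(::typeof(mysquare), x::Real)
    y = mysquare(x)
    mysquare_pullback(ȳ) = (NoTangent(), 2x * ȳ)  # d(x^2)/dx = 2x
    return y, mysquare_pullback
end
```

Any backend that consumes ChainRules rules can then differentiate through `mysquare` using this rule instead of tracing its body.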

### Sparsity

- [JuliaDiff/SparseDiffTools.jl](https://github.com/JuliaDiff/SparseDiffTools.jl): Exploit sparsity to speed up FiniteDiff.jl and ForwardDiff.jl, as well as other algorithms.
- [adrhill/SparseConnectivityTracer.jl](https://github.com/adrhill/SparseConnectivityTracer.jl): Sparsity pattern detection for Jacobians and Hessians.
- [gdalle/SparseMatrixColorings.jl](https://github.com/gdalle/SparseMatrixColorings.jl): Efficient coloring and decompression algorithms for sparse Jacobians and Hessians.
### Differentiating through more stuff
Some complex algorithms are not natively differentiable, which is why derivatives have to be defined for them explicitly:
- [gdalle/ImplicitDifferentiation.jl](https://github.com/gdalle/ImplicitDifferentiation.jl): For generic algorithms specified by output conditions, thanks to the implicit function theorem
- [jump-dev/DiffOpt.jl](https://github.com/jump-dev/DiffOpt.jl): For convex optimization problems
- [axelparmentier/InferOpt.jl](https://github.com/axelparmentier/InferOpt.jl): For combinatorial optimization problems
- [gaurav-arya/StochasticAD.jl](https://github.com/gaurav-arya/StochasticAD.jl): Differentiation of functions with stochastic behavior (experimental)