
Commit 3c5e8cb

Merge pull request #148 from SciML/revert-147-format
Revert "new format"
2 parents 4b8c521 + 3d91d17 · commit 3c5e8cb

26 files changed: +846 −1364 lines

README.md: +22 −30

@@ -1,33 +1,26 @@
 # LinearSolve.jl
 
-[![Join the chat at https://julialang.zulipchat.com #sciml-bridged](https://img.shields.io/static/v1?label=Zulip&message=chat&color=9558b2&labelColor=389826)](https://julialang.zulipchat.com/#narrow/stream/279055-sciml-bridged)
-[![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](http://linearsolve.sciml.ai/stable/)
-[![Global Docs](https://img.shields.io/badge/docs-SciML-blue.svg)](https://docs.sciml.ai/dev/modules/LinearSolve/)
-
-[![codecov](https://codecov.io/gh/SciML/LinearSolve.jl/branch/master/graph/badge.svg?token=FwXaKBNW67)](https://codecov.io/gh/SciML/LinearSolve.jl)
-[![Build Status](https://github.com/SciML/LinearSolve.jl/workflows/CI/badge.svg)](https://github.com/SciML/LinearSolve.jl/actions?query=workflow%3ACI)
-[![Build status](https://badge.buildkite.com/e0ee4d9d914eb44a43c291d78c53047eeff95e7edb7881b6f7.svg)](https://buildkite.com/julialang/linearsolve-dot-jl)
-
-[![ColPrac: Contributor's Guide on Collaborative Practices for Community Packages](https://img.shields.io/badge/ColPrac-Contributor's%20Guide-blueviolet)](https://github.com/SciML/ColPrac)
-[![SciML Code Style](https://img.shields.io/static/v1?label=code%20style&message=SciML&color=9558b2&labelColor=389826)](https://github.com/SciML/SciMLStyle)
-[![Package Downloads](https://shields.io/endpoint?url=https://pkgs.genieframework.com/api/v1/badge/DiffEqSensitivity)](https://pkgs.genieframework.com?packages=LinearSolve)
+[![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](http://linearsolve.sciml.ai/stable)
+[![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](http://linearsolve.sciml.ai/dev)
+[![Build Status](https://github.com/SciML/LinearSolvers.jl/workflows/CI/badge.svg)](https://github.com/SciML/LinearSolvers.jl/actions)
+[![Coverage](https://codecov.io/gh/SciML/LinearSolvers.jl/branch/master/graph/badge.svg)](https://codecov.io/gh/SciML/LinearSolvers.jl)
 
 Fast implementations of linear solving algorithms in Julia that satisfy the SciML
 common interface. LinearSolve.jl makes it easy to define high level algorithms
 which allow for swapping out the linear solver that is used while maintaining
 maximum efficiency. Specifically, LinearSolve.jl includes:
 
-  - Fast pure Julia LU factorizations which outperform standard BLAS
-  - KLU for faster sparse LU factorization on unstructured matrices
-  - UMFPACK for faster sparse LU factorization on matrices with some repeated structure
-  - MKLPardiso wrappers for handling many sparse matrices faster than SuiteSparse (KLU, UMFPACK) methods
-  - GPU-offloading for large dense matrices
-  - Wrappers to all of the Krylov implementations (Krylov.jl, IterativeSolvers.jl, KrylovKit.jl) for easy
-    testing of all of them. LinearSolve.jl handles the API differences, especially with the preconditioner
-    definitions
-  - A polyalgorithm that smartly chooses between these methods
-  - A caching interface which automates caching of symbolic factorizations and numerical factorizations
-    as optimally as possible
+- Fast pure Julia LU factorizations which outperform standard BLAS
+- KLU for faster sparse LU factorization on unstructured matrices
+- UMFPACK for faster sparse LU factorization on matrices with some repeated structure
+- MKLPardiso wrappers for handling many sparse matrices faster than SuiteSparse (KLU, UMFPACK) methods
+- GPU-offloading for large dense matrices
+- Wrappers to all of the Krylov implementations (Krylov.jl, IterativeSolvers.jl, KrylovKit.jl) for easy
+  testing of all of them. LinearSolve.jl handles the API differences, especially with the preconditioner
+  definitions
+- A polyalgorithm that smartly chooses between these methods
+- A caching interface which automates caching of symbolic factorizations and numerical factorizations
+  as optimally as possible
 
 For information on using the package,
 [see the stable documentation](https://linearsolve.sciml.ai/stable/). Use the

@@ -38,9 +31,8 @@ the documentation which contains the unreleased features.
 
 ```julia
 n = 4
-A = rand(n, n)
-b1 = rand(n);
-b2 = rand(n);
+A = rand(n,n)
+b1 = rand(n); b2 = rand(n)
 prob = LinearProblem(A, b1)
 
 linsolve = init(prob)

@@ -55,7 +47,7 @@ sol1.u
 1.8385599677530706
 =#
 
-linsolve = LinearSolve.set_b(linsolve, b2)
+linsolve = LinearSolve.set_b(linsolve,b2)
 sol2 = solve(linsolve)
 
 sol2.u

@@ -67,8 +59,8 @@ sol2.u
 -0.4998342686003478
 =#
 
-linsolve = LinearSolve.set_b(linsolve, b2)
-sol2 = solve(linsolve, IterativeSolversJL_GMRES()) # Switch to GMRES
+linsolve = LinearSolve.set_b(linsolve,b2)
+sol2 = solve(linsolve,IterativeSolversJL_GMRES()) # Switch to GMRES
 sol2.u
 #=
 4-element Vector{Float64}:

@@ -78,8 +70,8 @@ sol2.u
 -0.4998342686003478
 =#
 
-A2 = rand(n, n)
-linsolve = LinearSolve.set_A(linsolve, A2)
+A2 = rand(n,n)
+linsolve = LinearSolve.set_A(linsolve,A2)
 sol3 = solve(linsolve)
 
 sol3.u
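
The README snippet above (in both the old and the reverted formatting) is a fragment and omits the package import. A minimal self-contained sketch of the same flow, assuming the caching API of this era (`LinearSolve.set_b`, `LinearSolve.set_A`) exactly as it appears in the diff:

```julia
using LinearSolve

n = 4
A = rand(n, n)
b1 = rand(n); b2 = rand(n)

prob = LinearProblem(A, b1)
linsolve = init(prob)
sol1 = solve(linsolve)                               # solves A * x = b1

linsolve = LinearSolve.set_b(linsolve, b2)           # same A, new right-hand side
sol2 = solve(linsolve, IterativeSolversJL_GMRES())   # or switch to GMRES

A2 = rand(n, n)
linsolve = LinearSolve.set_A(linsolve, A2)           # new operator
sol3 = solve(linsolve)
```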

docs/make.jl: +13 −21

@@ -1,30 +1,22 @@
 using LinearSolve
 using Documenter
 
-DocMeta.setdocmeta!(LinearSolve, :DocTestSetup, :(using LinearSolve); recursive = true)
+DocMeta.setdocmeta!(LinearSolve, :DocTestSetup, :(using LinearSolve); recursive=true)
 
 include("pages.jl")
 
 makedocs(
-    sitename = "LinearSolve.jl",
-    authors = "Chris Rackauckas",
-    modules = [LinearSolve, LinearSolve.SciMLBase],
-    clean = true,
-    doctest = false,
-    strict = [
-        :doctest,
-        :linkcheck,
-        :parse_error,
-        :example_block,
-        # Other available options are
-        # :autodocs_block, :cross_references, :docs_block, :eval_block, :example_block, :footnote, :meta_block, :missing_docs, :setup_block
-    ],
-    format = Documenter.HTML(
-        analytics = "UA-90474609-3",
-        assets = ["assets/favicon.ico"],
-        canonical = "https://linearsolve.sciml.ai/stable/",
-    ),
-    pages = pages,
+    sitename="LinearSolve.jl",
+    authors="Chris Rackauckas",
+    modules=[LinearSolve,LinearSolve.SciMLBase],
+    clean=true,doctest=false,
+    format = Documenter.HTML(analytics = "UA-90474609-3",
+                             assets = ["assets/favicon.ico"],
+                             canonical="https://linearsolve.sciml.ai/stable/"),
+    pages=pages
 )
 
-deploydocs(; repo = "github.com/SciML/LinearSolve.jl", devbranch = "main")
+deploydocs(;
+    repo="github.com/SciML/LinearSolve.jl",
+    devbranch="main",
+)

docs/pages.jl: +9 −8

@@ -1,22 +1,23 @@
 # Put in a separate page so it can be used by SciMLDocs.jl
 
-pages = [
+pages=[
     "Home" => "index.md",
     "Tutorials" => Any[
-        "tutorials/linear.md",
-        "tutorials/caching_interface.md",
-        "tutorials/efficient_large_systems.md",
+        "tutorials/linear.md"
+        "tutorials/caching_interface.md"
     ],
     "Basics" => Any[
         "basics/LinearProblem.md",
         "basics/common_solver_opts.md",
         "basics/CachingAPI.md",
         "basics/Preconditioners.md",
-        "basics/FAQ.md",
+        "basics/FAQ.md"
+    ],
+    "Solvers" => Any[
+        "solvers/solvers.md"
     ],
-    "Solvers" => Any["solvers/solvers.md"],
     "Advanced" => Any[
         "advanced/developing.md"
         "advanced/custom.md"
-    ],
-]
+    ]
+]

docs/src/advanced/custom.md: +15 −20

@@ -1,17 +1,15 @@
 # Passing in a Custom Linear Solver
-
 Julia users are building a wide variety of applications in the SciML ecosystem,
 often requiring problem-specific handling of their linear solves. As existing solvers in `LinearSolve.jl` may not
 be optimally suited for novel applications, it is essential for the linear solve
 interface to be easily extendable by users. To that end, the linear solve algorithm
 `LinearSolveFunction()` accepts a user-defined function for handling the solve. A
 user can pass in their custom linear solve function, say `my_linsolve`, to
 `LinearSolveFunction()`. A contrived example of solving a linear system with a custom solver is below.
-
 ```julia
 using LinearSolve, LinearAlgebra
 
-function my_linsolve(A, b, u, p, newA, Pl, Pr, solverdata; verbose = true, kwargs...)
+function my_linsolve(A,b,u,p,newA,Pl,Pr,solverdata;verbose=true, kwargs...)
     if verbose == true
         println("solving Ax=b")
     end

@@ -20,35 +18,32 @@ function my_linsolve(A, b, u, p, newA, Pl, Pr, solverdata; verbose = true, kwarg
 end
 
 prob = LinearProblem(Diagonal(rand(4)), rand(4))
-alg = LinearSolveFunction(my_linsolve)
-sol = solve(prob, alg)
+alg = LinearSolveFunction(my_linsolve)
+sol = solve(prob, alg)
 ```
-
 The inputs to the function are as follows:
-
-  - `A`, the linear operator
-  - `b`, the right-hand-side
-  - `u`, the solution initialized as `zero(b)`,
-  - `p`, a set of parameters
-  - `newA`, a `Bool` which is `true` if `A` has been modified since last solve
-  - `Pl`, left-preconditioner
-  - `Pr`, right-preconditioner
-  - `solverdata`, solver cache set to `nothing` if solver hasn't been initialized
-  - `kwargs`, standard SciML keyword arguments such as `verbose`, `maxiters`, `abstol`, `reltol`
+- `A`, the linear operator
+- `b`, the right-hand-side
+- `u`, the solution initialized as `zero(b)`,
+- `p`, a set of parameters
+- `newA`, a `Bool` which is `true` if `A` has been modified since last solve
+- `Pl`, left-preconditioner
+- `Pr`, right-preconditioner
+- `solverdata`, solver cache set to `nothing` if solver hasn't been initialized
+- `kwargs`, standard SciML keyword arguments such as `verbose`, `maxiters`, `abstol`, `reltol`
 
 The function `my_linsolve` must accept the above specified arguments, and return
 the solution, `u`. As memory for `u` is already allocated, the user may choose
 to modify `u` in place as follows:
-
 ```julia
-function my_linsolve!(A, b, u, p, newA, Pl, Pr, solverdata; verbose = true, kwargs...)
+function my_linsolve!(A,b,u,p,newA,Pl,Pr,solverdata;verbose=true, kwargs...)
     if verbose == true
         println("solving Ax=b")
     end
     u .= A \ b # in place
     return u
 end
 
-alg = LinearSolveFunction(my_linsolve!)
-sol = solve(prob, alg)
+alg = LinearSolveFunction(my_linsolve!)
+sol = solve(prob, alg)
 ```
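
Not part of the commit, but as a quick sanity check of the `LinearSolveFunction` path documented in this file, the custom solver should agree with a direct solve (hypothetical snippet; `my_check_linsolve` is an illustrative name):

```julia
using LinearSolve, LinearAlgebra

# Hypothetical check, not in the diff: the custom function just forwards to `\`,
# so the returned solution should match the direct solve.
my_check_linsolve(A, b, u, p, newA, Pl, Pr, solverdata; kwargs...) = A \ b

A = Diagonal(rand(4))
b = rand(4)
prob = LinearProblem(A, b)
sol = solve(prob, LinearSolveFunction(my_check_linsolve))
@assert sol.u ≈ A \ b
```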

docs/src/advanced/developing.md: +60 −61

@@ -1,61 +1,60 @@
-# Developing New Linear Solvers
-
-Developing new or custom linear solvers for the SciML interface can be done in
-one of two ways:
-
-1. You can either create a completely new set of dispatches for `init` and `solve`.
-2. You can extend LinearSolve.jl's internal mechanisms.
-
-For developer ease, we highly recommend (2) as that will automatically make the
-caching API work. Thus this is the documentation for how to do that.
-
-## Developing New Linear Solvers with LinearSolve.jl Primitives
-
-Let's create a new wrapper for a simple LU-factorization which uses only the
-basic machinery. A simplified version is:
-
-```julia
-struct MyLUFactorization{P} <: SciMLBase.AbstractLinearAlgorithm end
-
-init_cacheval(alg::MyLUFactorization, A, b, u, Pl, Pr, maxiters, abstol, reltol, verbose) =
-    lu!(convert(AbstractMatrix, A))
-
-function SciMLBase.solve(cache::LinearCache, alg::MyLUFactorization; kwargs...)
-    if cache.isfresh
-        A = convert(AbstractMatrix, A)
-        fact = lu!(A)
-        cache = set_cacheval(cache, fact)
-    end
-    y = ldiv!(cache.u, cache.cacheval, cache.b)
-    SciMLBase.build_linear_solution(alg, y, nothing, cache)
-end
-```
-
-The way this works is as follows. LinearSolve.jl has a `LinearCache` that everything
-shares (this is what gives most of the ease of use). However, many algorithms
-need to cache their own things, and so there's one value `cacheval` that is
-for the algorithms to modify. The function:
-
-```julia
-init_cacheval(alg::MyLUFactorization, A, b, u, Pl, Pr, maxiters, abstol, reltol, verbose)
-```
-
-is what is called at `init` time to create the first `cacheval`. Note that this
-should match the type of the cache later used in `solve` as many algorithms, like
-those in OrdinaryDiffEq.jl, expect type-groundedness in the linear solver definitions.
-While there are cheaper ways to obtain this type for LU factorizations (specifically,
-`ArrayInterfaceCore.lu_instance(A)`), for a demonstration this just performs an
-LU-factorization to get an `LU{T, Matrix{T}}` which it puts into the `cacheval`
-so its typed for future use.
-
-After the `init_cacheval`, the only thing left to do is to define
-`SciMLBase.solve(cache::LinearCache, alg::MyLUFactorization)`. Many algorithms
-may use a lazy matrix-free representation of the operator `A`. Thus if the
-algorithm requires a concrete matrix, like LU-factorization does, the algorithm
-should `convert(AbstractMatrix,cache.A)`. The flag `cache.isfresh` states whether
-`A` has changed since the last `solve`. Since we only need to factorize when
-`A` is new, the factorization part of the algorithm is done in a `if cache.isfresh`.
-`cache = set_cacheval(cache, fact)` puts the new factorization into the cache
-so it's updated for future solves. Then `y = ldiv!(cache.u, cache.cacheval, cache.b)`
-performs the solve and a linear solution is returned via
-`SciMLBase.build_linear_solution(alg,y,nothing,cache)`.
+# Developing New Linear Solvers
+
+Developing new or custom linear solvers for the SciML interface can be done in
+one of two ways:
+
+1. You can either create a completely new set of dispatches for `init` and `solve`.
+2. You can extend LinearSolve.jl's internal mechanisms.
+
+For developer ease, we highly recommend (2) as that will automatically make the
+caching API work. Thus this is the documentation for how to do that.
+
+## Developing New Linear Solvers with LinearSolve.jl Primitives
+
+Let's create a new wrapper for a simple LU-factorization which uses only the
+basic machinery. A simplified version is:
+
+```julia
+struct MyLUFactorization{P} <: SciMLBase.AbstractLinearAlgorithm end
+
+init_cacheval(alg::MyLUFactorization, A, b, u, Pl, Pr, maxiters, abstol, reltol, verbose) = lu!(convert(AbstractMatrix,A))
+
+function SciMLBase.solve(cache::LinearCache, alg::MyLUFactorization; kwargs...)
+    if cache.isfresh
+        A = convert(AbstractMatrix,A)
+        fact = lu!(A)
+        cache = set_cacheval(cache, fact)
+    end
+    y = ldiv!(cache.u, cache.cacheval, cache.b)
+    SciMLBase.build_linear_solution(alg,y,nothing,cache)
+end
+```
+
+The way this works is as follows. LinearSolve.jl has a `LinearCache` that everything
+shares (this is what gives most of the ease of use). However, many algorithms
+need to cache their own things, and so there's one value `cacheval` that is
+for the algorithms to modify. The function:
+
+```julia
+init_cacheval(alg::MyLUFactorization, A, b, u, Pl, Pr, maxiters, abstol, reltol, verbose)
+```
+
+is what is called at `init` time to create the first `cacheval`. Note that this
+should match the type of the cache later used in `solve` as many algorithms, like
+those in OrdinaryDiffEq.jl, expect type-groundedness in the linear solver definitions.
+While there are cheaper ways to obtain this type for LU factorizations (specifically,
+`ArrayInterfaceCore.lu_instance(A)`), for a demonstration this just performs an
+LU-factorization to get an `LU{T, Matrix{T}}` which it puts into the `cacheval`
+so its typed for future use.
+
+After the `init_cacheval`, the only thing left to do is to define
+`SciMLBase.solve(cache::LinearCache, alg::MyLUFactorization)`. Many algorithms
+may use a lazy matrix-free representation of the operator `A`. Thus if the
+algorithm requires a concrete matrix, like LU-factorization does, the algorithm
+should `convert(AbstractMatrix,cache.A)`. The flag `cache.isfresh` states whether
+`A` has changed since the last `solve`. Since we only need to factorize when
+`A` is new, the factorization part of the algorithm is done in a `if cache.isfresh`.
+`cache = set_cacheval(cache, fact)` puts the new factorization into the cache
+so it's updated for future solves. Then `y = ldiv!(cache.u, cache.cacheval, cache.b)`
+performs the solve and a linear solution is returned via
+`SciMLBase.build_linear_solution(alg,y,nothing,cache)`.
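
A usage sketch for the wrapper developed in this file (not part of the commit): once the `MyLUFactorization` methods above are defined (reading `cache.A` where the `solve` snippet writes a bare `A`), it plugs into the common interface like any built-in algorithm. Since the struct carries a type parameter and defines no zero-argument constructor, a parameter must be supplied explicitly:

```julia
using LinearSolve, LinearAlgebra

A = rand(4, 4)
b = rand(4)
prob = LinearProblem(A, b)

# `init` calls init_cacheval, so the LU factorization is created up front.
linsolve = init(prob, MyLUFactorization{Nothing}())
sol1 = solve(linsolve)

# Changing only b leaves cache.isfresh unset, so no refactorization happens.
linsolve = LinearSolve.set_b(linsolve, rand(4))
sol2 = solve(linsolve)
```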

docs/src/basics/CachingAPI.md: +9 −9

@@ -1,9 +1,9 @@
-# Caching Interface API Functions
-
-```@docs
-LinearSolve.set_A
-LinearSolve.set_b
-LinearSolve.set_u
-LinearSolve.set_p
-LinearSolve.set_prec
-```
+# Caching Interface API Functions
+
+```@docs
+LinearSolve.set_A
+LinearSolve.set_b
+LinearSolve.set_u
+LinearSolve.set_p
+LinearSolve.set_prec
+```
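
The functions documented in this file are the non-mutating cache updaters used in the README example above. A short hypothetical sketch of how they compose:

```julia
using LinearSolve

linsolve = init(LinearProblem(rand(4, 4), rand(4)))
sol = solve(linsolve)

# Each set_* call returns an updated cache, so the result is reassigned.
linsolve = LinearSolve.set_A(linsolve, rand(4, 4))   # swap the operator
linsolve = LinearSolve.set_b(linsolve, rand(4))      # swap the right-hand side
sol2 = solve(linsolve)
```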
