Automatically wrapping with Clang.jl #149

Draft · wants to merge 83 commits into base: main

Changes from all commits (83 commits)
e7a2b61
Format doc string function
Aug 4, 2021
2336629
Start to rework using Clang.jl generated wrappers
Jul 19, 2021
c85e8e8
Add back in options
Jul 19, 2021
8c17b12
Add back in vec.jl
Jul 19, 2021
32c044e
Add back in mat.jl
Jul 19, 2021
b25f87c
Add back matshell
Jul 19, 2021
90086fe
Remove some `getlibs`s and VecSeq->VecSeqWithArray
Jul 20, 2021
e5b9751
Add VecMPI
Jul 20, 2021
f814d21
Add VecGhost
Jul 20, 2021
105fb57
Add getpetsctype for vectors
Jul 20, 2021
922d74f
Update matshell and with_localarray name
Jul 21, 2021
a14f9e8
Clean up some function comments and names
Jul 21, 2021
a0013a4
Add a parallel matrix type and test
Jul 21, 2021
21d9e05
Update generator for KSP
Aug 3, 2021
6e7b0b7
Update options
Aug 4, 2021
c8b91b4
Add (basic) ksp functionality
Aug 4, 2021
208bd91
Update generator for DM
Aug 4, 2021
bb45f97
Fix generator for MatStencil
Aug 5, 2021
2924799
Add in basic dm and dmda functionality
Aug 5, 2021
2e708a9
Move old examples to "stale" folder
Aug 6, 2021
a605eb3
Move unused tests to "stale" folder
Aug 6, 2021
31c2286
Enable examples and mpi_examples testing
Aug 6, 2021
05da75f
Update MPI test output
Aug 6, 2021
2fb5c5d
Add a Julia sparse matrix to PETSc MatSeqAIJ
Aug 6, 2021
df23af1
Add function to get library with types
Aug 6, 2021
d59de98
Allow KSP with Julia vectors and SparseMatrixCSC
Aug 6, 2021
f6cb8d4
Add Laplacian example
Aug 6, 2021
94689e0
Add a petsc world age
Aug 7, 2021
9bf780e
Register finalizers based on MPI.Comm_size
Aug 7, 2021
5bc18bb
Update .gitignore
Aug 7, 2021
67e9d41
Update Project.toml
Aug 7, 2021
59c1f33
Disable dmda test with MPI and Windows
Aug 7, 2021
1edae83
Disable examples/laplacian.jl on windows
Aug 7, 2021
d0e01e8
Move to test/Project.toml
Aug 9, 2021
ad4e9aa
Allow for a Project.toml in examples
Aug 9, 2021
ef22f8c
Rework KSP a bit to simplify cases
Aug 9, 2021
0adc67c
Add KSP / DMDA interaction routines
Aug 10, 2021
c7e6d67
Update options parsing
Aug 10, 2021
b623b2f
Remove MPI.Init from precompile
Aug 11, 2021
2abbd8d
Add dmda_laplacian example
Aug 10, 2021
d3efdde
Remove some old dmda code
Aug 12, 2021
1e59f7c
Allow read/write tuples for withlocalarray!
Aug 12, 2021
19d9bb0
Add local and global vectors to DM
Aug 18, 2021
d6597fa
Partially add DM coord
Aug 18, 2021
caf1f70
Add view for DM
Aug 19, 2021
2675fd6
Fix up generator for SNES
Aug 12, 2021
ae24f14
Add back SNES functionality
Aug 20, 2021
7fdff85
Add better options parsing
Aug 31, 2021
208178c
Add coord and reshape utils to dmda
boriskaus Aug 31, 2021
2c7ff3d
Add SNES example (Liouville-Bratu-Gelfand equation)
Aug 20, 2021
fb8f5d4
Add copyto! between Julia and PETSc sparse matrix
boriskaus Aug 31, 2021
16d3fbf
Add porosity waves example
boriskaus Aug 31, 2021
581f379
Fix viewer for long output on stdout
Sep 3, 2021
f53f225
Start to add back dmstag
Sep 13, 2021
73d14eb
Few corrections to DMDA
Sep 13, 2021
4b815f3
Start building out DMStag
Sep 13, 2021
ed7ec88
Add dmstag query of boundary type
Sep 13, 2021
53ac1fa
Add 2D and 3D DMStag support
Sep 14, 2021
e11044f
Porosity waves example: fix capitalization in example command
psanan Apr 5, 2022
65b350a
Add to function DMStag (CreateCompatible) and getdof + tests
nicoberlie Apr 5, 2022
a22bba9
clean up getdof output
nicoberlie Apr 5, 2022
24d1e43
Clean up DMStag input (petsclib gotten from DM)
nicoberlie Apr 5, 2022
897178d
Docs: describe low- and high-level components, add low-level ccall ex…
psanan Apr 5, 2022
93175f9
Add coordinates functions (work in progress)
nicoberlie Apr 7, 2022
7d2ad40
Added a few functions + offset array (work in progress)
nicoberlie May 7, 2022
747403f
Merge pull request #186 from nicoberlie/jek/dmstag
boriskaus Apr 11, 2023
6f78d51
Try to compile with Julia 1.11
ViralBShah Oct 14, 2024
52a35f0
loads again (but that's about it; the rest is deactivated)
boriskaus Nov 5, 2024
b6dc09a
activate some tests again
boriskaus Nov 5, 2024
44ae7b3
Merge remote-tracking branch 'origin/jek/dmstag' into jek/gen
boriskaus Nov 6, 2024
70f3eed
add dmstag changes
boriskaus Nov 6, 2024
0debe97
fixed a few tests
boriskaus Nov 6, 2024
78d3251
activate more tests
boriskaus Nov 6, 2024
abdd1e4
few additional tests
boriskaus Nov 6, 2024
65ef1dc
more (DMStag) functions and structs included
boriskaus Nov 12, 2024
fb32fc1
missing file
boriskaus Nov 12, 2024
5015b81
wrap more functions
boriskaus Nov 15, 2024
3bb4a84
autowrapping DMSTAG
boriskaus Nov 15, 2024
63e5fc3
activate more dmstag tests
boriskaus Nov 15, 2024
f38bc75
typo
boriskaus Nov 15, 2024
9238f21
activate dmda tests
boriskaus Nov 15, 2024
81ced12
add dm_wrapped, which wraps many of the dm routines (most not tested …
boriskaus Jan 2, 2025
d1b60a3
bugfix; simplify naming of tests
boriskaus Jan 2, 2025
14 changes: 6 additions & 8 deletions .gitignore
@@ -1,10 +1,8 @@
deps/ComplexDouble/
deps/RealDouble/
deps/RealSingle/
deps/deps.jl
deps/petsc-*.tar.gz
*.jl.cov
*.jl.mem
*.swp
*.DS_Store
*~
docs/build/
docs/site/
Manifest.toml
*.vscode/
examples/viz2D_out/
11 changes: 8 additions & 3 deletions Project.toml
@@ -4,16 +4,21 @@ authors = ["Simon Byrne <[email protected]>"]
version = "0.1.3"

[deps]
ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
Libdl = "8f399da3-3557-5675-b5ff-fb832c97cbdb"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
MPI = "da04e1cc-30fd-572f-bb4f-1f8673147195"
MPICH_jll = "7cb0a576-ebde-5e09-9194-50597f1243b4"
OffsetArrays = "6fe1bfb0-de20-5000-8ca7-80f57d26f881"
PETSc_jll = "8fa3689e-f0b9-5420-9873-adf6ccf46f2d"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
SparseDiffTools = "47a9eef4-7e08-11e9-0b38-333d64bd3804"

[compat]
MPI = "0.15, 0.16, 0.17, 0.18, 0.19"
PETSc_jll = "3.15.2"
julia = "1.3"
MPI = "0.20"
MPICH_jll = "4.1.2"
PETSc_jll = "3.21"
julia = "1.9"

[extras]
ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
16 changes: 15 additions & 1 deletion README.md
@@ -3,8 +3,22 @@
[![Build Status](https://github.com/JuliaParallel/PETSc.jl/workflows/CI/badge.svg)](https://github.com/JuliaParallel/PETSc.jl/actions/workflows/ci.yml)
[![doc dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://juliaparallel.github.io/PETSc.jl/dev/)

PETSc.jl provides an interface to the [PETSc](https://www.mcs.anl.gov/petsc/) library,
allowing Julia features (such as automatic differentiation) to be combined with PETSc's infrastructure, including
linear, nonlinear, and optimization solvers, timesteppers, and domain management (DM), in a distributed-memory (MPI) environment.

This package comprises two main components:

1. An automatically generated, low-level interface for large parts of the PETSc API.
2. A curated, high-level, more idiomatically Julian interface for selected functionality.

The low-level interface covers large parts of the PETSc API, but it can be awkward to work with and generally
requires prior experience with PETSc to use effectively. The high-level interface is designed to be
more familiar and convenient for Julia users, but exposes only a small portion of the underlying
library's functionality. The high-level interface should be considered unstable, as its
implementation involves design decisions that may be revisited; note this package's low version number
before relying on these high-level features.
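As a taste of the high-level interface, a linear solve with a Julia sparse matrix might look like the following sketch. This is illustrative only: the constructor signature `PETSc.KSP(A; ...)`, the `ksp_rtol` option, and the backslash solve are assumptions based on the current high-level API, which (as noted above) is unstable and may change.

```julia
using PETSc, SparseArrays, LinearAlgebra

# Pick one of the available PETSc builds and initialize it
petsclib = PETSc.petsclibs[1]
PETSc.initialize(petsclib)

# A small 1D Laplacian and a right-hand side
A = spdiagm(-1 => -ones(9), 0 => 2 * ones(10), 1 => -ones(9))
b = ones(10)

# Wrap A in a Krylov solver and solve A x = b
ksp = PETSc.KSP(A; ksp_rtol = 1e-8)  # hypothetical keyword usage
x = ksp \ b

@show norm(A * x - b)

PETSc.destroy(ksp)
PETSc.finalize(petsclib)
```

The point of the design is that, once the matrix is wrapped, the solver behaves like any other Julia linear solver, while PETSc options (such as `ksp_rtol`) are passed through as keyword arguments.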


## Installation

48 changes: 46 additions & 2 deletions docs/src/index.md
@@ -1,7 +1,51 @@
# PETSc.jl

[PETSc.jl](https://github.com/JuliaParallel/PETSc.jl) is a Julia wrapper for the Portable, Extensible Toolkit for Scientific Computation [PETSc](https://petsc.org/) package, which allows solving ordinary and partial differential equations in parallel on laptops or massively parallel high-performance systems.

The use of Julia greatly simplifies the code that developers have to write, while allowing them to employ Julia features such as automatic differentiation. The wrapper also ships with a pre-built PETSc library, which greatly simplifies getting a first code working in parallel on different operating systems. In many cases, the Julia code is significantly shorter than its C counterpart.

This wrapper mimics the PETSc functionality as closely as possible, but remains a work in progress (meaning that not everything has been translated yet). See the official [user guide](https://petsc.org/release/overview/) if you want to learn more about PETSc in general. For Julia-specific examples, have a look at our [examples](https://github.com/JuliaParallel/PETSc.jl/tree/main/examples) or [tests](https://github.com/JuliaParallel/PETSc.jl/tree/main/test).

This package includes a low-level, automatically generated wrapper layer, on top of which a higher-level interface is built.


## The High-Level Interface

The high-level interface is designed to be familiar and convenient for Julia users, but exposes only a small portion of the functionality
of the underlying PETSc library. This interface should be considered *unstable*, as its implementation involves design decisions that are likely to be revisited; note this package's low version number
before relying on these features.

For example, with this interface, PETSc's [KSP](https://petsc.org/release/docs/manual/ksp) linear solvers (including Krylov methods) can be used in a way similar to solvers from other Julia packages. See the example in [Getting started](@ref) and the API in [KSP](@ref).

## The Low-Level Interface

The low-level interface covers more of the PETSc API, but it can be awkward to work with and generally requires
prior experience with PETSc to use effectively. It is automatically generated with [Clang.jl](https://github.com/JuliaInterop/Clang.jl).

The high-level interface described in [KSP](@ref) creates a [KSP](https://petsc.org/release/docs/manual/ksp) linear solver object via low-level calls such as [`KSPCreate()`](https://petsc.org/release/docs/manualpages/KSP/KSPCreate.html), using constructs that require some knowledge of PETSc's nature as a [C library](https://docs.julialang.org/en/v1/manual/calling-c-and-fortran-code/). Expert users are of course free to use the low-level interface directly, as in this simple example, which calls [`PetscGetVersionNumber()`](https://petsc.org/release/docs/manualpages/PetscGetVersionNumber.html).

```@example
using MPI
MPI.Initialized() || MPI.Init()
using PETSc

petsclib = PETSc.petsclibs[1]
PetscInt = petsclib.PetscInt
PetscErrorCode = Cint

PETSc.initialize(petsclib)
major = Ref{PetscInt}(0)
minor = Ref{PetscInt}(0)
subminor = Ref{PetscInt}(0)
release = Ref{PetscInt}(0)
error_code = ccall(
    (:PetscGetVersionNumber, petsclib.petsc_library),
    PetscErrorCode,
    (Ptr{PetscInt}, Ptr{PetscInt}, Ptr{PetscInt}, Ptr{PetscInt}),
    major, minor, subminor, release,
)

version = (major[], minor[], subminor[])
println("PETSc $(version[1]).$(version[2]).$(version[3])")
PETSc.finalize(petsclib)
```
276 changes: 276 additions & 0 deletions examples/Liouville_Bratu_Gelfand.jl
@@ -0,0 +1,276 @@
# INCLUDE IN MPI TEST
#=
In this example we solve the [Liouville–Bratu–Gelfand
equation](https://en.wikipedia.org/wiki/Liouville%E2%80%93Bratu%E2%80%93Gelfand_equation):
```math
∇²ψ + λ exp(ψ) = 0
```
with zero Dirichlet boundary conditions.

To solve the problem we use the standard central finite difference
approximation of the Laplacian in `d` dimensions.

This example is motivated by the following PETSc examples:
- [`snes/ex5`](https://petsc.org/release/src/snes/tutorials/ex5.c.html)
- [`snes/ex14`](https://petsc.org/release/src/snes/tutorials/ex14.c.html)
=#
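
#=
Sketch of the discretization used below (derived from the code in this file):
in each direction the central stencil approximates the second derivative as

    ∂²ψ/∂x² ≈ (ψ[i-1] - 2ψ[i] + ψ[i+1]) / Δx²

The residual assembled below is the volume-scaled form -Δx Δy Δz (∇²ψ + λ exp(ψ)),
which is why the stencil weight in the x-direction is Δy Δz / Δx (and similarly
for the y and z directions).
=#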

using MPI
using PETSc
using OffsetArrays: OffsetArray
using LinearAlgebra: norm

opts = if !isinteractive()
    PETSc.parse_options(ARGS)
else
    (ksp_monitor = true, ksp_view = true)
end

# Set our MPI communicator
comm = MPI.COMM_WORLD

# Set our PETSc Scalar Type
PetscScalar = Float64

# get the PETSc lib with our chosen `PetscScalar` type
petsclib = PETSc.getlib(; PetscScalar = PetscScalar)

# Initialize PETSc
PETSc.initialize(petsclib)

# dimensionality of the problem
dim = haskey(opts, :dim) ? opts.dim : 3

# Set the total number of grid points in each direction
Nq = ntuple(_ -> 10, dim)

# Set the boundary conditions on each side
bcs = ntuple(_ -> PETSc.DM_BOUNDARY_NONE, dim)

# Set parameter
λ = PetscScalar(6)

# Create the PETSc DMDA object
da = PETSc.DMDA(
    petsclib,
    comm,
    bcs, # boundary conditions
    Nq, # Global grid size
    1, # Number of DOF per node
    1, # Stencil width
    PETSc.DMDA_STENCIL_STAR; # Stencil type
    opts...,
)

# Create the PETSc SNES object
snes = PETSc.SNES(petsclib, comm; opts...)

# add the da to the snes
PETSc.setDM!(snes, da)

# Set up the initial guess
x = PETSc.DMGlobalVec(da)
xl = PETSc.DMLocalVec(da)
PETSc.withlocalarray!(xl; read = false) do l_x
corners = PETSc.getcorners(da)

# Get the global grid dimensions
Nq = PETSc.getinfo(da).global_size

# Figure out the interior points
int_min = min(CartesianIndex(corners.size), CartesianIndex(2, 2, 2))
int_max = max(CartesianIndex(corners.size .- 1), CartesianIndex(1, 1, 1))
interior = (int_min):(int_max)

# Allows us to address the local array with global indexing
ox = @view PETSc.reshapelocalarray(l_x, da)[1, :, :, :]

# Set up the global coordinates in each direction
# -1 to 1 when Nq > 1 and 0 otherwise
coords = map(
Nq ->
Nq == 1 ? range(PetscScalar(0), stop = 0, length = Nq) :
range(-PetscScalar(1), stop = 1, length = Nq),
Nq,
)

scaling = λ / (λ + 1)

# Loop over all the points on the processor and set the initial condition to
# be a hat function
for i in ((corners.lower):(corners.upper))
ox[i] =
scaling * sqrt(
min(
(1 - abs(coords[1][i[1]])),
(1 - abs(coords[2][i[2]])),
(1 - abs(coords[3][i[3]])),
),
)
end
end
PETSc.update!(x, xl, PETSc.INSERT_VALUES) # update local -> global

# Set up the nonlinear function
r = similar(x)
PETSc.setfunction!(snes, r) do g_fx, snes, g_x
# Get the DMDA associated with the snes
da = PETSc.getDMDA(snes)

# Get a local vector and transfer the data from the global vector into it
l_x = PETSc.DMLocalVec(da)
PETSc.update!(l_x, g_x, PETSc.INSERT_VALUES)

ghostcorners = PETSc.getghostcorners(da)
corners = PETSc.getcorners(da)

# Global grid size
Nq = PETSc.getinfo(da).global_size

# grid spacing in each dimension
Δx, Δy, Δz = PetscScalar(1) ./ Nq

# Get local arrays
PETSc.withlocalarray!(
(g_fx, l_x);
read = (false, true),
write = (true, false),
) do fx, x

# reshape the array and allow for global indexing
x = @view PETSc.reshapelocalarray(x, da)[1, :, :, :]
fx = @view PETSc.reshapelocalarray(fx, da)[1, :, :, :]

# Store a tuple of stencils in each direction
stencils = (
(
CartesianIndex(-1, 0, 0),
CartesianIndex(0, 0, 0),
CartesianIndex(1, 0, 0),
),
(
CartesianIndex(0, -1, 0),
CartesianIndex(0, 0, 0),
CartesianIndex(0, 1, 0),
),
(
CartesianIndex(0, 0, -1),
CartesianIndex(0, 0, 0),
CartesianIndex(0, 0, 1),
),
)
# Weights for each direction
weights = (Δy * Δz / Δx, Δx * Δz / Δy, Δx * Δy / Δz)

# loop over indices and set the function value
for ind in ((corners.lower):(corners.upper))
# If on the boundary just set equal to the incoming data
# otherwise apply the finite difference operator
if any(ntuple(j -> ind[j] == 1 || ind[j] == Nq[j], dim))
fx[ind] = x[ind]
else
# Apply the source
u = -Δx * Δy * Δz * λ * exp(x[ind])

# Apply the finite difference stencil
for (s, w) in zip(stencils[1:dim], weights[1:dim])
u +=
w * (-x[ind + s[1]] + 2 * x[ind + s[2]] - x[ind + s[3]])
end
fx[ind] = u
end
end
end

# Clean up the local vector
PETSc.destroy(l_x)
return 0
end

J = PETSc.MatAIJ(da)
PETSc.setjacobian!(snes, J) do J, snes, g_x
# Get the DMDA associated with the snes
da = PETSc.getDMDA(snes)

# Get the corners of the points we own
corners = PETSc.getcorners(da)

# Global grid size
Nq = PETSc.getinfo(da).global_size

# grid spacing in each dimension
Δx, Δy, Δz = PetscScalar(1) ./ Nq

# Store a tuple of stencils in each direction
stencils = (
(
CartesianIndex(-1, 0, 0),
CartesianIndex(0, 0, 0),
CartesianIndex(1, 0, 0),
),
(
CartesianIndex(0, -1, 0),
CartesianIndex(0, 0, 0),
CartesianIndex(0, 1, 0),
),
(
CartesianIndex(0, 0, -1),
CartesianIndex(0, 0, 0),
CartesianIndex(0, 0, 1),
),
)
# Weights for each direction
weights = (Δy * Δz / Δx, Δx * Δz / Δy, Δx * Δy / Δz)

# Get a local array of the solution vector
PETSc.withlocalarray!(g_x; write = false) do l_x
# reshape so we can use multi-D indexing
x = @view PETSc.reshapelocalarray(l_x, da)[1, :, :, :]

# loop over indices and set the function value
for ind in ((corners.lower):(corners.upper))
# If on the boundary just set equal to the incoming data
# otherwise apply the finite difference operator
if any(ntuple(j -> ind[j] == 1 || ind[j] == Nq[j], dim))
J[ind, ind] = 1
else
# We accumulate the diagonal and add it at the end
Jii = -Δx * Δy * Δz * λ * exp(x[ind]) # Apply the source

# Apply the finite difference stencil
for (s, w) in zip(stencils[1:dim], weights[1:dim])
Jii += w * 2
J[ind, ind + s[1]] = -w
J[ind, ind + s[3]] = -w
end
J[ind, ind] = Jii
end
end
end

# Assemble the Jacobian matrix
PETSc.assemble!(J)
return 0
end

if MPI.Comm_rank(comm) == 0
    println(@elapsed(PETSc.solve!(x, snes)))
else
    PETSc.solve!(x, snes)
end
g = similar(x)
snes.f!(g, snes, x)
nm = norm(g)
if MPI.Comm_rank(comm) == 0
    @show nm
end

# Do some clean up
PETSc.destroy(J)
PETSc.destroy(x)
PETSc.destroy(g)
PETSc.destroy(r)
PETSc.destroy(da)
PETSc.destroy(snes)

PETSc.finalize(petsclib)