GridapPETSc.jl extension
Building on top of GridapPETSc.jl, GridapSolvers provides examples of the more complex solvers one can build from PETSc components.
Elasticity solver
GridapSolvers.PETScElasticitySolver — Function

PETScElasticitySolver(space::FESpace; maxiter=500, atol=1.e-12, rtol=1.e-8)

GMRES + AMG solver, specifically designed for linear elasticity problems.
Follows PETSc's documentation for PCAMG and MatNullSpaceCreateRigidBody.
An example of how to use it on an elasticity problem can be found in 'test/Applications/Elasticity.jl'.
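The following is a minimal sketch of the intended workflow, assuming a standard Gridap linear-elasticity setup (the mesh, material parameters and boundary conditions below are purely illustrative):

using Gridap, GridapPETSc, GridapSolvers

GridapPETSc.with() do
  # Illustrative 3D linear-elasticity problem
  model = CartesianDiscreteModel((0,1,0,1,0,1),(4,4,4))
  Ω  = Triangulation(model)
  dΩ = Measure(Ω,2)

  # Vector-valued FE space for the displacement field
  reffe = ReferenceFE(lagrangian,VectorValue{3,Float64},1)
  V = TestFESpace(model,reffe;dirichlet_tags="boundary")
  U = TrialFESpace(V,VectorValue(0.0,0.0,0.0))

  # Isotropic linear elasticity (Lamé parameters) and a constant body force
  E, ν = 1.0, 0.3
  λ, μ = (E*ν)/((1+ν)*(1-2ν)), E/(2*(1+ν))
  σ(εu) = λ*tr(εu)*one(εu) + 2*μ*εu
  f = VectorValue(0.0,0.0,-1.0)

  a(u,v) = ∫( ε(v) ⊙ (σ∘ε(u)) )dΩ
  l(v) = ∫( v⋅f )dΩ
  op = AffineFEOperator(a,l,U,V)

  # The solver builds the rigid-body near-nullspace from the FE space
  solver = PETScElasticitySolver(V;maxiter=500,rtol=1.e-8)
  uh = solve(LinearFESolver(solver),op)
end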
HPDDM solver
We also provide support for the HPDDM library through PETSc's PCHPDDM
preconditioner:
GridapSolvers.HPDDMLinearSolver — Function

HPDDMLinearSolver(
  ranks::MPIArray, mat::PETScMatrix, is::PETScIndexSet [, ksp_setup [, pc_setup]];
  maxiter=500, atol=1.e-12, rtol=1.e-8
)
Wrapper for a PETScLinearSolver
preconditioned with the HPDDM library.
Arguments:
- indices::MPIArray: For each rank, the local-to-global index map for the matrix rows/cols.
- mats::MPIArray: For each rank, the matrix for the local overlapping Neumann problem.
- ksp_setup::Function: PETSc setup options for the KSP solver.
- pc_setup::Function: Extra setup options for the PCHPDDM preconditioner.
The defaults for ksp_setup and pc_setup simply pick up PETSc's command-line options, using the following functions:
function hpddm_default_setup_ksp(ksp::Ref{PETSC.KSP})
@check_error_code GridapPETSc.PETSC.KSPSetFromOptions(ksp[])
end
function hpddm_default_setup_pc(pc::Ref{PETSC.PC})
@check_error_code PETSC.PCSetFromOptions(pc[])
end
To modify the default setup, you can pass your own functions (with the same signatures) as arguments to the constructor.
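For instance, a custom ksp_setup can fix the Krylov method programmatically instead of relying on command-line flags. The sketch below assumes the GridapPETSc.PETSC wrappers KSPSetType and the KSPGMRES type constant are available (as in GridapPETSc's own custom-setup examples), and that ranks, mat and is have been built beforehand:

function my_ksp_setup(ksp::Ref{PETSC.KSP})
  # Select GMRES explicitly, then still honor any command-line overrides
  @check_error_code PETSC.KSPSetType(ksp[],PETSC.KSPGMRES)
  @check_error_code PETSC.KSPSetFromOptions(ksp[])
end

solver = HPDDMLinearSolver(ranks,mat,is,my_ksp_setup,hpddm_default_setup_pc)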
HPDDMLinearSolver(space::FESpace, biform::Function [, args...]; kwargs...)
Creates a HPDDMLinearSolver
from a finite element space and a bilinear form for the local overlapping Neumann problems. The extra arguments are the same as for the low-level constructor.
To have overlapping Neumann problems, the Measure
has to be modified to include ghost cells. For instance, for a Poisson problem we would have:
Ωg = Triangulation(with_ghost,model)
dΩg = Measure(Ωg,qdegree)
a(u,v) = ∫(∇(u)⋅∇(v))dΩg
An example of how to use it on a Poisson problem can be found in 'test/ExtLibraries/drivers/HPDDMTests.jl'.
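Putting the pieces together, a rough sketch of the high-level constructor in a distributed Poisson setting could look as follows (the mesh, partition and forcing term are illustrative, and a standard GridapDistributed/PartitionedArrays MPI setup is assumed):

using Gridap, GridapDistributed, GridapPETSc, GridapSolvers, PartitionedArrays

with_mpi() do distribute
  ranks = distribute(LinearIndices((4,)))
  GridapPETSc.with() do
    model = CartesianDiscreteModel(ranks,(2,2),(0,1,0,1),(16,16))
    qdegree = 2

    reffe = ReferenceFE(lagrangian,Float64,1)
    V = TestFESpace(model,reffe;dirichlet_tags="boundary")
    U = TrialFESpace(V,0.0)

    # Global problem: standard measure on owned cells
    Ω  = Triangulation(model)
    dΩ = Measure(Ω,qdegree)
    a(u,v) = ∫(∇(u)⋅∇(v))dΩ
    l(v) = ∫(1.0*v)dΩ
    op = AffineFEOperator(a,l,U,V)

    # Local overlapping Neumann problems: measure including ghost cells
    Ωg  = Triangulation(with_ghost,model)
    dΩg = Measure(Ωg,qdegree)
    a_neumann(u,v) = ∫(∇(u)⋅∇(v))dΩg

    solver = HPDDMLinearSolver(V,a_neumann)
    uh = solve(LinearFESolver(solver),op)
  end
end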
Caching PETSc solvers
To have zero allocations when solving linear systems, one needs to pre-allocate PETSc arrays for the solution and right-hand side. We provide a way to automate this process:
GridapSolvers.CachedPETScNS — Function

CachedPETScNS(ns::PETScLinearSolverNS, x::AbstractVector, b::AbstractVector)
Wrapper around a PETSc NumericalSetup, providing highly efficient reusable caches:
When converting Julia vectors/PVectors to PETSc vectors, we purposely alias the vector values. This lets us avoid copying data back and forth before each solve, but it also ties the caches to the specific vectors involved. This structure takes care of that: once it is created, the system can only be solved with the same vectors that were used to build it.
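As a minimal usage sketch (assuming PETSc has been initialized, e.g. inside GridapPETSc.with, and that A, x and b are an already-assembled system matrix and compatible solution/RHS vectors):

using Gridap.Algebra, GridapPETSc, GridapSolvers

solver = PETScLinearSolver()
ns = numerical_setup(symbolic_setup(solver,A),A)

# Tie the PETSc caches to the vectors that will always be used for solving
cached_ns = CachedPETScNS(ns,x,b)

# Repeated solves reuse the aliased PETSc vectors, avoiding allocations;
# they must always be performed with the same x and b used above
solve!(x,cached_ns,b)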