(ch_dmbase)=

# DM Basics

The previous chapters have focused on the core numerical solvers in PETSc. However, numerical solvers will not be widely used without efficient ways
(in both human and machine time) of connecting them to the mathematical models and discretizations, including the grids (or meshes)
on which people wish to build their simulations.
Thus PETSc provides a set of abstractions, represented by the `DM` object, that offer a powerful, comprehensive
mechanism for translating the problem specification of a model and its discretization into the language and API of the solvers.
`DM` is an orphan initialism (or orphan acronym): the letters have no meaning and never did.

Some of the model
classes `DM` currently supports are PDEs on structured and staggered grids with finite difference methods (`DMDA` and `DMSTAG` -- {any}`ch_stag`),
PDEs on unstructured
grids with finite element and finite volume methods (`DMPLEX` -- {any}`ch_unstructured`), PDEs on quadtree and octree grids (`DMFOREST`), models on
networks (graphs), such
as power grids or river networks (`DMNETWORK` -- {any}`ch_network`), and particle-in-cell simulations (`DMSWARM`).

In previous chapters, we demonstrated some simple uses of `DM` to provide input for the solvers. In this chapter, and those that follow,
we dive deeper into the capabilities of `DM`.

It is possible to create a `DM` with

```
DM dm;
DMCreate(MPI_Comm comm, DM *dm);
DMSetType(DM dm, DMType type);
```

but more commonly, a `DM` is created with a type-specific constructor; the construction process for each type of `DM` is discussed
in the sections on each `DMType`. This chapter focuses
on the commonalities among all the `DM`s, so we assume the `DM` already exists and we wish to work with it.
33
34As discussed earlier, a `DM` can construct vectors and matrices appropriate for a model and discretization and provide the mapping between the
35global and local vector representations.
36
37```
38DMCreateLocalVector(DM dm,Vec *l);
39DMCreateGlobalVector(DM dm,Vec *g);
40DMGlobalToLocal(dm,g,l,INSERT_VALUES);
41DMLocalToGlobal(dm,l,g,ADD_VALUES);
42DMCreateMatrix(dm,Mat *m);
43```

The matrices produced may support `MatSetValuesLocal()`, allowing one to work with the local numbering on each MPI rank. For `DMDA` one can also
use `MatSetValuesStencil()`, and for `DMSTAG`, `DMStagMatSetValuesStencil()`.

For certain `DMType`s, a given `DM` can be refined with `DMRefine()` or coarsened with `DMCoarsen()`.
Mappings between `DM`s may be obtained with routines such as `DMCreateInterpolation()`, `DMCreateRestriction()`, and `DMCreateInjection()`.

One attaches a `DM` to a PETSc solver object, `KSP`, `SNES`, `TS`, or `Tao`, with

```
KSPSetDM(KSP ksp, DM dm);
SNESSetDM(SNES snes, DM dm);
TSSetDM(TS ts, DM dm);
```

Once the `DM` is attached, the solver can use it to create and process much of the data it needs to set up and perform its solve.
For example, simply providing a `DM` to `PCMG` can allow it to create all the data structures needed to run geometric multigrid on your problem.

<a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/snes/tutorials/ex19.c.html">SNES Tutorial ex19</a> demonstrates how this may be done with `DMDA`.