(ch_dmbase)=

# DM Basics

The previous chapters have focused on the core numerical solvers in PETSc. However, numerical solvers that lack efficient ways
(in both human and machine time) of connecting to the mathematical models and discretizations, including the grids (or meshes)
on which people wish to build their simulations,
will not be widely used. Thus PETSc provides a set of abstractions, represented by the `DM` object, that form a powerful, comprehensive
mechanism for translating the problem specification of a model and its discretization into the language and API of the solvers.
`DM` is an orphan initialism (or orphan acronym): the letters have no meaning and never did.

Some of the model
classes `DM` currently supports are PDEs on structured and staggered grids with finite difference methods (`DMDA` -- {any}`sec_struct`
and `DMSTAG` -- {any}`ch_stag`),
PDEs on unstructured
grids with finite element and finite volume methods (`DMPLEX` -- {any}`ch_unstructured`), PDEs on quadtree and octree grids (`DMFOREST`), models on
networks (graphs) such
as power grids or river networks (`DMNETWORK` -- {any}`ch_network`), and particle-in-cell simulations (`DMSWARM`).

In previous chapters, we demonstrated some simple uses of `DM` to provide the input for the solvers. In this chapter, and those that follow,
we dive deeper into the capabilities of `DM`.

It is possible to create a `DM` with

```
DM dm;
DMCreate(MPI_Comm comm, DM *dm);
DMSetType(DM dm, DMType type);
```

but more commonly, a `DM` is created with a type-specific constructor; the construction process for each type of `DM` is discussed
in the sections on each `DMType`. This chapter focuses
on the commonalities between all the `DM` types, so we assume the `DM` already exists and we wish to work with it.
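As a concrete illustration of the more common, type-specific route, a two-dimensional `DMDA` might be constructed as in the following sketch (the grid sizes and stencil choices here are arbitrary examples, and `PetscCall()` error checking is omitted for brevity):

```
DM da;

/* 8 x 8 structured grid, one degree of freedom per point, star stencil of width 1,
   with PETSc deciding the parallel decomposition */
DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
             DMDA_STENCIL_STAR, 8, 8, PETSC_DECIDE, PETSC_DECIDE,
             1, 1, NULL, NULL, &da);
DMSetFromOptions(da);   /* allow runtime control, e.g. -da_grid_x 100 */
DMSetUp(da);

/* ... use the DM ... */

DMDestroy(&da);
```

Calling `DMSetFromOptions()` before `DMSetUp()` lets the command line override the hard-coded choices, which is the usual PETSc idiom.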
As discussed earlier, a `DM` can construct vectors and matrices appropriate for a model and discretization and provide the mapping between the
global and local vector representations.

```
DMCreateLocalVector(DM dm, Vec *l);
DMCreateGlobalVector(DM dm, Vec *g);
DMGlobalToLocal(DM dm, Vec g, InsertMode mode, Vec l);
DMLocalToGlobal(DM dm, Vec l, InsertMode mode, Vec g);
DMCreateMatrix(DM dm, Mat *m);
```

The matrices produced may support `MatSetValuesLocal()`, allowing one to work with the local numbering on each MPI rank. For `DMDA` one can also
use `MatSetValuesStencil()`, and for `DMSTAG`, `DMStagMatSetValuesStencil()`.

For certain `DMType`s, a given `DM` can be refined with `DMRefine()` or coarsened with `DMCoarsen()`.
Mappings between `DM`s may be obtained with routines such as `DMCreateInterpolation()`, `DMCreateRestriction()`, and `DMCreateInjection()`.

One attaches a `DM` to a PETSc solver object, `KSP`, `SNES`, `TS`, or `Tao`, with

```
KSPSetDM(KSP ksp, DM dm);
SNESSetDM(SNES snes, DM dm);
TSSetDM(TS ts, DM dm);
TaoSetDM(Tao tao, DM dm);
```

Once the `DM` is attached, the solver can use it to create and process much of the data that the solver needs to set up and implement its solve.
For example, with `PCMG`, simply providing a `DM` allows it to create all the data structures needed to run geometric multigrid on your problem.

<a href="PETSC_DOC_OUT_ROOT_PLACEHOLDER/src/snes/tutorials/ex19.c.html">SNES Tutorial ex19</a> demonstrates how this may be done with `DMDA`.

See {any}`ch_dmcommonality` for an advanced discussion of the commonalities between the various `DM` types. That material should be read after
the material below on each of the `DM` types.
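Putting the vector routines above together, a typical ghost-point update cycle might look like the following sketch (the `DM` is assumed to have been created already, and `PetscCall()` error checking is omitted for brevity):

```
DM  dm;   /* assumed to exist, e.g. a DMDA created earlier */
Vec g, l;

DMCreateGlobalVector(dm, &g);
DMCreateLocalVector(dm, &l);

/* ... fill the global vector g ... */

/* scatter global values into the local (ghosted) representation */
DMGlobalToLocal(dm, g, INSERT_VALUES, l);

/* ... compute with l, including ghost points, accumulating local results ... */

/* sum the contributions from each rank back into the global vector */
DMLocalToGlobal(dm, l, ADD_VALUES, g);

VecDestroy(&l);
VecDestroy(&g);
```

`INSERT_VALUES` is the natural mode going global-to-local (each ghost value has a unique owner), while `ADD_VALUES` going local-to-global sums the overlapping contributions that neighboring ranks make to shared points.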