(ch_petscsection)=

# PetscSection: Connecting Grids to Data

The strongest links between solvers and discretizations are

- the relationship between the layout of data over a mesh (or similar structure) and the data layout in arrays and `Vec` used for computation,
- data partitioning, and
- ordering of data.

To enable modularity, we encode the operations above in simple data
structures that can be understood by the linear algebraic and solver components of PETSc (`Vec`, `Mat`, `KSP`, `PC`, `SNES`, `TS`, `Tao`, `PetscRegressor`)
without explicit reference to the mesh (topology) or discretization (analysis).

While `PetscSection` is currently only employed for `DMPlex`, `DMForest`, and `DMNetwork` mesh descriptions, much of its operation is general enough to be utilized for other types of discretizations.
This section explains the basic concepts of a `PetscSection` that generalize to other mesh descriptions.

(sec_petscsection_concept)=

## General concept

Specific entries (or collections of entries) in a `Vec` (or a simple array) can be associated with a "location" on a mesh (or other type of data structure) using the `PetscSection` object.
A **point** is a `PetscInt` that serves as an abstract "index" into arrays from iterable sets, such as the k-cells in a mesh.
Other iterable sets can be as simple as the points of a finite difference grid or the cells of a finite volume grid, or as complex as the topological entities of an unstructured mesh (cells, faces, edges, and vertices).

At its most basic, a `PetscSection` is a mapping between the mesh points and a tuple `(ndof, offset)`, where `ndof` is the number of values stored at that mesh point and `offset` is the location of that data in the array.
So, given the tuple for a mesh point, its data can be accessed by `array[offset + d]`, where `d` in `[0, ndof)` is the dof to access.

### Charts: Defining mesh points

The mesh points for a `PetscSection` must be contiguously numbered and are defined to be in some range $[\mathrm{pStart}, \mathrm{pEnd})$, which is called a **chart**.
The chart of a `PetscSection` is set via `PetscSectionSetChart()`.
Note that even though the mesh points must be contiguously numbered, the indices into the array (defined by each `(ndof, offset)` tuple) associated with the `PetscSection` need not be.
In other words, there may be elements in the array that are not associated with any mesh point, though this is not often the case.

### Defining the (ndof, offset) tuple

Defining the `(ndof, offset)` tuple for each mesh point generally starts with setting the `ndof` for each point, which is done using `PetscSectionSetDof()`.
This associates a set of degrees of freedom (dof), a small space $\{e_k \mid 0 \le k < \mathrm{ndof}\}$, with every point.
If `ndof` is not set for a mesh point, it is assumed to be 0.

The offset for each mesh point is usually set automatically by `PetscSectionSetUp()`, which concatenates each mesh point's dofs together in the order of the mesh points.
This concatenation can be done in a different order by setting a permutation, as described in {any}`sec_petscsection_permutation`.

Alternatively, the offset for each mesh point can be set manually with `PetscSectionSetOffset()`, though this is not commonly needed.

Once the tuples are created, the `PetscSection` is ready to use.

### Basic Setup Example

To summarize, the sequence for constructing a basic `PetscSection` is the following:

1. Specify the range of points, or chart, with `PetscSectionSetChart()`.
2. Specify the number of dofs per point with `PetscSectionSetDof()`. Any values not set will be zero.
3. Set up the `PetscSection` with `PetscSectionSetUp()`.

## Multiple Fields

In many discretizations, it is useful to differentiate between the different kinds of dofs present on a mesh.
For example, a dof attached to a cell point might represent pressure, while dofs on vertices might represent velocity or displacement.
A `PetscSection` can represent this additional structure with what are called **fields**.
Fields are indexed contiguously from `[0, num_fields)`.
To set the number of fields for a `PetscSection`, call `PetscSectionSetNumFields()`.

Internally, each field is stored in a separate `PetscSection`.
In fact, all the concepts and functions presented in {any}`sec_petscsection_concept` were actually applied to the **default field**, which is indexed as `0`.
The fields inherit the same chart as the "parent" `PetscSection`.

### Setting Up Multiple Fields

Setup for a `PetscSection` with multiple fields is nearly identical to setup for a single field.

The sequence for constructing such a `PetscSection` is the following:

1. Specify the range of points, or chart, with `PetscSectionSetChart()`. All fields share the same chart.
2. Specify the number of fields with `PetscSectionSetNumFields()`.
3. Set the number of dofs for each point on each field with `PetscSectionSetFieldDof()`. Any values not set will be zero.
4. Set the **total** number of dofs for each point with `PetscSectionSetDof()`. This value must be greater than or equal to the sum of the values set with
   `PetscSectionSetFieldDof()` at that point. Again, values not set will be zero.
5. Set up the `PetscSection` with `PetscSectionSetUp()`.

### Point Major or Field Major

A `PetscSection` with one field and offsets set in `PetscSectionSetUp()` may be thought of as defining a two-dimensional array, indexed by point in the outer dimension, with a variable-length inner dimension indexed by the dofs at that point: $v[\mathrm{pStart} \le point < \mathrm{pEnd}][0 \le dof < \mathrm{ndof}]$ [^petscsection-footnote].

With multiple fields, this array becomes three dimensional, with the outer dimensions indexed by both mesh points and fields.
Thus, there is a choice of whether to index by points first or by fields first.
In other words, will the array be laid out in a point-major or a field-major fashion?

Point-major ordering corresponds to $v[\mathrm{pStart} \le point < \mathrm{pEnd}][0 \le field < \mathrm{num\_fields}][0 \le dof < \mathrm{ndof}]$.
All the dofs for each mesh point are stored contiguously, meaning the fields are **interlaced**.
Field-major ordering corresponds to $v[0 \le field < \mathrm{num\_fields}][\mathrm{pStart} \le point < \mathrm{pEnd}][0 \le dof < \mathrm{ndof}]$.
All the dofs for each field are stored contiguously, meaning the points are **interlaced**.

Consider a `PetscSection` with 2 fields and 2 points (from 0 to 2). Let the 0th field have `ndof=1` for each point and the 1st field have `ndof=2` for each point.
Denote each array entry $(p_i, f_i, d_i)$, where $p_i$ is the $i$th point, $f_i$ is the $i$th field, and $d_i$ is the $i$th dof.

Point-major ordering would result in:

$$
[(p_0, f_0, d_0), (p_0, f_1, d_0), (p_0, f_1, d_1),\\ (p_1, f_0, d_0), (p_1, f_1, d_0), (p_1, f_1, d_1)]
$$

Conversely, field-major ordering would result in:

$$
[(p_0, f_0, d_0), (p_1, f_0, d_0),\\ (p_0, f_1, d_0), (p_0, f_1, d_1), (p_1, f_1, d_0), (p_1, f_1, d_1)]
$$

Note that the dofs at a given point and field are always contiguous, regardless of the outer dimensional ordering.

Selecting the ordering is done with `PetscSectionSetPointMajor()`, where `PETSC_TRUE` selects point-major and `PETSC_FALSE` selects field-major.

**NOTE:** The current default is point-major, and many operations on `DMPlex` only work with this ordering. Field-major ordering is provided mainly for compatibility with external packages, such as LibMesh.

## Working with data

Once a `PetscSection` has been created, one can use `PetscSectionGetStorageSize()` to determine the total number of entries that can be stored in an array or `Vec` accessible by the `PetscSection`.
This is most often used when creating a new `Vec` for a `PetscSection`, such as:

```
PetscSectionGetStorageSize(s, &n);
VecSetSizes(localVec, n, PETSC_DETERMINE);
VecSetFromOptions(localVec);
```

The memory locations in the associated array are found using an **offset**, which can be obtained with:

Single-field `PetscSection`:

```
PetscSectionGetOffset(PetscSection s, PetscInt point, PetscInt *offset);
```

Multi-field `PetscSection`:

```
PetscSectionGetFieldOffset(PetscSection s, PetscInt point, PetscInt field, PetscInt *offset);
```

The value in the array is then accessed with `array[offset + d]`, where `d` in `[0, ndof)` is the dof to access.

## Global Sections: Constrained and Distributed Data

To handle distributed data and data with constraints, we use a pair of `PetscSection`s called the `localSection` and the `globalSection`.
Their use for each case is described below.

### Distributed Data

`PetscSection` can be applied to distributed problems as well.
This is done using the same local/global system described in {any}`sec_localglobal`.
To do this, we introduce four new concepts: a `localSection`, a `globalSection`, a `pointSF`, and a `sectionSF`.

Assume the mesh points of the "global" mesh are partitioned among processes and that some mesh points are shared between multiple processes (i.e., there is an overlap in the partitions).
The shared mesh points define the ghost/halo points needed in many PDE problems.
For each shared mesh point, appoint one process to be the owner of that mesh point.
To describe this parallel mesh point layout, we use a `PetscSF` and call it the `pointSF`.
The `pointSF` records which mesh points are present on which processes and which process owns each shared mesh point.

Next, for each process, define a `PetscSection` that describes the mapping between that process's partition (including shared mesh points) and the data stored on it; call it the `localSection`.
The `localSection` describes the layout of the local vector.
To generate the `globalSection`, we use `PetscSectionCreateGlobalSection()`, which takes the `localSection` and the `pointSF` as inputs.
The global section returns $-(dof+1)$ for the number of dofs on an unowned (ghost) point and, traditionally, $-(off+1)$ for its offset on the owning process.
This behavior of the offsets is controlled via an argument to `PetscSectionCreateGlobalSection()`.
The `globalSection` can be used to create global vectors, just as the local section is used to create local vectors.

To perform the global-to-local and local-to-global communication, we define the `sectionSF` to be the `PetscSF` describing the mapping between the local and global vectors.
This is generated via `PetscSFSetGraphSection()`.
Using `PetscSFBcastBegin()` will send data from the global vector to the local vector, while `PetscSFReduceBegin()` will send data from the local vector to the global vector.

If using a `DM`, this entire process is done automatically.
The `localSection`, `globalSection`, `pointSF`, and `sectionSF` on a `DM` can be obtained via `DMGetLocalSection()`, `DMGetGlobalSection()`, `DMGetPointSF()`, and `DMGetSectionSF()`, respectively.
Additionally, communication from global to local vectors and vice versa can be done via `DMGlobalToLocal()` and `DMLocalToGlobal()`, as described in {any}`sec_localglobal`.
Note that not all `DM` types use this system, for example `DMDA` (see {any}`sec_struct`).

### Constrained Data

In addition to describing parallel data, the `localSection`/`globalSection` pair can be used to describe *constrained* dofs.
These constraints usually represent essential (Dirichlet) boundary conditions or algebraic constraints.
They are dofs that have a given fixed value, so they are present in local vectors for finite element/volume assembly or finite difference stencil application purposes, but are generally absent from global vectors since they are not unknowns in the algebraic solves.

Constraints should be indicated in the `localSection`.
Use `PetscSectionSetConstraintDof()` to set the number of constrained dofs for a given point, and `PetscSectionSetConstraintIndices()` to indicate which dofs on the given point are constrained.
This must be done before `PetscSectionCreateGlobalSection()` is called to create the `globalSection`.

Note that it is possible to have constraints set in a `localSection` yet generate a `globalSection` that still includes those constrained dofs.
This is useful when doing some form of post-processing of a solution where you want to access all data (see `DMGetOutputDM()` for example).
See `PetscSectionCreateGlobalSection()` for more details.

(sec_petscsection_permutation)=

## Permutation: Changing the order of array data

By default, when `PetscSectionSetUp()` is called, the data laid out in the associated array is assumed to be in the same order as the grid points.
For example, the dofs associated with grid point 0 appear directly before those of grid point 1, which appear before those of grid point 2, and so on.

It may be desirable to have a different ordering of data in the array than the order of the grid points defined by the section.
For example, one may want grid points associated with the boundary of a domain to appear before points associated with the interior of the domain.

This can be accomplished either by changing the indices of the grid points themselves or by informing the section of the change in array ordering.
Either method uses an `IS` to define the permutation.

To change the indices of the grid points, call `PetscSectionPermute()` to generate a new `PetscSection` with the desired grid point permutation.

To change just the array layout without changing the grid point indexing, call `PetscSectionSetPermutation()`.
This must be called before `PetscSectionSetUp()` and only affects the calculation of the offsets for each grid point.

% TODO: Add example to demonstrate the difference between the two permutation methods

## DMPlex Specific Functionality: Obtaining data from the array

A vanilla `PetscSection` (what has been described up to now) gives a relatively naive perspective on the underlying data; it does not describe how dofs attached to a single grid point are ordered or how different grid points relate to each other.
A `PetscSection` can store and use this extra information in the form of **closures**, **symmetries**, and **closure permutations**.
These features currently target `DMPlex` and other unstructured grid descriptions.
A description of those features is left to {any}`ch_unstructured`.

```{rubric} Footnotes
```

[^petscsection-footnote]: A `PetscSection` can be thought of as a generalization of `PetscLayout`, in the same way that a fiber bundle is a generalization
    of the normal Euclidean basis used in linear algebra.
    With `PetscLayout`, we associate a unit vector ($e_i$) with every
    point in the space, and just divide up the points between processes.
    Conversely, `PetscSection` associates multiple unit vectors with every mesh point (one for each dof) and divides the mesh points between processes using a `PetscSF` to define the distribution.

```{eval-rst}
.. bibliography:: /petsc.bib
   :filter: docname in docnames
```