#pragma once

/* SUBMANSEC = IS */

/*S
   IS - Abstract PETSc object used for efficient indexing into vectors and matrices

   Level: beginner

.seealso: `ISType`, `ISCreateGeneral()`, `ISCreateBlock()`, `ISCreateStride()`, `ISGetIndices()`, `ISDestroy()`
S*/
typedef struct _p_IS *IS;

/*S
   ISLocalToGlobalMapping - mapping from a local ordering (0 to n-1 on each individual MPI process) to a global
   PETSc ordering (across collections of MPI processes) used by a vector or matrix.

   Level: intermediate

   Note:
   Mapping from local to global is scalable, but global to local may not be if the range of global values
   represented locally is very large. `ISLocalToGlobalMappingType` provides alternative ways of efficiently
   applying `ISGlobalToLocalMappingApply()`.

   Developer Note:
   `ISLocalToGlobalMapping` is actually a private object; it is included here so that
   `ISLocalToGlobalMappingApply()`, which is used so often, can be inlined.

.seealso: `ISLocalToGlobalMappingCreate()`, `ISLocalToGlobalMappingApply()`, `ISLocalToGlobalMappingDestroy()`, `ISGlobalToLocalMappingApply()`
S*/
typedef struct _p_ISLocalToGlobalMapping *ISLocalToGlobalMapping;

/*S
   ISColoring - a set of `IS`s that define a coloring of something, such as a graph defined by a sparse matrix

   Level: intermediate

   Notes:
   One should not access the *is records below directly because they may not yet have been created. Use
   `ISColoringGetIS()` to make sure they are created when needed.

   When the coloring type is `IS_COLORING_LOCAL` the coloring is in the local ordering of the unknowns, that is,
   it matches the local (ghosted) vector; a local-to-global mapping must be applied to map the colors to the
   global ordering.

   Developer Note:
   This is not a `PetscObject`

.seealso: `IS`, `MatColoringCreate()`, `MatColoring`, `ISColoringCreate()`, `ISColoringGetIS()`, `ISColoringView()`
S*/
typedef struct _n_ISColoring *ISColoring;

/*S
   PetscLayout - defines the layout of vectors and matrices (that is, the "global" numbering of vector and matrix
   entries) across MPI processes, in other words which rows are owned by which processes

   Level: developer

   Notes:
   PETSc vectors (`Vec`) have a global number associated with each vector entry. The first MPI process that shares
   the vector owns the first `n0` entries of the vector, the second MPI process the next `n1` entries, and so on.
   A `PetscLayout` is a way of managing this information; for example, the number of locally owned entries is
   provided by `PetscLayoutGetLocalSize()` and the range of indices owned by a given MPI process is provided by
   `PetscLayoutGetRange()`.

   Each PETSc `Vec` contains a `PetscLayout` object, which can be obtained with `VecGetLayout()`. For convenience
   `Vec` provides an API to access the layout information directly, for example with `VecGetLocalSize()` and
   `VecGetOwnershipRange()`.

   Similarly, PETSc matrices have layouts; these are discussed in [](ch_matrices).

.seealso: `PetscLayoutCreate()`, `PetscLayoutDestroy()`, `PetscLayoutGetRange()`, `PetscLayoutGetLocalSize()`, `PetscLayoutGetSize()`,
          `PetscLayoutGetBlockSize()`, `PetscLayoutGetRanges()`, `PetscLayoutFindOwner()`, `PetscLayoutFindOwnerIndex()`,
          `VecGetLayout()`, `VecGetLocalSize()`, `VecGetOwnershipRange()`
S*/
typedef struct _n_PetscLayout *PetscLayout;