Search results for `refs:PCGAMG` (19 of 19 matches, sorted by relevance)
/petsc/include/
  petscpctypes.h:67  #define PCGAMG "gamg" macro
|
/petsc/doc/changes/
  315.md:146  which to reduce active processors on coarse grids in `PCGAMG` that
  315.md:156  using `PCMG` and `PCGAMG`
  315.md:163  solvers with inner solvers such as `PCMG`, `PCGAMG`, `PCFIELDSPLIT`.
  315.md:165  solvers with inner solvers such as `PCMG`, `PCGAMG`, `PCFIELDSPLIT`.
|
  311.md:57  previous has. Note that the multigrid solvers (PCMG, PCGAMG, PCML)
|
  35.md:81  - PCGAMG default smoother changed from PCJACOBI to PCSOR.
|
  317.md:154  - Change `PCGAMG` default to use `PCJACOBI` smoothing instead of `PCSOR`. This also allows the defa…
  317.md:155  - Change `PCGAMG` eigenvalue estimation to use `KSPCG` when `MAT_SPD` has been set (see `MatSetOpti…
  317.md:156  - Change `PCGAMG` to use `PCGAMGSetUseSAEstEig()` by default when the smoother uses Jacobi precondi…
|
  322.md:97  - Change the option database keys for coarsening for `PCGAMG` to use the prefix `-pc_gamg_`, for ex…
|
  321.md:110  - Change API for several PetscCD methods used internally in `PCGAMG` and `MatCoarsen` (eg, change `…
|
/petsc/src/ksp/ksp/tests/
  lostnullspace.c:78  PetscCall(PCSetType(pc_mech, PCGAMG)); in main()
|
/petsc/src/ksp/pc/interface/
  pcregis.c:120  PetscCall(PCRegister(PCGAMG, PCCreate_GAMG)); in PCRegisterAll()
|
/petsc/src/ksp/ksp/tutorials/
  ex4.c:116  PetscCall(PCHMGSetInnerPCType(pc, PCGAMG)); in main()
|
/petsc/doc/manual/
  ksp.md:713   multiple multigrid solvers/preconditioners including `PCMG`, `PCGAMG`, `PCHYPRE`,
  ksp.md:749   - ``PCGAMG``
  ksp.md:1076  PETSc has a native algebraic multigrid preconditioner `PCGAMG` –
  ksp.md:1086  construction. `PCGAMG` is designed from the beginning to be modular, to
  ksp.md:1090  (`-pc_type gamg -pc_gamg_type agg` or `PCSetType(pc,PCGAMG)` and
  ksp.md:1094  {cite}`isaacstadlerghattas2015` supports 2D coarsenings extruded in the third dimension. `PCGAMG` d…
  ksp.md:1169  `PCGAMG` provides unsmoothed aggregation (`-pc_gamg_agg_nsmooths 0`) and
  ksp.md:1234  code using `PCGAMG` with `-help` to get a full listing of GAMG
  ksp.md:1251  The parallel MIS algorithms require symmetric weights/matrices. Thus `PCGAMG`
  ksp.md:1281  **Troubleshooting algebraic multigrid methods:** If `PCGAMG`, *ML*, *AMGx* or [all …]
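The ksp.md matches above show the two ways the manual selects GAMG: at run time through the options database, or in code via `PCSetType(pc,PCGAMG)`. A minimal sketch of the option usage, assembled only from option names appearing in these snippets (the combination shown is illustrative, not a recommended tuning):

```
# Select PETSc's native algebraic multigrid with aggregation (ksp.md:1090)
-pc_type gamg -pc_gamg_type agg
# Request unsmoothed aggregation instead of the smoothed default (ksp.md:1169)
-pc_gamg_agg_nsmooths 0
# Print the full listing of GAMG options for a running code (ksp.md:1234)
-help
```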
|
  streams.md:242  …ear system is solved with the PETSc algebraic multigrid preconditioner, `PCGAMG` and Krylov accele…
  streams.md:244  `-da_refine 6 -pc_type gamg -log_view`. This study did not attempt to tune the default `PCGAMG` par…
|
/petsc/src/ksp/pc/impls/hmg/
  hmg.c:130  PetscCall(PetscStrallocpy(PCGAMG, &hmg->innerpctype)); in PCSetUp_HMG()
|
/petsc/doc/overview/
  linear_solve_table.md:149  - ``PCGAMG``
|
/petsc/src/binding/petsc4py/src/petsc4py/PETSc/
  petscpc.pxi:40  PetscPCType PCGAMG
|
  PC.pyx:41  GAMG = S_(PCGAMG)
|
/petsc/src/ksp/pc/impls/gamg/
  gamg.c:1926  PetscCall(PetscObjectChangeTypeName((PetscObject)pc, PCGAMG)); in PCCreate_GAMG()
|
/petsc/src/ts/tutorials/
  ex30.c:1769  PetscCall(PCSetType(pc, PCGAMG)); in SetInitialConditionsAndTolerances()
|
/petsc/doc/faq/
  index.md:264  - Parts of most preconditioners run directly on the GPU. After setup, `PCGAMG` runs
|