Searched refs:PCMPI (Results 1 – 12 of 12) sorted by relevance
/petsc/include/petscpctypes.h
    80: #define PCMPI "mpi"  [macro]

/petsc/src/ksp/ksp/impls/preonly/preonly.c
    19: PetscCall(PetscObjectTypeCompareAny((PetscObject)ksp->pc, &flg, PCREDISTRIBUTE, PCMPI, ""));  [in KSPSolve_PREONLY()]

/petsc/doc/manual/about_this_manual.md
    37: by utilizing modest numbers of MPI processes. See `PCMPI` for details on how to

/petsc/doc/manual/streams.md
    290: `PCMPI` has two approaches for distributing the linear system. The first uses `MPI_Scatterv()` to c…
    292: the second communication mechanism is Unix shared memory `shmget()`. Here, `PCMPI` allocates shared…

/petsc/doc/manual/ksp.md
    2652: See `PCMPI`, `PCMPIServerBegin()`, and `PCMPIServerEnd()` for more details on the solvers.
    2661: See {any}`sec_pcmpi_study` for a study of the use of `PCMPI` on a specific PETSc application.

/petsc/src/ksp/pc/interface/pcregis.c
    160: PetscCall(PCRegister(PCMPI, PCCreate_MPI));  [in PCRegisterAll()]

/petsc/src/ksp/ksp/interface/itfunc.c
    342: PetscCall(PetscObjectTypeCompare((PetscObject)pc, PCMPI, &pcmpi));  [in KSPSetUp()]
    1093: PetscCall(PetscObjectTypeCompare((PetscObject)ksp->pc, PCMPI, &isPCMPI));  [in KSPSolve()]
    2224: PetscCall(PetscObjectTypeCompare((PetscObject)ksp->pc, PCMPI, &isPCMPI));  [in KSPCheckPCMPI()]
    2233: PetscCall(PCSetType(ksp->pc, PCMPI));  [in KSPCheckPCMPI()]

/petsc/src/ksp/ksp/interface/itcl.c
    47: PetscCall(PetscObjectTypeCompare((PetscObject)ksp->pc, PCMPI, &ispcmpi));  [in KSPSetOptionsPrefix()]

/petsc/doc/overview/linear_solve_table.md
    241: - ``PCMPI``

/petsc/doc/changes/320.md
    164: - Refactor `PCMPI` to be a private system used automatically when `-mpi_linear_solver_server` is us…

/petsc/src/ksp/pc/impls/mpi/pcmpi.c
    975: PetscCall(PetscObjectChangeTypeName((PetscObject)pc, PCMPI));  [in PCCreate_MPI()]

/petsc/doc/faq/index.md
    1205: Another approach that allows using a PETSc parallel solver is to use `PCMPI`.
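
Taken together, the hits show how `PCMPI` is wired in: it is registered in PCRegisterAll() (pcregis.c), created via PCCreate_MPI() (pcmpi.c), and, per doc/changes/320.md, used automatically when the `-mpi_linear_solver_server` option is given. A minimal launch sketch for that server mode follows; the executable name `./app` and the process count are illustrative assumptions, not taken from the source:

```
# Hypothetical launch: run a PETSc application with the MPI linear solver server enabled.
# Only the -mpi_linear_solver_server option is attested in the search hits above.
mpiexec -n 4 ./app -mpi_linear_solver_server
```

In this mode the application itself runs sequentially while the linear solves are distributed across the MPI processes; see the `PCMPIServerBegin()`/`PCMPIServerEnd()` references in doc/manual/ksp.md for the server lifecycle.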