Lines Matching refs:leafdata
1391 `PetscSF` communicates between `rootdata` and `leafdata` arrays. `rootdata` is distributed across t…
1403 …leafdata` is similar to PETSc local vectors; each MPI process's `leafdata` array can contain "ghos…
1416 // provide the matching location in rootdata for each entry in leafdata
1444 PetscInt *rootdata, *leafdata;
1454 Finally, we use the `PetscSF` to communicate `rootdata` to `leafdata`:
1457 PetscSFBcastBegin(sf, MPIU_INT, rootdata, leafdata, MPI_REPLACE);
1458 PetscSFBcastEnd(sf, MPIU_INT, rootdata, leafdata, MPI_REPLACE);
1461 Now `leafdata` on MPI rank 0 contains (1, 3, 2) and on MPI rank 1 contains (1, 2, 3).
1463 It is also possible to move `leafdata` to `rootdata` using
1466 PetscSFReduceBegin(sf, MPIU_INT, leafdata, rootdata, MPIU_SUM);
1467 PetscSFReduceEnd(sf, MPIU_INT, leafdata, rootdata, MPIU_SUM);
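The two directions can be sketched as one minimal program. The graph here (two roots and three leaves, all on rank 0, so it also runs on a single process) and the data values are invented for illustration; they are not the example from the surrounding text:

```c
#include <petscsf.h>

int main(int argc, char **argv)
{
  PetscSF     sf;
  PetscInt    rootdata[2] = {1, 2}, leafdata[3];
  PetscSFNode iremote[3];

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* leaves 0 and 2 reference root 0; leaf 1 references root 1 (all on rank 0) */
  iremote[0].rank = 0; iremote[0].index = 0;
  iremote[1].rank = 0; iremote[1].index = 1;
  iremote[2].rank = 0; iremote[2].index = 0;
  PetscCall(PetscSFCreate(PETSC_COMM_WORLD, &sf));
  /* 2 roots, 3 leaves; NULL ilocal means leaves are contiguous from 0 */
  PetscCall(PetscSFSetGraph(sf, 2, 3, NULL, PETSC_COPY_VALUES, iremote, PETSC_COPY_VALUES));

  /* root -> leaf: each leaf receives a copy of its root's value */
  PetscCall(PetscSFBcastBegin(sf, MPIU_INT, rootdata, leafdata, MPI_REPLACE));
  PetscCall(PetscSFBcastEnd(sf, MPIU_INT, rootdata, leafdata, MPI_REPLACE));
  /* on one rank, leafdata is now (1, 2, 1) */

  /* leaf -> root: each leaf's contribution is summed into its root's entry */
  PetscCall(PetscSFReduceBegin(sf, MPIU_INT, leafdata, rootdata, MPIU_SUM));
  PetscCall(PetscSFReduceEnd(sf, MPIU_INT, leafdata, rootdata, MPIU_SUM));

  PetscCall(PetscSFDestroy(&sf));
  PetscCall(PetscFinalize());
  return 0;
}
```

Note that `MPIU_SUM` accumulates into the existing `rootdata` values rather than overwriting them; `MPI_REPLACE` can be passed instead for a plain overwrite.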
1476 ## Non-contiguous storage of leafdata
1478 In the example above we treated the `leafdata` as sitting in a contiguous array with entries from 0…
1479 `NULL` argument in the call to `PetscSFSetGraph()`. More generally, the `leafdata` array can have en…
1496 …ee entries of `leafdata` affected by `PetscSF` communication on MPI rank 0 are the array locations…
1497 … values from the three roots listed in `roots` are placed backwards in `leafdata`. Note that provi…
1501 MPI rank 1 (3, 2, 1) where x indicates the previous value in `leafdata` that was unchanged.
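Non-contiguous leaf storage can be sketched by passing an `ilocal` array to `PetscSFSetGraph()`, so that leaf `i` lands at `leafdata[ilocal[i]]` instead of `leafdata[i]`. The sizes, indices, and values below are hypothetical, chosen so the sketch runs on a single process:

```c
#include <petscsf.h>

int main(int argc, char **argv)
{
  PetscSF     sf;
  PetscInt    ilocal[3]   = {5, 3, 7};      /* leaf i is stored at leafdata[ilocal[i]] */
  PetscInt    rootdata[3] = {10, 20, 30};
  PetscInt    leafdata[9];
  PetscSFNode iremote[3];

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  for (PetscInt i = 0; i < 9; i++) leafdata[i] = -1; /* untouched entries keep -1 */
  for (PetscInt i = 0; i < 3; i++) { iremote[i].rank = 0; iremote[i].index = i; }
  PetscCall(PetscSFCreate(PETSC_COMM_WORLD, &sf));
  /* 3 roots, 3 leaves, scattered inside a leafdata array of length 9 */
  PetscCall(PetscSFSetGraph(sf, 3, 3, ilocal, PETSC_COPY_VALUES, iremote, PETSC_COPY_VALUES));
  PetscCall(PetscSFBcastBegin(sf, MPIU_INT, rootdata, leafdata, MPI_REPLACE));
  PetscCall(PetscSFBcastEnd(sf, MPIU_INT, rootdata, leafdata, MPI_REPLACE));
  /* leafdata[5] == 10, leafdata[3] == 20, leafdata[7] == 30; all other entries stay -1 */
  PetscCall(PetscSFDestroy(&sf));
  PetscCall(PetscFinalize());
  return 0;
}
```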
1507 `rootdata` and `leafdata` can live in either CPU or GPU memory. The `PetscSF` routines autom…
1511 ## Gathering leafdata but not reducing it
1513 One may wish to gather the entries of the `leafdata` for each root but not reduce them to a single …
1516 PetscSFGatherBegin(sf, MPIU_INT, leafdata, multirootdata);
1517 PetscSFGatherEnd(sf, MPIU_INT, leafdata, multirootdata);
1529 The data in `multirootdata` can be communicated to `leafdata` using
1532 PetscSFScatterBegin(sf, MPIU_INT, multirootdata, leafdata);
1533 PetscSFScatterEnd(sf, MPIU_INT, multirootdata, leafdata);
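Sizing `multirootdata` requires knowing how many leaves point at each root, which `PetscSFComputeDegreeBegin()`/`PetscSFComputeDegreeEnd()` provide. A sketch with an invented single-rank graph (two leaves on root 0, one on root 1):

```c
#include <petscsf.h>

int main(int argc, char **argv)
{
  PetscSF         sf;
  const PetscInt *degree;
  PetscInt        nroots = 2, nmulti = 0;
  PetscInt        leafdata[3] = {1, 2, 3}, *multirootdata;
  PetscSFNode     iremote[3];

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* leaves 0 and 1 reference root 0; leaf 2 references root 1 */
  iremote[0].rank = 0; iremote[0].index = 0;
  iremote[1].rank = 0; iremote[1].index = 0;
  iremote[2].rank = 0; iremote[2].index = 1;
  PetscCall(PetscSFCreate(PETSC_COMM_WORLD, &sf));
  PetscCall(PetscSFSetGraph(sf, nroots, 3, NULL, PETSC_COPY_VALUES, iremote, PETSC_COPY_VALUES));

  /* degree[i] = number of leaves attached to root i; the degree array is owned by the SF */
  PetscCall(PetscSFComputeDegreeBegin(sf, &degree));
  PetscCall(PetscSFComputeDegreeEnd(sf, &degree));
  for (PetscInt i = 0; i < nroots; i++) nmulti += degree[i];
  PetscCall(PetscMalloc1(nmulti, &multirootdata));

  /* every leaf contribution gets its own slot in multirootdata; nothing is reduced */
  PetscCall(PetscSFGatherBegin(sf, MPIU_INT, leafdata, multirootdata));
  PetscCall(PetscSFGatherEnd(sf, MPIU_INT, leafdata, multirootdata));

  PetscCall(PetscFree(multirootdata));
  PetscCall(PetscSFDestroy(&sf));
  PetscCall(PetscFinalize());
  return 0;
}
```

The contributions for each root are stored consecutively, so the scatter in the text above simply sends each slot back to the leaf it came from.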