
Searched refs:reduce (Results 1 – 23 of 23) sorted by relevance

/petsc/src/ksp/pc/impls/tfs/
gs.c 385 PetscInt **reduce; in gsi_via_bit_mask() local
409 reduce = gs->local_reduce; in gsi_via_bit_mask()
410 for (i = 0, t1 = 0; i < gs->num_local; i++, reduce++) { in gsi_via_bit_mask()
411 …if ((PCTFS_ivec_binary_search(**reduce, gs->pw_elm_list, gs->len_pw_list) >= 0) || PCTFS_ivec_bina… in gsi_via_bit_mask()
416 **reduce = map[**reduce]; in gsi_via_bit_mask()
820 PetscInt *num, *map, **reduce; in PCTFS_gs_gop_local_out() local
825 reduce = gs->gop_local_reduce; in PCTFS_gs_gop_local_out()
826 while ((map = *reduce++)) { in PCTFS_gs_gop_local_out()
849 PetscInt *num, *map, **reduce; in PCTFS_gs_gop_local_plus() local
854 reduce = gs->local_reduce; in PCTFS_gs_gop_local_plus()
[all …]
/petsc/config/BuildSystem/config/utilities/
FPTrap.py 3 from functools import reduce
35 …if reduce(lambda x,y: x and y, map(self.functions.check, ['fp_sh_trap_info', 'fp_trap', 'fp_enable…
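The FPTrap.py match above folds a list of boolean feature checks with `functools.reduce`. A minimal standalone sketch of that pattern, where `check` and the behavior it tests are illustrative stand-ins (not BuildSystem's actual `self.functions.check`):

```python
from functools import reduce

# Stand-in for self.functions.check in BuildSystem: "is this symbol available?"
def check(name):
    return name.startswith("fp_")

names = ['fp_sh_trap_info', 'fp_trap', 'fp_enable']

# Fold the per-name results with logical AND, as in the FPTrap.py snippet.
all_present = reduce(lambda x, y: x and y, map(check, names))

# The built-in all() expresses the same fold more idiomatically:
assert all_present == all(map(check, names))
```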
/petsc/config/BuildSystem/
graph.py 3 from functools import reduce
19 …return 'DirectedGraph with '+str(len(self.vertices))+' vertices and '+str(reduce(lambda k,l: k+l, …
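The graph.py match folds with `lambda k, l: k + l`, i.e. summation over a truncated argument (presumably a per-vertex count). A hedged sketch of the pattern with made-up data:

```python
from functools import reduce

# Sum per-vertex edge counts by folding with +, mirroring the
# reduce(lambda k, l: k + l, ...) call in graph.py (data here is made up).
edge_counts = [2, 0, 3, 1]
total_edges = reduce(lambda k, l: k + l, edge_counts, 0)

# sum(edge_counts) is the idiomatic built-in equivalent.
assert total_edges == sum(edge_counts)
```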
/petsc/src/binding/petsc4py/src/petsc4py/PETSc/
SF.pyx 464 """End a broadcast & reduce operation started with `bcastBegin`.
501 Values to reduce.
527 Values to reduce.
/petsc/doc/manual/
advanced.md 79 ordering for the matrix. The ordering generally is done to reduce fill
209 for numerical stability. This is because trying to both reduce fill and
regressor.md 149 constructs a linear model to reduce the sum of squared differences
ksp.md 643 thus effectively "hiding" the time of the reductions. In addition, they may reduce the number of gl…
1209 number of active processes on coarse grids to reduce communication costs
1211 costs down. Most AMG solvers reduce to just one active process on the
1213 the coarse grid on all processes to reduce communication
mat.md 1230 * - Ordering to reduce fill
1366 reduce fill in sparse matrix factorizations.
performance.md 528 `MatILUFactorSymbolic()` can reduce greatly the number of mallocs and
snes.md 949 are often introduced that significantly reduce these expenses and yet
vec.md 1513 One may wish to gather the entries of the `leafdata` for each root but not reduce them to a single …
tao.md 1274 sufficiently reduce the nonlinear objective function, then the step is
/petsc/src/mat/impls/aij/seq/mkl_pardiso/
mkl_pardiso.c 353 …rSchur_Private(Mat_MKL_PARDISO *mpardiso, PetscScalar *whole, PetscScalar *schur, PetscBool reduce) in MatMKLPardisoScatterSchur_Private() argument
356 if (reduce) { /* data given for the whole matrix */ in MatMKLPardisoScatterSchur_Private()
/petsc/doc/changes/
315.md 146 which to reduce active processors on coarse grids in `PCGAMG` that
2024.md 355 `VecNormEnd()`, which reduce communication overhead in parallel;
/petsc/doc/install/
install_tutorial.md 23 Don't need Fortran? Use `--with-fortran-bindings=0` to reduce the build times. If you
/petsc/doc/developers/
buildsystem.md 13 are mechanical operations that reduce to applying a construction rule to
/petsc/config/BuildSystem/config/
setCompilers.py 6 from functools import reduce
18 return reduce(lambda x,y:x or y,lst,False)
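The setCompilers.py match implements "any element is truthy" as a fold with logical OR and a `False` initializer, so an empty list yields `False`. A self-contained sketch of the same pattern (the function name here is illustrative):

```python
from functools import reduce

def any_true(lst):
    # Fold with logical OR; the False initializer handles the empty list,
    # exactly as in the setCompilers.py snippet.
    return reduce(lambda x, y: x or y, lst, False)

# The built-in any() is the modern equivalent of this fold.
assert any_true([False, True]) == any([False, True])
```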
/petsc/src/vec/vec/impls/seq/cupm/
vecseqcupm_impl.hpp 2098 …#define THRUST_MINMAX_REDUCE(s, b, e, real_part__, ...) THRUST_CALL(thrust::reduce, s, b, e, __VA_… in MinMax_()
2187 PetscCallThrust(*sum = THRUST_CALL(thrust::reduce, stream, dptr, dptr + n, PetscScalar{0.0});); in Sum()
/petsc/src/ksp/ksp/tutorials/output/
ex2_help.out 287 …-pc_factor_mat_ordering_type <now natural : formerly natural>: Reordering to reduce nonzeros in fa…
/petsc/lib/petsc/bin/maint/
toclapack.sh 2522 /* If this is true, then we need to reduce EMAX by one because */
4513 /* If this is true, then we need to reduce EMAX by one because */
/petsc/doc/faq/
index.md 126 ### Does all the PETSc error checking and logging reduce PETSc's efficiency?
321 ### The PETSc distribution is SO Large. How can I reduce my disk space usage?
/petsc/share/petsc/datafiles/meshes/
testcase3D.cas 238 (mixing-plane/reduce-backflow? #f)
5470 (morpher/reduce-bb-factor 0.8)
7543 (cache-flush/target/reduce-by-mb 0)