History log of /petsc/src/sys/objects/pinit.c (Results 301 – 325 of 1307)
Revision Date Author Comments
# 605fe76d 31-May-2021 Satish Balay <balay@mcs.anl.gov>

Merge branch 'stefanozampini/add-proper-datatypes-for-composite-reductions' into 'main'

MPI: define proper datatypes for composite reductions

See merge request petsc/petsc!4046


# 092991ac 31-May-2021 Stefano Zampini <stefano.zampini@gmail.com>

MPI: define proper datatypes for composite reductions

this was a long-standing bug, dating back to 1996 or so


# 2a27bf02 30-Apr-2021 Stefano Zampini <stefano.zampini@gmail.com>

Fix compiler warnings


# 3c124016 24-May-2021 Satish Balay <balay@mcs.anl.gov>

Merge branch 'barry/2021-04-12/support-for-nvhpc' into 'main'

These are fixes needed for clean builds with NVIDIA's HPC software toolkit

See merge request petsc/petsc!3882


# 5162e2cf 24-May-2021 Barry Smith <bsmith@mcs.anl.gov>

Various fixes for NVIDIA's nvhpc toolkit

The nvhpc system separates the CUDA math libraries and includes from the regular CUDA material,
thus cuda.py has to detect and manage this separation automatically.

Note also the nvhpc includes the MPI compilers in another directory which should be used with MPI.

Do not use outer C++ compiler as Kokkos compiler as that may not match the compiler used by nvcc, instead use the C++ compiler that nvcc uses. This is tricky because this compiler has to have access to the MPI headers since it sees all of the PETSc headers.

/spend 14h



# a8cf78f8 24-May-2021 Satish Balay <balay@mcs.anl.gov>

Merge branch 'barry/2021-05-16/fix-double-lines' into 'main'

Fix typos in source

See merge request petsc/petsc!3984


# 4e278199 16-May-2021 Barry Smith <bsmith@mcs.anl.gov>

Remove all double blank lines from source

Commit-type: petsc-style
/2h


# 7d01355a 05-May-2021 Satish Balay <balay@mcs.anl.gov>

Merge branch 'hongzh/add-cuda-event-timer' into 'main'

Add CUDA and HIP event timers

See merge request petsc/petsc!3852


# 9ffd0706 19-Dec-2020 Hong Zhang <hongzhang@anl.gov>

Use GPU event timer instead of CPU timer

- Support CUDA and HIP
- Initialize CUDA event timers along with CUDA
- Update dev changes
- Package-specific includes should not be placed in public include files.


# d7bfff7c 03-May-2021 Satish Balay <balay@mcs.anl.gov>

Merge remote-tracking branch 'origin/release'


# c75fb37d 30-Apr-2021 Satish Balay <balay@mcs.anl.gov>

Merge branch 'jose/release/fix-complex-float128' into 'release'

Fix a bug when configured with complex scalars and precision=__float128

See merge request petsc/petsc!3936


# c5481aee 30-Apr-2021 Jose E. Roman <jroman@dsic.upv.es>

Fix a bug when configured with complex scalars and precision=__float128

Failure was seen even in 'make check'


# 16f8cdc7 09-Apr-2021 Satish Balay <balay@mcs.anl.gov>

Merge remote-tracking branch 'origin/barry/2021-03-28/fix-mpiu-allreduce-mpi-failure'

Fix capture of MPI error code in MPIU_Allreduce()

See merge request petsc/petsc!3777


# 820f2d46 03-Apr-2021 Barry Smith <bsmith@mcs.anl.gov>

All MPIU_ functions except MPIU_File return MPI error codes for checking

Update checkbadSource to find use of CHKERRQ with MPIU_ functions

Commit-type: i.e. error-checking, optimization, bug-fix, portability-fix, testing-fix, style-fix, feature, documentation, example
Funded-by:
Project:
Time: hours
Reported-by:
Thanks-to:
Development Tools: Vim, Emacs, Eclipse


# 39e0b364 02-Apr-2021 Satish Balay <balay@mcs.anl.gov>

Merge branch 'haplav/remove-adios2' into 'main'

remove nonworking PETSCVIEWERADIOS2 impl

Closes #867

See merge request petsc/petsc!3791


# ed5e8ed5 29-Mar-2021 Vaclav Hapla <vaclav.hapla@erdw.ethz.ch>

remove nonworking PETSCVIEWERADIOS2 impl


# f19838c9 10-Mar-2021 Satish Balay <balay@mcs.anl.gov>

Merge branch 'jczhang/feature-nvshmem' into 'main'

Add SF NVSHMEM

See merge request petsc/petsc!3474


# 71438e86 06-Jan-2021 Junchao Zhang <jczhang@mcs.anl.gov>

Add SF NVSHMEM support


# a0e72f99 07-Jan-2021 Junchao Zhang <jczhang@mcs.anl.gov>

Add non-null PetscDefault{Cuda,Hip}Stream


# 1cd963cc 07-Mar-2021 Satish Balay <balay@mcs.anl.gov>

Merge branch 'balay/fix-mswin-C_DOUBLE_COMPLEX-check' into 'main'

MSWIN: set HAVE_MPI_C_DOUBLE_COMPLEX on successful compile, and always compile PetscSum_Local() etc MPI utilities

See merge request petsc/petsc!3672


# de272c7a 05-Mar-2021 Satish Balay <balay@mcs.anl.gov>

remove PETSC_HAVE_MPI_C_DOUBLE_COMPLEX


# 7a19d461 04-Mar-2021 Satish Balay <balay@mcs.anl.gov>

PETSC_HAVE_COMPLEX: set flag only if CLANGUAGE (i.e. library build language) supports complex, and cleanup logic wrt c/c++, library/user code.
Also cleanup dependent code (wrt PETSC_HAVE_COMPLEX usage).


# 91d58987 03-Mar-2021 Satish Balay <balay@mcs.anl.gov>

Merge branch 'release' into main


# 3c899108 03-Mar-2021 Satish Balay <balay@mcs.anl.gov>

Merge branch 'balay/fix-fujitsu-mpi-version-check' into 'release'

Openmpi: fix check for Fujitsu MPI

See merge request petsc/petsc!3671


# 16dc8964 02-Mar-2021 Satish Balay <balay@mcs.anl.gov>

OpenMPI: fix check for Fujitsu MPI

