
Searched +full:linux +full:- +full:viennacl (Results 1 – 6 of 6) sorted by relevance

/petsc/doc/overview/features.md
  5: - {ref}`HPC <doc_config_hpc>`
  6: - {ref}`Linux <doc_config_faq>`
  7: - {ref}`macOS <doc_config_faq>`
  8: - {ref}`Microsoft Windows <doc_windows>`
 12: - {ref}`Matrix/Vector CUDA support <doc_config_accel_cuda>`
 13: - {ref}`Kokkos support <doc_config_accel_kokkos>`
 14: - {ref}`Matrix/Vector OpenCL/ViennaCL support <doc_config_accel_opencl>`
 15: - {ref}`Matrix/Vector HIP support <doc_gpu_roadmap>`
/petsc/doc/install/install.md
  8: See the {ref}`quick-start tutorial <tut_install>` for a step-by-step walk-through of the installation p…
 25: $ ./config/examples/arch-ci-osx-dbg.py
 30: your recommendations to <mailto:petsc-maint@mcs.anl.gov>. See bug report {ref}`documentation
 34: - If you do not have a Fortran compiler or [MPICH](https://www.mpich.org/) installed
 38: $ ./configure --with-cc=gcc --with-cxx=0 --with-fc=0 --download-f2cblaslapack --download-mpich
 41: - Same as above, but install in a user-specified (prefix) location.
 44: …$ ./configure --prefix=/home/user/soft/petsc-install --with-cc=gcc --with-cxx=0 --with-fc=0 --down…
 47: - If [BLAS/LAPACK], MPI sources (in "-devel" packages in most Linux distributions) are already
 49: via `$PATH`, configure does not require any additional options.
 55: - If [BLAS/LAPACK], MPI are already installed in a known user location, use:
[all …]
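The configure invocations in this excerpt can be extended to enable the OpenCL/ViennaCL back end that the search query targets. A hedged sketch following the same pattern; the `--download-viennacl` and `--with-opencl` flags are my assumption of the relevant options (verify against `./configure --help` for your PETSc version), and compiler choices are illustrative:

```console
$ ./configure --with-cc=gcc --with-cxx=g++ --download-mpich \
    --download-viennacl --with-opencl
```

On systems where the OpenCL headers and library are not auto-detected, configure typically needs to be pointed at them explicitly; the exact flags vary by installation.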
/petsc/.gitlab-ci.yml
  2: # stage-1 take only a few minutes; they do not run the full test suite or external packages.
  4: # stage-2 runs on MCS systems and may take 10 to 15 minutes. They run the full test suite but with …
  6: # stage-3 runs on MCS systems and may take an hour or more. They run the full test suite and heavil…
  8: # The stage-(n) tests are only started if all of the stage-(n-1) tests run without error
 13: - stage-1
 14: - stage-2
 15: - stage-3
 16: - stage-4
 19: GIT_CLEAN_FLAGS: -ffdxq
 21: BASE_EXTRA_OPTIONS: -nox -nox_warning -malloc_dump
[all …]
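The gating described in the excerpt (stage-(n) starts only after all stage-(n-1) jobs pass) is GitLab CI's default behavior for jobs assigned to ordered stages. A minimal config sketch with hypothetical job names and placeholder commands, not taken from the actual PETSc pipeline:

```yaml
stages:
  - stage-1
  - stage-2

# All stage-1 jobs must succeed before any stage-2 job starts,
# mirroring the stage-(n)/stage-(n-1) gating in the comments above.
quick-checks:            # hypothetical job name
  stage: stage-1
  script: make check     # placeholder command

full-test-suite:         # hypothetical job name
  stage: stage-2
  script: make test      # placeholder command
```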
/petsc/doc/changes/314.md
  6: - Deprecate PetscIgnoreErrorHandler(), use PetscReturnErrorHandler()
  7: - Replace -debugger_nodes with -debugger_ranks
  8: - Change PETSCABORT() to abort instead of MPI_Abort if run under
  9:   -start_in_debugger
 10: - Add PETSC_MPI_THREAD_REQUIRED to control the requested threading
 12: - Add CUDA-11 support, but with CUDA-11,
 13:   -mat_cusparse_storage_format {ELL, HYB} are not supported anymore.
 15: - Add CUDA-11 option -mat_cusparse_spmv_alg {MV_ALG_DEFAULT,
 18: - Add CUDA-11 option -mat_cusparse_spmm_alg {ALG_DEFAULT, CSR_ALG1
 20: - Add CUDA-11 option -mat_cusparse_csr2csc_alg {ALG1 (default),
[all …]
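The renamed debugger option in this changelog is a runtime flag: `-debugger_ranks` selects which MPI ranks attach a debugger when `-start_in_debugger` is given. A sketch of typical usage; the executable name and rank list are illustrative, not from the changelog:

```console
$ mpiexec -n 4 ./app -start_in_debugger -debugger_ranks 0,2
```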
/petsc/doc/faq/index.md
 22: …e: A Guide to Good Style](https://www.cambridge.org/core/books/writing-scientific-software/2320670…
 32: - Fast, **low-latency** interconnect; any ethernet (even 10 GigE) simply cannot provide
 34: - High per-core **memory** performance. Each core needs to
 72: - [MPICH2 binding with the Hydra process manager](https://github.com/pmodels/mpich/blob/main/doc/wi…
 75: $ mpiexec.hydra -n 4 --binding cpu:sockets
 78: - [Open MPI binding](https://www.open-mpi.org/faq/?category=tuning#using-paffinity)
 81: $ mpiexec -n 4 --map-by socket --bind-to socket --report-bindings
 84: - `taskset`, part of the [util-linux](https://github.com/karelzak/util-linux) package
 89: - `numactl`
 92: policy. On Linux, the default policy is to attempt to find memory on the same memory bus
[all …]
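The last two tools named in the excerpt, `taskset` and `numactl`, are applied by wrapping the executable on the command line. A sketch; the core and NUMA-node numbers and the program name are illustrative:

```console
# Pin the process to cores 0-3 (util-linux taskset)
$ taskset -c 0-3 ./app

# Run on NUMA node 0's CPUs and allocate memory from node 0 (numactl)
$ numactl --cpunodebind=0 --membind=0 ./app
```

Binding memory to the same node as the CPUs matches the default Linux first-touch policy the FAQ describes: memory is preferentially allocated on the bus closest to the core that first touches it.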
/petsc/doc/petsc.bib
  3: % bibtool petsc.bib -- expand.macros=on -- print.line.length=100 -- pass.comments=on
 17: % LiteralHTML: <a name="nano"><H3><center>Nano-simulations</center></H3>
 19: @Misc{ semver-webpage,
 25:   title = {{D}-stability and {K}aps-{R}entrop-methods},
 30:   pages = {229--237},
 41:   pages = {93--113},
 55:   …title = {Generalized {R}unge--{K}utta methods of order four with stepsize control for sti…
 59:   pages = {55--68},
 73:   …gorithm for Structural Dynamics With Improved Numerical Dissipation: The Generalized-alpha Method},
 77:   pages = {371-375},
[all …]