
Searched refs:running (Results 1 – 25 of 55) sorted by relevance

/petsc/src/sys/tutorials/output/
ex7_0.out:3: [0:time_removed:PetscLogHandlerEventBegin_Ex7 ] Event "Event 1" started: now running 1 times
ex7_0.out:5: [0:time_removed:PetscLogHandlerEventBegin_Ex7 ] Event "Event 2" started: now running 1 times
ex7_0.out:7: [0:time_removed:PetscLogHandlerEventBegin_Ex7 ] Event "Event 1" started: now running 2 times
ex7_0.out:9: [0:time_removed:PetscLogHandlerEventEnd_Ex7 ] Event "Event 1" stopped: now running 1 times
ex7_0.out:10: [0:time_removed:PetscLogHandlerEventEnd_Ex7 ] Event "Event 2" stopped: now running 0 times
ex7_0.out:12: [0:time_removed:PetscLogHandlerEventEnd_Ex7 ] Event "Event 1" stopped: now running 0 times
/petsc/src/sys/tutorials/
ex7.c:24: PetscHMapI running; member
ex7.c:36: PetscCall(PetscHMapICreate(&ctx->running)); in HandlerCtxCreate()
ex7.c:47: PetscCall(PetscHMapIDestroy(&ctx->running)); in HandlerCtxDestroy()
ex7.c:76: PetscCall(PetscHMapIGetWithDefault(ctx->running, e, 0, &count)); in PetscLogHandlerEventBegin_Ex7()
ex7.c:80: PetscCall(PetscHMapISet(ctx->running, e, count)); in PetscLogHandlerEventBegin_Ex7()
ex7.c:94: PetscCall(PetscHMapIGetWithDefault(ctx->running, e, 0, &count)); in PetscLogHandlerEventEnd_Ex7()
ex7.c:98: PetscCall(PetscHMapISet(ctx->running, e, count)); in PetscLogHandlerEventEnd_Ex7()
ex7.c:190: PetscCall(PetscHMapIGetSize(ctx->running, &num_entries)); in PetscLogHandlerView_Ex7()
/petsc/src/ksp/ksp/tests/output/
ex81_1.out:39: …INFOG(36) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - val…
ex81_1.out:40: …INFOG(37) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - sum…
ex81_1.out:41: …INFOG(38) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core -…
ex81_1.out:42: …INFOG(39) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core -…
ex81_1.out:88: …INFOG(36) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - val…
ex81_1.out:89: …INFOG(37) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - sum…
ex81_1.out:90: …INFOG(38) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core -…
ex81_1.out:91: …INFOG(39) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core -…
ex81_2.out:51: …INFOG(36) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - val…
ex81_2.out:52: …INFOG(37) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - sum…
ex81_2.out:53: …INFOG(38) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core -…
ex81_2.out:54: …INFOG(39) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core -…
ex81_2.out:119: …INFOG(36) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - val…
ex81_2.out:120: …INFOG(37) (after analysis: estimated size of all MUMPS internal data for running BLR in-core - sum…
ex81_2.out:121: …INFOG(38) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core -…
ex81_2.out:122: …INFOG(39) (after analysis: estimated size of all MUMPS internal data for running BLR out-of-core -…
/petsc/doc/manualpages/MANSECHeaders/
BM:3: Benchmark (`BM`) objects manage running PETSc benchmarks
/petsc/src/ksp/ksp/tutorials/output/
bench_kspsolve_matmult.out:10: Step2 - running MatMult() 10 times...
bench_kspsolve_ksp.out:9: Step2 - running KSPSolve()...
/petsc/src/snes/tutorials/
build.zig:6: // Standard target options allows the person running `zig build` to choose
build.zig:12: // Standard release options allow the person running `zig build` to select
/petsc/src/binding/petsc4py/
tox.ini:1: # Tox (http://tox.testrun.org/) is a tool for running tests
/petsc/doc/developers/contributing/
pipelines.md:8: realize a currently running pipeline is no longer needed, cancel it.
pipelines.md:51: A failure in running the job's tests will have `FAILED` and a list of the failed tests
pipelines.md:112: Error in running configure.
/petsc/src/snes/tutorials/output/
ex19_failure_size.out:3: … in your code or you must ./configure PETSc with --with-64-bit-indices for the case you are running
/petsc/doc/miscellaneous/
saws.md:18: `http://localhost:8080` or, if you are running your application on a
codemanagement.md:28: different directories for editing, compiling, and running.
codemanagement.md:95: for compiling and running is much faster than for the parallel
/petsc/doc/developers/
development.md:15: doing so locally will save you the pain of re-running it.
testing.md:345: to prevent it from being output when the CI test harness is running.
testing.md:510: The make rules for running tests are contained in `gmakefile.test` in the PETSc root directory. The…
testing.md:525: The running of the test harness will show which tests fail, but you may not have
testing.md:563: Consider this line from running the PETSc test system:
testing.md:574: - The shell script running the test is located at: `$PETSC_DIR/$PETSC_ARCH/tests/vec/is/sf/tests/ru…
testing.md:630: command. It will print a command suitable for running from
testing.md:648: NO_RM=1 Do not remove the executables after running
testing.md:708: for the normal compile and edit, running the entire harness with search can be
testing.md:917: The total number of tests for running only ctetgen or triangle is 500. They have 22 tests in common…
/petsc/doc/manual/
tests.md:8: `$PETSC_DIR` and `$PETSC_ARCH` before running the tests, or can
tests.md:47: Depending on your machine’s configuration running the full test suite
tests.md:49: currently we do not have a mechanism for automatically running the test
matlab.md:10: a running PETSc program to a MATLAB process where you may
matlab.md:90: machine if the MATLAB interactive session is running on the same machine
matlab.md:189: If you are running PETSc on a cluster (or machine) that does not have a license for MATLAB, you mig…
blas-lapack.md:35: …an MPI size of 16 then each core is assigned an MPI process. But if the BLAS/LAPACK is running with
blas-lapack.md:46: how one can restrict the number of MPI processes while running MUMPS to utilize parallel BLAS/LAPAC…
performance.md:60: Consequently, running with more than 8 MPI ranks on such a system will
performance.md:90: bandwidth-bound PETSc applications of at most 4x when running multiple
performance.md:92: running with only 4-6 ranks. Because a smaller number of MPI ranks
performance.md:110: memory is accessed in a non-uniform way: A process running on one socket
performance.md:134: CPU the process is running on. Only if all memory on the respective CPU
performance.md:565: …running optimized code. Using `-malloc_debug` or `-malloc_test` for large runs can slow them signi…
performance.md:593: - When running with `-log_view`, the additional option `-log_view_memory`
performance.md:694: - **Effects of other users**: If other users are running jobs on the
performance.md:702: time required to get the current time in a running program. On good
other.md:258: - Vector and matrix objects can be passed to a running MATLAB process
other.md:368: running PETSc applications from a browser. To use SAWs you must `configure` PETSc with
other.md:417: xterm (Apple Terminal on macOS), to enable running separate debuggers on each process, unless the
other.md:470: running process if an error is detected. At runtime, these error
other.md:476: hanging without running in the debugger) with the option
other.md:831: done. This is useful for running large jobs when the graphics overhead
other.md:888: …or PETSc can be generated by running `make allgtags` from `$PETSC_DIR`, or one can generate tags f…
other.md:949: exists. If this file is not present, it should be generated by running `make
other.md:1154: PETSC_INCLUDE = path/to/petsc_includes # e.g. obtained via running `make getincludedirs'
other.md:1155: PETSC_LIBS = path/to/petsc_libs # e.g. obtained via running `make getlinklibs'
/petsc/doc/tutorials/performance/
guide_to_TAS.md:6: Below is the guide to running TAS using ex13, which is a Poisson Problem in 2D and 3D with Finite E…
/petsc/src/binding/petsc4py/docs/source/
petsc_options.rst:21: Then one can provide command-line options when running a script:
/petsc/config/BuildSystem/
RDict.py:398: you can fix this problem by running /bin/rebaseall. If you do not have\n \
RDict.py:402: are running. For more information about rebase, go to http://www.cygwin.com')
RDict.py:405: you can fix this problem by running /bin/rebaseall. If you do not have\n \
RDict.py:409: are running. For more information about rebase, go to http://www.cygwin.com\n')
/petsc/doc/install/
multibuild.md:95: running `configure`. The only thing preserved during this process is the
multibuild.md:113: For example running the following `configure`:
/petsc/lib/petsc/conf/
variables:110: # The following include file is created when running ./configure