
Searched refs:server (Results 1 – 15 of 15) sorted by relevance

/petsc/config/BuildSystem/
RDict.py
368 def writeServerAddr(self, server):
371 pickle.dump(server.server_address, f)
540 self.server.rdict.lastAccess = time.time()
541 self.server.rdict.writeLogLine('SERVER: Started new handler')
544 value = self.server.rdict.recvPacket(self.rfile, source = 'SERVER')
546 … self.server.rdict.writeLogLine('SERVER: EOFError receiving packet '+str(e)+' '+str(e.__class__))
549 … self.server.rdict.writeLogLine('SERVER: Error receiving packet '+str(e)+' '+str(e.__class__))
550 self.server.rdict.sendPacket(self.wfile, e, source = 'SERVER')
554 response = getattr(self.server.rdict, value[0])(*value[1:])
556 … self.server.rdict.writeLogLine('SERVER: Error executing operation '+str(e)+' '+str(e.__class__))
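The RDict.py matches above show the server side of BuildSystem's remote-dictionary protocol: a handler receives a pickled request from the client, dispatches it as a method call on the shared `rdict` object, and sends the result (or the exception) back. The following is a minimal, self-contained sketch of that request/response pattern, not PETSc's actual RDict implementation; the `Dict` class and the one-request-per-connection handler are illustrative stand-ins.

```python
import pickle
import socketserver

class Dict:
    """Hypothetical stand-in for the shared dictionary object ('rdict') served to clients."""
    def __init__(self):
        self.data = {}
    def setItem(self, key, value):
        self.data[key] = value
    def getItem(self, key):
        return self.data[key]

class Handler(socketserver.StreamRequestHandler):
    def handle(self):
        # One request per connection, for brevity of the sketch.
        try:
            request = pickle.load(self.rfile)           # e.g. ('setItem', 'alpha', 1)
        except EOFError:
            return                                      # client closed the connection
        try:
            # Dispatch: first element names the method, the rest are its arguments.
            response = getattr(self.server.rdict, request[0])(*request[1:])
        except Exception as e:                          # report failures back to the client
            response = e
        pickle.dump(response, self.wfile)
        self.wfile.flush()

if __name__ == '__main__':
    server = socketserver.TCPServer(('localhost', 0), Handler)
    server.rdict = Dict()                               # shared state, as in RDict.py
    # The real code records server.server_address to a file (cf. writeServerAddr above);
    # here we simply print it.
    print('serving on', server.server_address)
    server.serve_forever()
```

A client would connect, `pickle.dump()` a `(method_name, *args)` tuple onto the socket, and unpickle the reply; if the reply is an exception instance, the call failed on the server.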
/petsc/doc/manual/
streams.md
278 ## Application with the MPI linear solver server
280 We now run the same PETSc application using the MPI linear solver server mode, set using `-mpi_line…
283 Note that it is far below the parallel solve without the server. However, the distribution time for…
285 …inter-process communication, especially in the matrix-vector product. In server mode, the vector i…
286 This indicates that a naive use of the MPI linear solver server will not produce as much performanc…
291 server processes. Unfortunately, `MPI_Scatterv()` does not scale with more MPI processes; hence, th…
293 from which all the MPI processes in the server
295 There is still a (now much smaller) server processing overhead since the initial data storage of th…
300 :alt: GAMG server speedup
303 GAMG server speedup
about_this_manual.md
38 utilize the PETSc MPI linear solver server.
ksp.md
2632 Using PETSc's MPI linear solver server it is possible to use multiple MPI processes to solve a
2647 … systems solved by the MPI linear solver server when the program completes. By default the linear …
2650 to cause the linear solver server to allow as few as 5,000 unknowns per MPI process in the parallel…
2654 For help when anything goes wrong with the MPI linear solver server see `PCMPIServerBegin()`.
other.md
367 The Scientific Application Web server, SAWs [^saws], allows one to monitor
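The ksp.md and streams.md excerpts above describe the MPI linear solver server: an otherwise ordinary KSP application whose linear solves are handed off to additional MPI processes, selected at run time by a command-line option (the option name is truncated in the streams.md excerpt; the tutorial output file names below suggest `-mpi_linear_solver_server`). As a hedged sketch, assuming petsc4py is available and that server mode behaves the same way through the Python bindings as it does for the C and Fortran tutorials quoted here, a driver might look like this:

```python
# Hedged sketch: a standard petsc4py KSP driver for a small 1-D Laplacian.
# The manual excerpts above refer to the C/Fortran tutorials; running this
# under server mode via petsc4py is an assumption. Example run (option name
# inferred from the output file names in this listing):
#   mpiexec -n 4 python driver.py -mpi_linear_solver_server -ksp_monitor
import sys
import petsc4py
petsc4py.init(sys.argv)              # PETSc options are read during initialization
from petsc4py import PETSc

n = 100                              # problem size, purely for illustration
A = PETSc.Mat().createAIJ([n, n], nnz=3)
rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):        # assemble the tridiagonal stencil row by row
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    A.setValue(i, i, 2.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = A.createVecLeft()                # right-hand side b = 1
b.set(1.0)
x = A.createVecRight()               # solution vector

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setFromOptions()                 # honors -ksp_type, -pc_type, and server options
ksp.solve(b, x)
```

Run without the option this is an ordinary KSP example; launched under `mpiexec` with the server option, the additional ranks act as solver workers, and, per the ksp.md excerpt, statistics for the systems solved by the server are reported when the program completes, as in the .out files listed below.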
/petsc/src/ksp/ksp/tutorials/output/
ex88f_2.out
41 Running MPI linear solver server directly on rank 0 due to its small size
91 Running MPI linear solver server directly on rank 0 due to its small size
101 MPI linear solver server statistics:
ex89f_2.out
41 Running MPI linear solver server directly on rank 0 due to its small size
91 Running MPI linear solver server directly on rank 0 due to its small size
101 MPI linear solver server statistics:
ex1_mpi_linear_solver_server_1_shared_memory_false.out
134 MPI linear solver server statistics:
137 MPI linear solver server not using shared memory
ex1_mpi_linear_solver_server_1.out
134 MPI linear solver server statistics:
ex88f_1.out
143 MPI linear solver server statistics:
ex89f_1.out
143 MPI linear solver server statistics:
/petsc/doc/miscellaneous/
saws.md
5 the server (PETSc) application).
/petsc/doc/install/
download.md
47 - [Primary server](https://web.cels.anl.gov/projects/petsc/download/release-snapshots/)
/petsc/share/petsc/datafiles/meshes/
testcase3D.cas
8560 (export/endvs/start-server? #t)
8561 (export/endvs/verbose-server-full? #f)
8567 (export/endvs/nodes-per-server 0)
/petsc/doc/
petsc.bib
15726 …\url{https://inlportal.inl.gov/portal/server.pt/community/center_for_materials_science_of_nuclear_…