ParaView/Run on Remote Machine
Here are instructions for running ParaView on some selected remote machines.
In general, this involves:
- Launching an interactive job using the machine's job scheduler
- Loading the required software in the job session
- Launching the ParaView server (pvserver) in the session
- Connecting to the remote ParaView server from a local ParaView client
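A minimal sketch of that workflow is shown below; every name and flag in it is an illustrative placeholder rather than a command for any specific machine.

# 1. Request an interactive allocation (scheduler syntax varies by machine)
qsub -I -n 4 -t 60 -A MyProject

# 2. Load the required software inside the job session
module load paraview            # or "soft add ..." on SoftEnv-based systems

# 3. Start the ParaView server across the allocation; it prints "Waiting for client..."
mpirun -np 48 pvserver --server-port=8000

# 4. In a local ParaView client of the same version, use File > Connect
#    to reach the head node of the job on the chosen port.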
Current Machines
Cooley (ALCF)
Connect to Cooley
- From the command line on the Viz Nodes,
ssh username@cooley.alcf.anl.gov
- Enter the MobilePASS+ passcode from your phone
Submit Interactive Job
There are multiple versions of interactive submission scripts used by the group, but most take two inputs: the number of nodes and the run time. It is recommended that you check the contents of any submitInteractive.sh
script you receive to confirm the argument order. As an example, the script used for the Gust AFOSR work (at /projects/PHASTA_aesp/Models/GustWing/OTS/PastRuns)
is invoked as ./submitInteractive.sh <runtime in minutes> <nodes>.
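For instance, to request a 60-minute interactive job on 4 nodes (values chosen purely for illustration):

./submitInteractive.sh 60 4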
Once the interactive job has started, you will need to start a ParaView server using a pvserverLaunch.<version>.sh
script. Note that the version of the server needs to match the version you will use on the Viz Nodes; ParaView 5.5.2 is the most recent version fully supported on both the Viz Nodes and Cooley. The script available in the same directory as above takes the number of processes per node as input, which should be 12 to use all of Cooley's resources, and is run as ./pvserverLaunch.5.5.2.sh 12.
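The exact contents of the launch script vary, but it is essentially a wrapper around mpirun and pvserver. A minimal sketch, assuming Cobalt's $COBALT_NODEFILE and port 8000 (both of which may differ in the actual script):

# Sketch of a pvserverLaunch-style script (illustrative only)
PROCS_PER_NODE=$1
NODES=$(wc -l < "$COBALT_NODEFILE")      # Cobalt writes the list of allocated nodes here
mpirun -f "$COBALT_NODEFILE" -np $((NODES * PROCS_PER_NODE)) \
    pvserver --server-port=8000          # prints "Waiting for client..." when ready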
Once the server is running and reports "Waiting for client...", launch ParaView on the Viz Nodes and select the connect-to-server icon (just to the right of the open-file icon at the top of the screen). If you do not already have a server connection configured for the listed cc***
number, you will need to create one: select "Add Server" and populate the "Edit Server Configuration" page. The name of the server should be the cc***
number, the host field should be cc***.cooley.pub.alcf.anl.gov, and the port should be 8000. From here you can select Configure and then Connect.
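The GUI saves these entries to a ParaView server-configuration file, so the same connection can also be created by hand. The sketch below assumes the per-user servers.pvsc location used by typical Linux ParaView installs and uses cc123 as a placeholder host; note that writing the file this way replaces any previously saved servers.

mkdir -p ~/.config/ParaView
cat > ~/.config/ParaView/servers.pvsc <<'EOF'
<Servers>
  <Server name="cc123" resource="cs://cc123.cooley.pub.alcf.anl.gov:8000">
    <ManualStartup/>
  </Server>
</Servers>
EOF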
If, upon connection, you need to load the SyncIO reader plugin on the remote server, builds for multiple versions of ParaView can be found at /lus/theta-fs0/projects/PHASTA_aesp/ParaView/ParaViewSyncIOReaderPlugin.
Retired Machines
Eureka (ALCF)
Connect to Eureka
- From the command line on the Colorado machine,
ssh username@eureka.alcf.anl.gov
- Enter your 4-digit PIN, followed by the number on the CRYPTOCard
Submit Interactive Job
- From the command line on Eureka,
cp /home/jmartin/qsub_interactive_command.sh ~/
to copy the script to your home directory. Open qsub_interactive_command.sh
and change the total time you want to run ParaView (time) and the allocation your account is under (account). Then run
./qsub_interactive_command.sh nodes
where nodes is the total number of nodes you want. Each node has 8 cores, and 32 GB of memory.
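For context, a script of this kind typically wraps a single Cobalt interactive submission. A hypothetical sketch follows; the actual flags, queue, and values inside qsub_interactive_command.sh are not known here and are assumed.

# Hypothetical sketch of qsub_interactive_command.sh (Cobalt-style flags assumed)
TIME=60                 # total run time in minutes -- edit to taste
ACCOUNT=MyAllocation    # the allocation your account is under -- edit to taste
qsub -I -n "$1" -t "$TIME" -A "$ACCOUNT"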
Janus (CUBoulder Research Computing)
Video tutorial (use the paths below, NOT the ones in the video): http://fluid.colorado.edu/~matthb2/janus/pv_on_janus.html
Running the UI on Portal0
soft add @paraview-3.8.0
soft add +paraview-3.8.0-gnu-ompi-covis
vglrun paraview
and then start and connect to the server:
Starting the Server on Janus
. /projects/jansenke/matthb2/env-gnu.sh
qsub -q janus-debug /projects/jansenke/matthb2/pvserver-gnu_runscript-sysgl.sh
and use checkjob or look at the output files to figure out which node your rank 0 is on, then connect ParaView to that node (or change the script to use a reverse connection if you prefer).
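If you do switch to a reverse connection, pvserver initiates the connection back to the GUI instead. A sketch, where the portal hostname and port are assumptions:

# Reverse connection: pvserver dials back to a ParaView GUI that is already waiting
mpirun -np 8 pvserver --reverse-connection --client-host=portal0.colorado.edu --server-port=11111

On the GUI side, the server entry must be configured as a reverse-connection type so ParaView listens for the incoming pvserver.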
Tukey (ALCF)
ParaView GUI running on portal0 at Colorado
Video tutorial on how to run a pvserver-syncio in parallel on the Tukey visualization nodes and connect the pvserver to a ParaView GUI running on portal0 at Colorado:
http://fluid.colorado.edu/~mrasquin/Documents_HIDE/Tukey/ParaviewOnTukeyFromPortal0/index.html
This video can be copied from /users/mrasquin/public_html/Documents_HIDE/Tukey/ParaviewOnTukeyFromPortal0 on the viz nodes.
ParaView GUI running on the Tukey login node
Video tutorial on how to run a pvserver-syncio in parallel on the Tukey visualization nodes and connect the pvserver to a ParaView GUI running on the Tukey login node:
https://fluid.colorado.edu/~mrasquin/phasta/ParaViewOnTukey/index.html
This video can be copied from /users/mrasquin/public_html/Tukey/ParaviewOnTukeyThroughVNC on the viz nodes.
Note that because the VNC server on the Tukey head node does not support OpenGL, this method does not allow exporting PNG images from the ParaView GUI; the resulting images come out completely fuzzy. The first method is therefore strongly recommended.