ParaView/Run on Remote Machine
Here are instructions for running ParaView on selected remote machines.
In general, this involves:
- Launching an interactive job using the machine's job scheduler
- Loading the required software in the session
- Launching the ParaView server in the session
- Connecting to the remote ParaView server from a local ParaView instance (a generic sketch follows this list)
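To make the last two steps concrete, here is a generic sketch of the client/server workflow, assuming pvserver's default port (11111) and placeholder host names; the machine-specific sections below give the real commands.

# On the remote machine, inside the interactive job:
mpirun -np 8 pvserver --server-port=11111
# On your local workstation, tunnel the port to the compute node:
ssh -L 11111:compute-node:11111 username@remote.machine.edu
# Then, in the local ParaView GUI, use File > Connect to reach localhost:11111.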
Retired Machines
Eureka (ALCF)
Connect to Eureka
- From the command line on the Colorado machine,
ssh username@eureka.alcf.anl.gov
- Enter your 4-digit PIN, followed by the number on the CRYPTOCard
Submit Interactive Job
- From the command line on Eureka,
cp /home/jmartin/qsub_interactive_command.sh ~
to copy the script to your home directory. Open qsub_interactive_command.sh
and set the total time you want to run ParaView (time) and the allocation your account is under (account). Then run
./qsub_interactive_command.sh nodes
where nodes is the total number of nodes you want. Each node has 8 cores and 32 GB of memory.
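For orientation, the script presumably wraps an interactive job submission along the following lines; this is a hypothetical reconstruction assuming Eureka's Cobalt scheduler, not the contents of the actual file, so edit the real script rather than copying this.

#!/bin/sh
# Hypothetical sketch of qsub_interactive_command.sh; the real file may differ.
NODES=$1              # total number of nodes, passed on the command line
TIME=60               # total minutes to run ParaView ("time" in the text above)
ACCOUNT=YourProject   # the allocation your account is under ("account" above)
qsub -I -n $NODES -t $TIME -A $ACCOUNT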
Janus (CUBoulder Research Computing)
Video tutorial (use the paths below, NOT the ones in the video): http://fluid.colorado.edu/~matthb2/janus/pv_on_janus.html
Running the UI on Portal0
soft add @paraview-3.8.0
soft add +paraview-3.8.0-gnu-ompi-covis
vglrun paraview
and then start and connect to the server:
Starting the Server on Janus
. /projects/jansenke/matthb2/env-gnu.sh
qsub -q janus-debug /projects/jansenke/matthb2/pvserver-gnu_runscript-sysgl.sh
Then use checkjob or look at the output files to figure out which node your rank 0 is on, and connect ParaView to that node (or switch to a reverse connection if you prefer; a sketch follows).
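A minimal sketch of the reverse-connection variant, assuming pvserver's standard reverse-connection flags and using portal0 as a placeholder client host:

# In the ParaView GUI on portal0, first add a server of type
# "Client / Server (reverse connection)" on port 11111 and click Connect.
# Then, on the Janus compute node, point pvserver back at the waiting client:
mpirun -np 8 pvserver --reverse-connection --client-host=portal0 --server-port=11111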
Tukey (ALCF)
ParaView GUI running on portal0 at Colorado
Video tutorial on running pvserver-syncio in parallel on the Tukey visualization nodes and connecting it to a ParaView GUI running on portal0 at Colorado:
http://fluid.colorado.edu/~mrasquin/Documents_HIDE/Tukey/ParaviewOnTukeyFromPortal0/index.html
This video can be copied from /users/mrasquin/public_html/Documents_HIDE/Tukey/ParaviewOnTukeyFromPortal0 on the viz nodes.
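In outline, this method looks roughly like the following; the node counts, time, port, and node name are placeholder assumptions, so treat the video above as authoritative.

# Rough sketch only; placeholder values throughout.
# 1. On Tukey, request visualization nodes interactively (Cobalt scheduler):
qsub -I -n 2 -t 60 -A YourProject
# 2. On the allocated node, start pvserver-syncio in parallel:
mpirun -np 16 pvserver-syncio --server-port=11111
# 3. On portal0, tunnel the port through the Tukey login node, then connect
#    the ParaView GUI to localhost:11111 via File > Connect:
ssh -L 11111:viz-node:11111 username@tukey.alcf.anl.gov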
ParaView GUI running on the Tukey login node
Video tutorial on running pvserver-syncio in parallel on the Tukey visualization nodes and connecting it to a ParaView GUI running on the Tukey login node:
https://fluid.colorado.edu/~mrasquin/phasta/ParaViewOnTukey/index.html
This video can be copied from /users/mrasquin/public_html/Tukey/ParaviewOnTukeyThroughVNC on the viz nodes.
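The VNC leg of this method is roughly as follows; the display number, and hence the port, are placeholder assumptions.

# On the Tukey login node, start a VNC session (display :1 maps to port 5901):
vncserver :1
# On your workstation, tunnel the VNC port and attach a viewer:
ssh -L 5901:localhost:5901 username@tukey.alcf.anl.gov
vncviewer localhost:5901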
Note that because vncserver on the Tukey head node does not support OpenGL, this method cannot export PNG images from the ParaView GUI: the exported images come out completely fuzzy. The first method is therefore strongly recommended.