The remote visualization service at Cineca is provided through the Remote Connection Manager (RCM) application. Using this tool you can graphically inspect your data without moving them to your local workstation.

System requirements

The “Remote Connection Manager” has been tested and distributed for the following operating systems:

  1. Microsoft: Windows 7, 8, 10
  2. Linux: Ubuntu 16.04, Ubuntu 18.04, CentOS 7
  3. Apple: OSX Mojave

Other Linux distributions are untested.

Download

Download the application compatible with your operating system from the download page. On Linux you have to make the file executable with the command "chmod +x filename".
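
For example, on Linux the steps might look like the following (the file name is hypothetical; use the name of the file you actually downloaded):

chmod +x RCM_linux_64bit    # make the downloaded binary executable (hypothetical file name)
./RCM_linux_64bit           # launch the application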

Getting started

Please note that RCM is a client/server application: every time you interact with it, the server side has to perform some operations, which may take some time depending on the bandwidth and latency of your Internet connection and on the workload of the clusters.

Start the application by double-clicking on the application icon, then insert the hostname (see the table below) and your HPC username, or select a previous session from the drop-down menu; enter your HPC password in the “Password” field and press “Login”.


Cluster      Hostname
GALILEO100   rcm.g100.cineca.it
MARCONI      rcm.marconi.cineca.it
M100         rcm.m100.cineca.it


If a passwordless SSH key is enabled on your account/local machine, leave the "Password" field empty.
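
A minimal sketch of setting up such a key from a Linux or macOS terminal follows; the username is a placeholder and the hostname is taken from the table above:

ssh-keygen -t rsa                        # generate a key pair (use an empty passphrase for passwordless login)
ssh-copy-id myuser@rcm.g100.cineca.it    # copy the public key to the cluster (placeholder username)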

It is possible to manage different RCM SSH sessions, one for each cluster: just click on the "+" button in the tab bar and insert the credentials of the new cluster:

Once logged in, a new tab will show you the list of available remote display sessions. If you have not created a display session yet, the list will be empty:

Working with displays

Create a new display

To create a new display, click on the “+” button on the right:

A new window will appear, allowing you to set some values that differ depending on the cluster you are connected to:

  1. Scheduler: select SSH (no scheduler) or, alternatively, Slurm. For more details see the web page UG2.7.1: Remote Visualisation
  2. Account: select one of your projects
  3. Queue: select the Slurm partition (type of resources) where you want to run. On Galileo, e.g., you can choose nodes provided with GPUs (gpuprod) or with CPUs only (prod); a sketch of how to inspect the available partitions and QoS is shown after this list. For more details see the web page UG2.7.1: Remote Visualisation
  4. Quality of Service (QoS): select one of the permitted QoS. The selected QoS affects the job's wall-time limit (priority) or its permissions. On Galileo100, e.g., select the "dbg" QoS for the CPU partition if you need a higher priority to run and no more than two hours of elapsed time. For more details see the web page UG2.7.1: Remote Visualisation
  5. Memory: select the amount of memory (GB) to allocate for your job.
  6. Time: select the duration of your job. The maximum is 24 hours. If you selected the "dbg" QoS, you can request, e.g. on Galileo100, up to 2 hours.
  7. GPU: select the number of GPUs if you have selected a GPU queue.
  8. Service Type: the default service is the latest version of VNC.
  9. Window Manager: the VNC session can start openbox or fluxbox as window manager. The "Openbox GPU only" window manager is supported only on GPU nodes and preloads the visualization stack (e.g. ParaView, VTK). The "Openbox/Fluxbox" window manager is supported on all nodes; to use visualization applications you have to load the specific module (e.g. module load paraview). For more info, hover the cursor over a specific element of the proposed list.
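
If you are unsure which partitions and QoS values are available, you can inspect them from a login shell on the cluster before creating the display. This is a minimal sketch that assumes the standard Slurm command-line tools are in your path:

sinfo -s                                          # summary of the partitions and their node states
sacctmgr show qos format=Name,MaxWall,Priority    # wall-time limit and priority of each QoS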

 

#########################################

RCM OLD SERVER >> For the Marconi cluster the settings are:

  1. Session name: insert a name for the new display
  2. Select queue: select the resources to use (number of cores, amount of memory, wall-time)
  3. Select wm+vnc: our suggestion is to select fluxbox as the window manager for X

Pressing OK in the "new display" window, a remote display session will be created and you will be automatically connected to it. A TurboVNC window will open.

##########################################

This operation can take some time depending on the workload of the clusters. Now you can start your remote work session.

When a new display is created, a new item will be added to the list of available remote display sessions in the RCM tab, showing the following display information:

  1. Name: name of display session
  2. Status: the condition of the remote display session: pending (the display session is starting up), valid (the display session is running), killing (the display session is being deleted)
  3. Time: the remaining time before the session ends (if the display session has no time limit, this value is replaced by the symbol "~"). Note that each display session has a time limit: beyond it, the display will be automatically killed and unsaved data will be lost.
  4. Resources: the node of the cluster on which the remote display session has been created

You can update these values by clicking on the "REFRESH" button.

Share a display

Sharing a remote display means giving another user access to a specific remote display session you have created. Sharing is done by means of a .vnc file that has to be saved by the owner of the display session and opened by the user who needs to access the shared session. To share a display session, click on the "SHARE THE REMOTE DISPLAY SESSION VIA FILE" button related to the remote display session you want to share. A dialog will prompt you to select a location for saving the file.
Send the saved file to the users who need to access the shared display session. To connect to a shared display session, click on "Open" in the "File" menu and select the received .vnc file.

Kill a display

Display sessions can be killed by pressing the “KILL THE REMOTE DISPLAY SESSION” button: just press the "x" button in the row of the remote display session you no longer want to use, and it will be removed from the list of available displays. This operation can take some time, depending on the workload of the clusters. Note that once killed, the display will no longer be reachable and unsaved data will be lost.

Running visualization software

In order to execute visualization software you first have to load the corresponding module:

module load paraview    # load the ParaView environment module
paraview                # launch the application
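
If the module name or version differs on your cluster, you can first list what is available (a minimal sketch using the standard environment-modules command):

module avail paraview    # list the ParaView modules installed on the node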

On GPU resources (select "SSH" or "Slurm" with the gll_usr_gpuprod partition) you have to load not only the visualization software module but also the virtualgl module, and launch the executable through vglrun, e.g. for ParaView:

module load paraview    # load the ParaView environment module
module load virtualgl   # load VirtualGL for GPU-accelerated rendering
vglrun paraview         # launch ParaView through vglrun

If you selected the "Openbox GPU only" window manager when you created the display, you only have to execute the visualization software through the vglrun command:

vglrun paraview    # the visualization stack is already preloaded on these nodes

