
Remote visualization service

Remote Visualization has become a fundamental requirement for users who need to:

  • visualize the data produced on our HPC systems (scientific visualization);
  • analyze and inspect data directly on the systems;
  • debug and profile parallel codes running on HPC clusters.

All these use cases benefit from launching the applications on the server side. For instance, analyzing a large amount of data in situ avoids the transfer of GBs or TBs of data.

Debugging and profiling tools have to be interfaced with the compute nodes that execute the parallel code; they therefore benefit from tools enabling a graphical connection to the compute nodes. Scientific visualization can exploit the hardware (GPUs, memory and CPUs) available on the server side, enabling users to remotely access their data and display them efficiently on their local client.

In order to use this service you need a valid username and budget on the cluster where it is available.

Remote visualization resources

The remote visualization service is available on several Cineca clusters; each cluster offers different resources and access modes.


CLUSTER: GALILEO100

  • Access: SSH (no scheduler)
    Resource limits: cpu: 1, mem: 3 GB, walltime: 10 min of CPU time
    Activity type: light graphics
    Free use: yes

  • Access: SLURM scheduler, partition g100_usr_prod
    Resource limits: cpus: up to 48, mem: up to 3 TB, walltime: QOS dependent
      - g100_qos_dbg: 2 h (higher priority)
      - no QOS: 24 h
    Activity type: strong graphics and simulation activity (no GPU)
    Free use: no (charged to your account depending on time and requested cpus)

  • Access: SLURM scheduler, partition g100_usr_interactive
    Resource limits: cpus: 48, gpus: 2, mem: 366 GB, walltime: 24 h
    Activity type: strong graphics and simulation activity using GPUs
    Free use: no (charged to your account depending on time and requested cpus)

CLUSTER: MARCONI (Intel Xeon 8160 CPUs, Skylake)

  • Access: SLURM scheduler, partition skl_usr_dbg
    Resource limits: cpus: 48, mem: 177 GB, walltime: 30 min
    Activity type: graphics and simulation testing activity (shorter waiting time)
    Free use: yes

  • Access: SLURM scheduler, partition skl_usr_prod
    Resource limits: cpus: 48, mem: 177 GB, walltime: 24 h
    Activity type: strong graphics and simulation activity
    Free use: yes

CLUSTER: M100 (IBM POWER9 AC922 CPUs, Volta V100 GPUs)

  • Access: SSH (no scheduler)
    Resource limits: cpu: 1, gpu: 1, mem: 7 GB, walltime: 10 min of CPU time
    Activity type: light graphics using the GPU
    Free use: yes

  • Access: SLURM scheduler, partition m100_usr_prod
    Resource limits: cpus: up to 32 (128 virtual cpus), gpus: up to 4, mem: up to 240 GB, walltime: QOS dependent
      - m100_qos_dbg: 2 h (higher priority)
      - no QOS: 24 h
    Activity type: strong graphics and simulation activity
    Free use: no (charged to your account depending on time and requested cpus/gpus/mem)
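As an illustration, a Slurm job matching the GALILEO100 g100_usr_prod entry above could be requested with a batch script along these lines. This is only a sketch: the account name, module, and application are placeholders, and the current partition and QOS names should always be checked on the cluster (e.g. with sinfo) before use.

```shell
#!/bin/bash
#SBATCH --partition=g100_usr_prod   # production partition (no GPUs)
#SBATCH --qos=g100_qos_dbg          # debug QOS: 2 h walltime, higher priority
#SBATCH --ntasks=4                  # up to 48 cpus are allowed on this partition
#SBATCH --mem=16GB                  # up to 3 TB available
#SBATCH --time=01:00:00             # must stay within the QOS walltime limit
#SBATCH --account=<your_project>    # placeholder: the budget charged for the job

# Placeholder workload: load and run your own visualization application
module load <your_viz_module>
<your_viz_application>
```

Remember that jobs on this partition are charged to your account depending on elapsed time and requested cpus.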

Remote Connection Manager (RCM)

The remote visualization service at Cineca is provided through the Remote Connection Manager (RCM) application. Using this tool you can graphically inspect your data without moving them to your local workstation.

It can be used by any user with valid credentials to access CINECA clusters. If you are interested in using it, see this web page.

1 Comment

  1. Graphic Sessions using VNC Client from Windows OS

    Users working on MS Windows systems may set up a graphical session on our remote login nodes using PuTTY (an SSH client) and TurboVNC (a VNC client), both freely available for Windows. Here are the steps to configure and start a graphic session.

    Requirements: PuTTY and TurboVNC installed on your local Windows machine.

    Setup

    Use PuTTY to start a standard SSH session on our login nodes. Be sure to have, or create, the following file at $HOME/.vnc/xstartup on the remote system:

    #!/bin/sh
    # Avoid inheriting session settings from the SSH environment
    unset SESSION_MANAGER
    unset DBUS_SESSION_BUS_ADDRESS
    # Run the system X init script, then start a lightweight window manager
    /etc/X11/xinit/xinitrc
    startfluxbox

    Start a Graphic Session

    Start a vncserver on the remote login node you are connected to with SSH. For example, the following command creates a VNC graphical session with a display geometry of 1200x1024:
    $> vncserver -geometry 1200x1024

    Take note of the display number reported by vncserver. In this example, assume vncserver created a server on display number 3.

    A VNC server usually listens on port number 5900 + <display number>. If vncserver has created display number 3, it will wait for connections on port 5903.
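    The port computation can be sketched in shell (the display number 3 is just the value from the example above):

```shell
# Compute the VNC port from the display number reported by vncserver
# (5900 is the standard VNC base port).
display=3
port=$((5900 + display))
echo "VNC server for display :$display listens on port $port"
```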

    Now right-click on the PuTTY window title bar and select:
    Change Settings -> Connection -> SSH -> Tunnels
    Set the source port to 5903 and the destination to <remote-login-IP>:5903,
    then click the Add and Apply buttons.

    You are now ready to launch your VNC client and connect to your local port using: localhost:3
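    As an aside, the same tunnel can be opened with an OpenSSH command-line client instead of PuTTY. The snippet below only constructs the command string; <username> and <remote-login-IP> are placeholders, as in the text above.

```shell
# Build an OpenSSH command equivalent to the PuTTY tunnel settings:
# forward local port 5903 to port 5903 on the remote login node.
display=3
port=$((5900 + display))
tunnel_cmd="ssh -L ${port}:<remote-login-IP>:${port} <username>@<remote-login-IP>"
echo "$tunnel_cmd"
```

    Once the tunnel is up, the VNC client connects to localhost:3 exactly as described above.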

    WARNING: the vncserver will remain open even if you close the VNC client viewer. You need to kill the server explicitly: connect to the login node with SSH (use PuTTY) and select the VNC server display you want to kill:

    1. vncserver -list       # list all open VNC displays
    2. vncserver -kill :<N>  # kill display number N

    Remember to kill your open VNC server displays when you no longer need them; otherwise precious resources are wasted.