...
- If you want to transfer files FROM/TO your $HOME directory, you have to specify the absolute path; the same is true for your $WORK and $CINECA_SCRATCH filesystems.
- You cannot use the SSH configuration stored in your remote ~/.ssh/ directory (~/.ssh/config).
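For instance, a put into your home directory has to target its absolute path rather than ~ or $HOME. A minimal sketch, where /absolute/path/of/your/home/ is a hypothetical placeholder for the expanded value of $HOME on the cluster:
$ sftp <username>@data.<cluster_name>.cineca.it:/absolute/path/of/your/home/
sftp> put myfile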
Listing a directory via sftp
If you need to list files on a cluster whose login nodes are offline, you can rely on the datamover service via the sftp command:
$ sftp <username>@data.<cluster_name>.cineca.it:/path/to/be/listed/
Connected to data.<cluster_name>.cineca.it.
Changing to: /path/to/be/listed/
sftp>
Once entered the sftp session, the familiar pwd, cd /path/to/, and ls commands are available to explore the remote filesystem, together with their local counterparts lpwd, lcd /path/to/, and lls. You can also transfer data from within the sftp session; see the appropriate section below.
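As a quick illustration of the local-side commands (a sketch; directory and file names are hypothetical):
sftp> lpwd
Local working directory: /home/user
sftp> lcd /home/user/data
sftp> lls
input.dat  results.tar.gz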
At present M100 login nodes are offline, so you need to resort to the sftp command to explore the M100 filesystems. For instance:
$ sftp fcola000@data.m100.cineca.it:/m100_scratch/userinternal/fcola000/
Connected to data.m100.cineca.it.
Changing to: /m100_scratch/userinternal/fcola000/
sftp> cd cuda
sftp> ls -l
drwxr-xr-x 3 fcola000 interactive 4096 Feb 11 2021 targets
sftp> pwd
Remote working directory: /m100_scratch/userinternal/fcola000/cuda
sftp>
Available transfer tools
rsync
...
- You need to upload or download data FROM/TO your local machine TO/FROM a CINECA HPC cluster
$ sftp <username>@data.<cluster_name>.cineca.it:/absolute/remote/path/to/
sftp> put relative/local/path/to/file
Uploading /absolute/local/path/to/file to /absolute/remote/path/to/file
file 100% 414 365.7KB/s 00:00
sftp> get relative/remote/path/to/file
- You need to transfer files between 2 CINECA HPC clusters
$ ssh -xt <username>@data.<cluster_name_1>.cineca.it sftp <username>@data.<cluster_name_2>.cineca.it:/absolute/path/to/
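Once the nested session is open, get pulls data from <cluster_name_2> to <cluster_name_1> and put pushes it the other way, e.g. (a sketch with placeholder paths):
sftp> get file_on_cluster_2 /absolute/destination/on/cluster_1/
sftp> put /absolute/path/on/cluster_1/file_on_cluster_1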
...