EURORA was taken out of production in August 2014.

We report here the measured latencies and bandwidths of MPI messages, with and without InfiniBand support, on the Eurora cluster (equipped with Intel Xeon Phi coprocessors).

We employed the Intel MPI Benchmarks suite (IMB 4.0), with the Intel Cluster Studio XE 2013 SP1 (1.106) compilers and Intel MPI 4.1.2.040. On the Eurora cluster we loaded the following modules:

module load profile/advanced
module load intel/cs-xe-2013--binary
module load intelmpi/4.1.2--binary

and we set the environment as follows:

source $INTEL_HOME/bin/compilervars.sh intel64 
export I_MPI_MIC=enable
export I_MPI_DAPL_PROVIDER_LIST=ofa-v2-mlx4_0-1,ofa-v2-scif0,ofa-v2-mlx4_0-1
export IMBHOST=/cineca/prod/compilers/intel/cs-xe-2013/binary/impi/4.1.2.040/bin64/IMB-MPI1
export IMBMIC=/cineca/prod/compilers/intel/cs-xe-2013/binary/impi/4.1.2.040/mic/bin/IMB-MPI1

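Before launching the benchmarks it is worth checking that the toolchain and the environment are consistent with the paths above. The following is only a minimal sanity-check sketch; it assumes that $host1 has already been set to the name of the first allocated compute node (as in the runs below):

which mpiicc mpirun                        # toolchain from the loaded modules
echo "I_MPI_MIC=$I_MPI_MIC"
echo "I_MPI_DAPL_PROVIDER_LIST=$I_MPI_DAPL_PROVIDER_LIST"
test -x $IMBHOST && echo "host-side IMB binary found"
test -f $IMBMIC  && echo "MIC-side IMB binary found"
ssh $host1-mic0 hostname                   # the coprocessors must be reachable via ssh
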
The benchmarks that were executed are listed below; each output file name encodes the two endpoints of the PingPong (for example, cpu_mic0 is the host CPU paired with its local mic0, while mic0_remote-mic0 is mic0 on $host1 paired with mic0 on $host2):

mpirun.mic -np 1 -host $host1 -genv I_MPI_FABRICS=tcp $IMBHOST PingPong : -np 1 -host $host2 $IMBHOST > cpu_remote-cpu
mpirun.mic -np 1 -host $host1 -genv I_MPI_FABRICS=tcp $IMBHOST PingPong : -np 1 -host $host1-mic0 $IMBMIC > cpu_mic0
mpirun.mic -np 1 -host $host1 -genv I_MPI_FABRICS=tcp $IMBHOST PingPong : -np 1 -host $host1-mic1 $IMBMIC > cpu_mic1
mpirun.mic -np 1 -host $host1-mic0 -genv I_MPI_FABRICS=tcp $IMBMIC PingPong : -np 1 -host $host1-mic1 $IMBMIC > mic0_mic1
mpirun.mic -np 1 -host $host1 -genv I_MPI_FABRICS=tcp $IMBHOST PingPong : -np 1 -host $host2-mic0 $IMBMIC > cpu_remote-mic0
mpirun.mic -np 1 -host $host1 -genv I_MPI_FABRICS=tcp $IMBHOST PingPong : -np 1 -host $host2-mic1 $IMBMIC > cpu_remote-mic1
mpirun.mic -np 1 -host $host1-mic0 -genv I_MPI_FABRICS=tcp $IMBMIC PingPong : -np 1 -host $host2-mic0 $IMBMIC > mic0_remote-mic0
mpirun.mic -np 1 -host $host1-mic0 -genv I_MPI_FABRICS=tcp $IMBMIC PingPong : -np 1 -host $host2-mic1 $IMBMIC > mic0_remote-mic1
mpirun.mic -np 1 -host $host1 -genv I_MPI_FABRICS=dapl $IMBHOST PingPong : -np 1 -host $host2 $IMBHOST > dapl_cpu_remote-cpu
mpirun.mic -np 1 -host $host1 -genv I_MPI_FABRICS=dapl $IMBHOST PingPong : -np 1 -host $host1-mic0 $IMBMIC > dapl_cpu_mic0
mpirun.mic -np 1 -host $host1 -genv I_MPI_FABRICS=dapl $IMBHOST PingPong : -np 1 -host $host1-mic1 $IMBMIC > dapl_cpu_mic1
mpirun.mic -np 1 -host $host1 -genv I_MPI_FABRICS=dapl $IMBHOST PingPong : -np 1 -host $host2-mic0 $IMBMIC > dapl_cpu_remote-mic0
mpirun.mic -np 1 -host $host1 -genv I_MPI_FABRICS=dapl $IMBHOST PingPong : -np 1 -host $host2-mic1 $IMBMIC > dapl_cpu_remote-mic1
mpirun.mic -np 1 -host $host1-mic0 -genv I_MPI_FABRICS=dapl $IMBMIC PingPong : -np 1 -host $host1-mic1 $IMBMIC > dapl_mic0_mic1
mpirun.mic -np 1 -host $host1-mic0 -genv I_MPI_FABRICS=dapl $IMBMIC PingPong : -np 1 -host $host2-mic0 $IMBMIC > dapl_mic0_remote-mic0
mpirun.mic -np 1 -host $host1-mic0 -genv I_MPI_FABRICS=dapl $IMBMIC PingPong : -np 1 -host $host2-mic1 $IMBMIC > dapl_mic0_remote-mic1
mpirun.mic -np 1 -host $host1-mic1 -genv I_MPI_FABRICS=dapl $IMBMIC PingPong : -np 1 -host $host2-mic1 $IMBMIC > dapl_mic1_remote-mic1
mpirun.mic -np 1 -host $host1-mic0 -genv I_MPI_FABRICS=dapl $IMBMIC PingPong : -np 1 -host $host1-mic0 $IMBMIC > dapl_mic0_mic0
mpirun.mic -np 1 -host $host1-mic0 -genv I_MPI_FABRICS=shm $IMBMIC PingPong : -np 1 -host $host1-mic0 $IMBMIC > shm_mic0_mic0
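
Each output file contains the standard IMB-MPI1 PingPong table (columns #bytes, #repetitions, t[usec], Mbytes/sec). The following post-processing sketch extracts the zero-byte latency and the bandwidth at the largest message size from every file; the column layout is assumed from the IMB 4.0 PingPong output and the file globs match the names used above:

for f in cpu_* mic0_* dapl_* shm_*; do
    # zero-byte latency in microseconds (first data row of the table)
    lat=$(awk 'NF==4 && $1=="0" {print $3; exit}' "$f")
    # bandwidth at the last (largest) message size, in Mbytes/sec
    bw=$(awk 'NF==4 && $1 ~ /^[0-9]+$/ {bw=$4} END {print bw}' "$f")
    printf "%-24s latency: %s usec   bandwidth: %s MB/s\n" "$f" "$lat" "$bw"
done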
