This post explains how I installed the Intel MPI Benchmarks on Fedora 40 and which errors I ran into along the way. Installation on Ubuntu 22.04 is straightforward; if Ubuntu also works for you, simply execute the commands given in this link.
In short, you do the normal installation, but then you need to create two symbolic links and make a very small change in the code to remove the `register` specifier.
I first cloned this repository: https://github.com/intel/mpi-benchmarks and then ran `make` to compile. This page shows the steps needed before `make` works properly. OK, let's get started:
I installed the HPC toolkit from this link: https://www.intel.com/content/www/us/en/developer/tools/oneapi/hpc-toolkit-download.html?operatingsystem=linux&linux-install-type=dnf with these commands:
tee > /tmp/oneAPI.repo << EOF
[oneAPI]
name=Intel® oneAPI repository
baseurl=https://yum.repos.intel.com/oneapi
enabled=1
gpgcheck=1
repo_gpgcheck=1
gpgkey=https://yum.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
EOF
sudo mv /tmp/oneAPI.repo /etc/yum.repos.d
sudo dnf install intel-hpckit
Compiling didn't work, though; `make` raised this error:
make -C src_cpp -f Makefile TARGET=MPI1
make -C src_cpp -f Makefile TARGET=NBC
make -C src_cpp -f Makefile TARGET=RMA
make -C src_cpp -f Makefile TARGET=EXT
make[1]: Entering directory '/home/afeser/mpi-benchmarks/src_cpp'
make[1]: Entering directory '/home/afeser/mpi-benchmarks/src_cpp'
make[1]: Entering directory '/home/afeser/mpi-benchmarks/src_cpp'
make[1]: Entering directory '/home/afeser/mpi-benchmarks/src_cpp'
mpiicpc -Ihelpers -I../src_c -DMPI1 -I. -g -O0 -Wall -Wextra -pedantic -Wno-long-long -c -o imb.o imb.cpp
mpiicpc -Ihelpers -I../src_c -DEXT -I. -g -O0 -Wall -Wextra -pedantic -Wno-long-long -c -o imb.o imb.cpp
mpiicpc -Ihelpers -I../src_c -DNBC -I. -g -O0 -Wall -Wextra -pedantic -Wno-long-long -c -o imb.o imb.cpp
mpiicpc -Ihelpers -I../src_c -DRMA -I. -g -O0 -Wall -Wextra -pedantic -Wno-long-long -c -o imb.o imb.cpp
/opt/intel/oneapi/mpi/2021.13/bin/mpiicpx: line 559: icpc: command not found
/opt/intel/oneapi/mpi/2021.13/bin/mpiicpx: line 559: icpc: command not found
/opt/intel/oneapi/mpi/2021.13/bin/mpiicpx: line 559: icpc: command not found
make[1]: *** [Makefile:169: imb.o] Error 127
make[1]: *** [Makefile:169: imb.o] Error 127
make[1]: Leaving directory '/home/afeser/mpi-benchmarks/src_cpp'
make[1]: Leaving directory '/home/afeser/mpi-benchmarks/src_cpp'
/opt/intel/oneapi/mpi/2021.13/bin/mpiicpx: line 559: icpc: command not found
make[1]: *** [Makefile:169: imb.o] Error 127
make[1]: Leaving directory '/home/afeser/mpi-benchmarks/src_cpp'
make: *** [Makefile:44: IMB-EXT] Error 2
mpiicpc -Ihelpers -I../src_c -DNBC -I. -g -O0 -Wall -Wextra -pedantic -Wno-long-long -c -o args_parser.o args_parser.cpp
make: *** Waiting for unfinished jobs....
make: *** [Makefile:36: IMB-MPI1] Error 2
make: *** [Makefile:48: IMB-RMA] Error 2
make[1]: *** [Makefile:169: imb.o] Error 127
make[1]: *** Waiting for unfinished jobs....
/opt/intel/oneapi/mpi/2021.13/bin/mpiicpx: line 559: icpc: command not found
make[1]: *** [Makefile:169: args_parser.o] Error 127
make[1]: Leaving directory '/home/afeser/mpi-benchmarks/src_cpp'
make: *** [Makefile:40: IMB-NBC] Error 2
Then I tried the online installer, but it gave the same error. So I removed everything from /opt/intel using Intel's uninstallation method (just run the installer again) and started installing packages that might include icpc. I looked for the compiler packages and tried these:
# unfortunately, 'dnf provides' does not work here
sudo dnf install intel-dpcpp-cpp-compiler-2024.2
sudo dnf install intel-oneapi-compiler-dpcpp-cpp-runtime
sudo dnf install intel-oneapi-compiler-dpcpp-cpp-and-cpp-classic # this one works!
The last one did the trick. I then tried compiling again and got a very long output; I'm putting the first few lines here (lots of "identifier is undefined" errors follow):
make -C src_cpp -f Makefile TARGET=MPI1
make -C src_cpp -f Makefile TARGET=NBC
make -C src_cpp -f Makefile TARGET=RMA
make -C src_cpp -f Makefile TARGET=EXT
make[1]: Entering directory '/home/afeser/mpi-benchmarks/src_cpp'
make[1]: Entering directory '/home/afeser/mpi-benchmarks/src_cpp'
make[1]: Entering directory '/home/afeser/mpi-benchmarks/src_cpp'
make[1]: Entering directory '/home/afeser/mpi-benchmarks/src_cpp'
mpiicpc -Ihelpers -I../src_c -DNBC -I. -g -O0 -Wall -Wextra -pedantic -Wno-long-long -c -o imb.o imb.cpp
mpiicpc -Ihelpers -I../src_c -DRMA -I. -g -O0 -Wall -Wextra -pedantic -Wno-long-long -c -o imb.o imb.cpp
mpiicpc -Ihelpers -I../src_c -DMPI1 -I. -g -O0 -Wall -Wextra -pedantic -Wno-long-long -c -o imb.o imb.cpp
mpiicpc -Ihelpers -I../src_c -DEXT -I. -g -O0 -Wall -Wextra -pedantic -Wno-long-long -c -o imb.o imb.cpp
icpc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
icpc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
icpc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
icpc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
In file included from /usr/include/c++/14/cwchar(44),
In file included from /usr/include/c++/14/cwchar(44),
from /usr/include/c++/14/bits/postypes.h(40),
In file included from /usr/include/c++/14/cwchar(44),
from /usr/include/c++/14/bits/char_traits.h(42),
from /usr/include/c++/14/string(42),
In file included from /usr/include/c++/14/cwchar(44),
from /usr/include/c++/14/stdexcept(39),
from /usr/include/c++/14/bits/postypes.h(40),
from imb.cpp(34):
from /usr/include/c++/14/bits/postypes.h(40),
from /usr/include/c++/14/bits/char_traits.h(42),
from /usr/include/c++/14/bits/char_traits.h(42),
from /usr/include/c++/14/string(42),
from /usr/include/c++/14/stdexcept(39),
from /usr/include/c++/14/string(42),
from imb.cpp(34):
from /usr/include/c++/14/stdexcept(39),
from imb.cpp(34):
from /usr/include/c++/14/bits/postypes.h(40),
from /usr/include/c++/14/bits/char_traits.h(42),
from /usr/include/c++/14/string(42),
from /usr/include/c++/14/stdexcept(39),
from imb.cpp(34):
/usr/include/wchar.h(422): error: identifier "_Float32" is undefined
extern _Float32 wcstof32 (const wchar_t *__restrict __nptr,
^
This makes me think the MPI wrappers are quite old (from 2021) and therefore cannot use icpx, which replaces icpc. https://www.intel.com/content/www/us/en/developer/articles/release-notes/mpi-library-release-notes.html
I realized someone had a similar problem here https://community.intel.com/t5/Intel-MPI-Library/Does-icpx-has-backward-compatibility-for-icpc/m-p/1248814/constants and they said icpx is almost backward compatible with icpc. So I decided to try creating a link. But icpx was also not available at this point, so I autoremoved and reinstalled the toolkit, this time also including the base kit given here https://www.intel.com/content/www/us/en/developer/tools/oneapi/hpc-toolkit-download.html?operatingsystem=linux&linux-install-type=apt
sudo dnf autoremove intel-hpckit -y
sudo dnf install intel-hpckit -y
tee > /tmp/oneAPI.repo << EOF
[oneAPI]
name=Intel® oneAPI repository
baseurl=https://yum.repos.intel.com/oneapi
enabled=1
gpgcheck=1
repo_gpgcheck=1
gpgkey=https://yum.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB
EOF
sudo mv /tmp/oneAPI.repo /etc/yum.repos.d
sudo dnf install intel-basekit -y
source /opt/intel/oneapi/setvars.sh --force
Then, I created the links:
sudo ln -s /opt/intel/oneapi/compiler/2024.2/bin/icpx /opt/intel/oneapi/compiler/2024.2/bin/icpc
sudo ln -s /opt/intel/oneapi/compiler/2024.2/bin/icx /opt/intel/oneapi/compiler/2024.2/bin/icc
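The mechanism here is simple: invoking `icpc` through the symlink actually executes `icpx`. A toy illustration of the same trick with throwaway paths (a stand-in script in /tmp, not the real compilers):

```shell
# Create a stand-in "icpx" script and point an "icpc" symlink at it;
# calling icpc then runs the icpx script. Paths are temporary/hypothetical.
mkdir -p /tmp/shim-demo
printf '#!/bin/sh\necho "invoked as $0"\n' > /tmp/shim-demo/icpx
chmod +x /tmp/shim-demo/icpx
ln -sf /tmp/shim-demo/icpx /tmp/shim-demo/icpc
/tmp/shim-demo/icpc   # prints: invoked as /tmp/shim-demo/icpc
```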
Then it almost went through to the end, but there were errors about the `register` storage class specifier in the code, like these:
../src_c/IMB_prototypes.h:636:22: error: ISO C++17 does not allow 'register' storage class specifier [-Wregister]
  636 | long IMB_compute_crc(register char* buf, register size_t size);
      |                      ^~~~~~~~
../src_c/IMB_prototypes.h:636:42: error: ISO C++17 does not allow 'register' storage class specifier [-Wregister]
  636 | long IMB_compute_crc(register char* buf, register size_t size);
      |                                          ^~~~~~~~
In file included from NBC/NBC_suite.cpp:52:
../src_c/IMB_prototypes.h:636:22: error: ISO C++17 does not allow 'register' storage class specifier [-Wregister]
  636 | long IMB_compute_crc(register char* buf, register size_t size);
      |                      ^~~~~~~~
../src_c/IMB_prototypes.h:636:42: error: ISO C++17 does not allow 'register' storage class specifier [-Wregister]
  636 | long IMB_compute_crc(register char* buf, register size_t size);
      |                                          ^~~~~~~~
In file included from RMA/RMA_suite.cpp:52:
../src_c/IMB_prototypes.h:636:22: error: ISO C++17 does not allow 'register' storage class specifier [-Wregister]
  636 | long IMB_compute_crc(register char* buf, register size_t size);
      |                      ^~~~~~~~
../src_c/IMB_prototypes.h:636:42: error: ISO C++17 does not allow 'register' storage class specifier [-Wregister]
  636 | long IMB_compute_crc(register char* buf, register size_t size);
      |                                          ^~~~~~~~
Since modern compilers (GCC and Clang included) simply ignore `register`, and C++17 removed it entirely, deleting it changes nothing, so I removed the `register` keywords at these two locations:
../src_c/IMB_prototypes.h:636:22
../src_c/IMB_prototypes.h:636:42
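Instead of editing by hand, the same fix can be scripted with sed. A minimal sketch; the demo below works on a throwaway copy in /tmp (a hypothetical path) so you can see the effect before touching the real src_c/IMB_prototypes.h in your clone:

```shell
# Demo on a throwaway file: strip the 'register' qualifiers from a
# prototype like the one at IMB_prototypes.h:636. For the real tree,
# run the same sed on src_c/IMB_prototypes.h inside your clone.
cat > /tmp/IMB_prototypes_demo.h << 'EOF'
long IMB_compute_crc(register char* buf, register size_t size);
EOF
sed -i 's/\bregister //g' /tmp/IMB_prototypes_demo.h
cat /tmp/IMB_prototypes_demo.h
# -> long IMB_compute_crc(char* buf, size_t size);
```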
Then, I compiled, and finally it
WORKED!
Then, simply run the benchmarks as usual:
mpirun -np 2 ./IMB-MPI1