Hello, I'm trying to compile a project on Arch Linux that requires nvhpc (https://developer.nvidia.com/hpc-sdk-downloads).
In some cases, nvhpc reports an error like:
nvc-Error-x86_64-v2 not supported - x86_64-v3 is minimum supported
I'm using an i7-11700KF, so I'm sure my CPU is new enough to support x86_64-v3, but it seems like nvc is trying to compile for v2.
This error doesn't show up on Ubuntu 22.04, only on Arch Linux, and I don't know what causes it.
The project I'm actually trying to compile is the NEURON simulator: https://nrn.readthedocs.io/en/latest/coreneuron/installation.html#installing-from-source
which is where I hit this problem.
But that project is way too big and complex, so here is a smaller project (mpi4py) that reproduces the same error:
1. install nvhpc:
wget https://developer.download.nvidia.com/hpc-sdk/24.11/nvhpc_2024_2411_Linux_x86_64_cuda_12.6.tar.gz
tar xpzf nvhpc_2024_2411_Linux_x86_64_cuda_12.6.tar.gz
nvhpc_2024_2411_Linux_x86_64_cuda_12.6/install
export PATH=/opt/nvidia/hpc_sdk/Linux_x86_64/24.11/compilers/bin:$PATH
export MANPATH=/opt/nvidia/hpc_sdk/Linux_x86_64/24.11/compilers/man:$MANPATH
export PATH=/opt/nvidia/hpc_sdk/Linux_x86_64/24.11/comm_libs/mpi/bin:$PATH
2. make a Python venv and compile mpi4py in it:
pacman -S python-pip
python -m venv pyvenv
source pyvenv/bin/activate
pip install mpi4py
And we get this error output:
(pyvenv) [root@72a5bfd88235 ~]# pip install mpi4py
Looking in indexes: https://mirrors.pku.edu.cn/pypi/web/simple
Collecting mpi4py
Using cached https://mirrors.pku.edu.cn/pypi/web/packages/08/34/8499a92a387d24d0092c38089f8195f13c5c76f0f814126af3fe363e5636/mpi4py-4.0.1.tar.gz (466 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Installing backend dependencies ... done
Preparing metadata (pyproject.toml) ... done
Building wheels for collected packages: mpi4py
Building wheel for mpi4py (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for mpi4py (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [83 lines of output]
running bdist_wheel
running build
running build_src
using Cython 3.0.11
cythonizing 'src/mpi4py/MPI.pyx' -> 'src/mpi4py/MPI.c'
running build_py
creating build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/__init__.py -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/__main__.py -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/bench.py -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/run.py -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/typing.py -> build/lib.linux-x86_64-cpython-313/mpi4py
creating build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/__init__.py -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/__main__.py -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/_base.py -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/_core.py -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/aplus.py -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/pool.py -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/server.py -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/util.py -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
creating build/lib.linux-x86_64-cpython-313/mpi4py/util
copying src/mpi4py/util/__init__.py -> build/lib.linux-x86_64-cpython-313/mpi4py/util
copying src/mpi4py/util/dtlib.py -> build/lib.linux-x86_64-cpython-313/mpi4py/util
copying src/mpi4py/util/pkl5.py -> build/lib.linux-x86_64-cpython-313/mpi4py/util
copying src/mpi4py/util/pool.py -> build/lib.linux-x86_64-cpython-313/mpi4py/util
copying src/mpi4py/util/sync.py -> build/lib.linux-x86_64-cpython-313/mpi4py/util
copying src/mpi4py/MPI.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/__init__.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/__main__.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/bench.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/run.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/typing.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/py.typed -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/MPI.pxd -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/__init__.pxd -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/libmpi.pxd -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/MPI.h -> build/lib.linux-x86_64-cpython-313/mpi4py
copying src/mpi4py/MPI_api.h -> build/lib.linux-x86_64-cpython-313/mpi4py
creating build/lib.linux-x86_64-cpython-313/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi4py.h -> build/lib.linux-x86_64-cpython-313/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/pycapi.h -> build/lib.linux-x86_64-cpython-313/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi4py.i -> build/lib.linux-x86_64-cpython-313/mpi4py/include/mpi4py
copying src/mpi4py/include/mpi4py/mpi.pxi -> build/lib.linux-x86_64-cpython-313/mpi4py/include/mpi4py
copying src/mpi4py/futures/__init__.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/__main__.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/_base.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/_core.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/aplus.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/pool.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/server.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/futures/util.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/futures
copying src/mpi4py/util/__init__.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/util
copying src/mpi4py/util/dtlib.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/util
copying src/mpi4py/util/pkl5.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/util
copying src/mpi4py/util/pool.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/util
copying src/mpi4py/util/sync.pyi -> build/lib.linux-x86_64-cpython-313/mpi4py/util
running build_ext
MPI configuration: [mpi] from 'mpi.cfg'
MPI C compiler: /opt/nvidia/hpc_sdk/Linux_x86_64/24.11/comm_libs/mpi/bin/mpicc
MPI C++ compiler: /opt/nvidia/hpc_sdk/Linux_x86_64/24.11/comm_libs/mpi/bin/mpicxx
checking for MPI compile and link ...
/opt/nvidia/hpc_sdk/Linux_x86_64/24.11/comm_libs/mpi/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O3 -Wall -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=3 -Wformat -Werror=format-security -fstack-clash-protection -fcf-protection -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -g -ffile-prefix-map=/build/python/src=/usr/src/debug/python -flto=auto -ffat-lto-objects -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=3 -Wformat -Werror=format-security -fstack-clash-protection -fcf-protection -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -g -ffile-prefix-map=/build/python/src=/usr/src/debug/python -flto=auto -march=x86-64 -mtune=generic -O3 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=3 -Wformat -Werror=format-security -fstack-clash-protection -fcf-protection -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -g -ffile-prefix-map=/build/python/src=/usr/src/debug/python -flto=auto -fPIC -I/root/pyvenv/include -I/usr/include/python3.13 -c _configtest.c -o _configtest.o
nvc-Error-x86_64-v2 not supported - x86_64-v3 is minimum supported
nvc-Error-x86_64-v2 not supported - x86_64-v3 is minimum supported
nvc-Error-x86_64-v2 not supported - x86_64-v3 is minimum supported
failure.
removing: _configtest.c _configtest.o
error: Cannot compile MPI programs. Check your configuration!!!
Installing mpi4py requires a working MPI implementation.
If you are running on a supercomputer or cluster, check with
the system administrator or refer to the system user guide.
Otherwise, if you are running on a laptop or desktop computer,
your may be missing the MPICH or Open MPI development package:
* On Fedora/RHEL systems, run:
$ sudo dnf install mpich-devel # for MPICH
$ sudo dnf install openmpi-devel # for Open MPI
* On Debian/Ubuntu systems, run:
$ sudo apt install libmpich-dev # for MPICH
$ sudo apt install libopenmpi-dev # for Open MPI
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for mpi4py
Failed to build mpi4py
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (mpi4py)
I'm wondering if it's the -march=x86-64 -mtune=generic flags that cause this problem.
Maybe the compiler somehow thinks it needs to compile for x86-64-v2.
But why does this error only show up on Arch Linux and not on Ubuntu?
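A minimal check I can think of (just a sketch, assuming the flags really reach nvc unchanged through the mpicc wrapper) is to hand nvc the same -march directly:
echo 'int main(void){return 0;}' > t.c
# if this alone triggers the "x86_64-v2 not supported" error, the culprit is the
# distro's -march=x86-64 default rather than anything in mpi4py or Open MPI
nvc -march=x86-64 -c t.c -o t.o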
Offline
The default /etc/makepkg.conf is set to x86-64. You can read more about Arch's plans for x86_64-v3 here.
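The relevant part of the stock /etc/makepkg.conf looks roughly like this (exact flag list varies between pacman releases; hardening flags omitted here):
CFLAGS="-march=x86-64 -mtune=generic -O2 -pipe -fno-plt"
CXXFLAGS="$CFLAGS"
The repo python package is built with those flags, and setuptools reuses the interpreter's recorded CFLAGS when it compiles C extensions, which is how -march=x86-64 ends up in a pip build even though pip never reads makepkg.conf.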
Offline
The default /etc/make.conf is set to x86-64. You can read more about Arch's plans for x86_64-v3 here.
1. I don't see /etc/make.conf on my PC; when I googled it, I only found that Gentoo has /etc/make.conf. I created it anyway and wrote:
COMMON_FLAGS="-march=native -O2 -pipe"
CFLAGS="${COMMON_FLAGS}"
CXXFLAGS="${COMMON_FLAGS}"
Well, same error.
2. If you mean /etc/makepkg.conf: my project isn't compiled with makepkg. I changed it to `-march=native` anyway, but the same error still occurs.
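As far as I can tell, pip/setuptools never looks at makepkg.conf anyway; it reuses the CFLAGS that were recorded when the repo python package was built, which can be checked with something like:
# show the flags the Arch python interpreter was built with -- these are what
# setuptools hands to the (mpi)cc wrapper when building the extension
python -c "import sysconfig; print(sysconfig.get_config_var('CFLAGS'))"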
Offline
Why don't you just get it from the AUR instead of compiling from source?
Offline
Try building https://aur.archlinux.org/packages/nvhpc .
If you're new to Arch, start by reading https://wiki.archlinux.org/title/Arch_build_system and https://wiki.archlinux.org/title/Arch_User_Repository
Moderator Note: Moving to AUR issues
Online
If you must use pip in a venv, you can try passing something like:
pip install <PKG> --config-settings "CFLAGS=-march=x86_64-v3"
or
CFLAGS="-march=x86_64-v3" pip install <PKG>
Offline
Try building https://aur.archlinux.org/packages/nvhpc .
If you're new to Arch, start by reading https://wiki.archlinux.org/title/Arch_build_system and https://wiki.archlinux.org/title/Arch_User_Repository
Moderator Note: Moving to AUR issues
Sorry, but maybe you didn't understand my question.
I have no trouble getting nvhpc.
But my project needs nvhpc to compile, and the project isn't in the AUR and shouldn't be.
What I really want to build is https://github.com/neuronsimulator/nrn
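If it matters, one CMake-side workaround I could try (just a sketch using generic CMake variables, nothing NEURON-specific; it may or may not help depending on where the -march=x86-64 gets injected) is to force an explicit arch for the compilers:
cmake .. -DCMAKE_C_COMPILER=nvc -DCMAKE_CXX_COMPILER=nvc++ \
      -DCMAKE_C_FLAGS="-march=native" -DCMAKE_CXX_FLAGS="-march=native"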
Offline
If you must use pip in a venv, you can try passing something like:
pip install <PKG> --config-settings "CFLAGS=-march=x86_64-v3"
or
CFLAGS="-march=x86_64-v3" pip install <PKG>
Sorry, but I might not have described my problem right.
I want to compile the NEURON simulator but ran into trouble: https://github.com/neuronsimulator/nrn/issues/3313
mpi4py is just another project that hits a similar error.
Offline
For clarity: where do your nvhpc & mpi4py come from?
Online
impossibleveins23 wrote:
If you must use pip in a venv, you can try passing something like:
pip install <PKG> --config-settings "CFLAGS=-march=x86_64-v3"
or
CFLAGS="-march=x86_64-v3" pip install <PKG>
Sorry, but I might not have described my problem right.
I want to compile the NEURON simulator but ran into trouble: https://github.com/neuronsimulator/nrn/issues/3313
mpi4py is just another project that hits a similar error.
Why not install neuron simulator through python?
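i.e. something like this (assuming the prebuilt NEURON wheels on PyPI, published under the name neuron, are enough for your use case; they ship their own compiled binaries, so nvhpc never gets involved):
python -m venv pyvenv
source pyvenv/bin/activate
pip install neuron    # prebuilt wheel from PyPI, no local nvc compile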
Offline