Hi, I would like the hardware video acceleration drivers to switch along with the graphics card whenever I change which card I am using. First, though, I need to figure out why setting LIBVA_DRIVER_NAME and VDPAU_DRIVER to the values recommended for NVIDIA causes a floating point exception. I'm hoping someone can provide some guidance on these matters.
The hardware is a Dell XPS 15 7590 (2019) with integrated Intel UHD Graphics 630 and a discrete NVIDIA GeForce GTX 1650 4GB GDDR5.
I was able to successfully set up NVIDIA Optimus with bumblebee using information from the Dell XPS 15 9570 wiki article and the final solution posted in this forum post. So, when I turn on my laptop and X starts, the NVIDIA card is not enabled and the nvidia drivers are not loaded; the system uses the integrated Intel graphics. I can then run an enableGpu.sh script, which enables the NVIDIA card and loads the nvidia drivers, and a disableGpu.sh script which reverses the process. powertop shows that there is definitely a difference in power consumption between the two scenarios.
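Roughly, the enable script does something like the following (a simplified sketch, assuming the bbswitch interface from the bumblebee setup; the exact paths and module names may differ on other systems):

```shell
# Simplified sketch of enableGpu.sh; assumes bumblebee's bbswitch module.
# Run as root; disableGpu.sh does the same steps in reverse with "OFF".
enable_gpu() {
    # Power the discrete card on through bbswitch, if its interface exists.
    if [ -w /proc/acpi/bbswitch ]; then
        echo ON > /proc/acpi/bbswitch
    else
        echo "bbswitch interface not available" >&2
    fi
    # Load the NVIDIA kernel modules.
    for mod in nvidia nvidia_modeset nvidia_drm nvidia_uvm; do
        modprobe "$mod" 2>/dev/null || echo "could not load $mod" >&2
    done
}

enable_gpu
```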
I was also able to set up hardware video acceleration following the guidance in the Arch Wiki article. I have the intel-media-driver, nvidia-utils, libvdpau-va-gl, libva-vdpau-driver, and vdpauinfo packages installed. For it to work, though, I have to set the LIBVA_DRIVER_NAME and VDPAU_DRIVER variables. I currently set them for the Intel graphics in my /etc/environment file:
LIBVA_DRIVER_NAME=iHD
VDPAU_DRIVER=va_gl
When I run vainfo, I get the following output:
$ vainfo
vainfo: VA-API version: 1.5 (libva 2.5.0)
vainfo: Driver version: Intel iHD driver - 1.0.0
vainfo: Supported profile and entrypoints
VAProfileNone : VAEntrypointVideoProc
VAProfileNone : VAEntrypointStats
VAProfileMPEG2Simple : VAEntrypointVLD
VAProfileMPEG2Simple : VAEntrypointEncSlice
VAProfileMPEG2Main : VAEntrypointVLD
VAProfileMPEG2Main : VAEntrypointEncSlice
VAProfileH264Main : VAEntrypointVLD
VAProfileH264Main : VAEntrypointEncSlice
VAProfileH264Main : VAEntrypointFEI
VAProfileH264Main : VAEntrypointEncSliceLP
VAProfileH264High : VAEntrypointVLD
VAProfileH264High : VAEntrypointEncSlice
VAProfileH264High : VAEntrypointFEI
VAProfileH264High : VAEntrypointEncSliceLP
VAProfileVC1Simple : VAEntrypointVLD
VAProfileVC1Main : VAEntrypointVLD
VAProfileVC1Advanced : VAEntrypointVLD
VAProfileJPEGBaseline : VAEntrypointVLD
VAProfileJPEGBaseline : VAEntrypointEncPicture
VAProfileH264ConstrainedBaseline: VAEntrypointVLD
VAProfileH264ConstrainedBaseline: VAEntrypointEncSlice
VAProfileH264ConstrainedBaseline: VAEntrypointFEI
VAProfileH264ConstrainedBaseline: VAEntrypointEncSliceLP
VAProfileVP8Version0_3 : VAEntrypointVLD
VAProfileVP8Version0_3 : VAEntrypointEncSlice
VAProfileHEVCMain : VAEntrypointVLD
VAProfileHEVCMain : VAEntrypointEncSlice
VAProfileHEVCMain : VAEntrypointFEI
VAProfileHEVCMain10 : VAEntrypointVLD
VAProfileHEVCMain10 : VAEntrypointEncSlice
VAProfileVP9Profile0 : VAEntrypointVLD
VAProfileVP9Profile2 : VAEntrypointVLD
I then tested it out by playing back a video using mpv with the following results:
$ mpv --hwdec=auto Videos/GX010323.MP4
Playing: Videos/GX010323.MP4
[ffmpeg/demuxer] mov,mp4,m4a,3gp,3g2,mj2: Using non-standard frame rate 59/1
(+) Video --vid=1 (*) (hevc 3840x2160 59.940fps)
(+) Audio --aid=1 --alang=eng (*) (aac 2ch 48000Hz)
[vo/gpu/vaapi-egl] vaAcquireSurfaceHandle() failed (invalid VASurfaceID)
[vo/gpu/vaapi-egl] vaAcquireSurfaceHandle() failed (invalid VASurfaceID)
[vo/gpu/vaapi-egl] vaAcquireSurfaceHandle() failed (invalid VASurfaceID)
[vo/gpu/vaapi-egl] vaAcquireSurfaceHandle() failed (invalid VASurfaceID)
[vo/gpu/vaapi-egl] vaAcquireSurfaceHandle() failed (invalid VASurfaceID)
[vo/gpu/vaapi-egl] vaAcquireSurfaceHandle() failed (invalid VASurfaceID)
[vo/gpu/vaapi-egl] vaAcquireSurfaceHandle() failed (invalid VASurfaceID)
[vo/gpu/cuda-nvdec] cu->cuInit(0) failed -> CUDA_ERROR_NO_DEVICE: no CUDA-capable device is detected
[vo/gpu/cuda-nvdec] cu->cuCtxPopCurrent(&dummy) failed -> CUDA_ERROR_NOT_INITIALIZED: initialization error
Using hardware decoding (vaapi).
VO: [gpu] 3840x2160 vaapi[nv12]
AO: [pulse] 48000Hz stereo 2ch float
AV: 00:00:25 / 00:00:25 (99%) A-V: 0.000
Exiting... (End of file)
It does seem to be using hardware video acceleration, but it also prints some errors along the way.
When I switch to NVIDIA graphics, I change the /etc/environment file to contain the following variables instead:
LIBVA_DRIVER_NAME=vdpau
VDPAU_DRIVER=nvidia
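Concretely, switching just means rewriting that file with the other pair of values; something like this hypothetical helper (the values are the ones above, the helper itself is only a sketch, with the target path as a parameter for illustration):

```shell
# Hypothetical helper that rewrites an environment file for one card or the
# other; the variable values come from the setup above, the rest is a sketch.
set_gpu_env() {
    # $1: "intel" or "nvidia"; $2: target file (normally /etc/environment)
    target="${2:-/etc/environment}"
    case "$1" in
        intel)  printf 'LIBVA_DRIVER_NAME=iHD\nVDPAU_DRIVER=va_gl\n'    > "$target" ;;
        nvidia) printf 'LIBVA_DRIVER_NAME=vdpau\nVDPAU_DRIVER=nvidia\n' > "$target" ;;
        *)      echo "usage: set_gpu_env intel|nvidia [file]" >&2; return 1 ;;
    esac
}
```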
I can confirm that the nvidia drivers are loaded with:
$ lsmod | grep nvidia
nvidia_uvm 978944 0
nvidia_drm 57344 0
nvidia_modeset 1118208 1 nvidia_drm
nvidia 18878464 2 nvidia_uvm,nvidia_modeset
ipmi_msghandler 69632 1 nvidia
drm_kms_helper 225280 2 nvidia_drm,i915
drm 499712 9 drm_kms_helper,nvidia_drm,i915
I then source the new environment file. When I run vainfo afterwards, I get the following exception:
$ vainfo
Floating point exception (core dumped)
Running vdpauinfo also produces an error:
$ vdpauinfo
display: :1 screen: 0
Error creating VDPAU device: 1
Unsurprisingly, when I tried to play back a video with mpv, I also got an error and the video did not even start.
$ mpv --hwdec=auto Videos/GX010323.MP4
Playing: Videos/GX010323.MP4
[ffmpeg/demuxer] mov,mp4,m4a,3gp,3g2,mj2: Using non-standard frame rate 59/1
(+) Video --vid=1 (*) (hevc 3840x2160 59.940fps)
(+) Audio --aid=1 --alang=eng (*) (aac 2ch 48000Hz)
Xlib: extension "NV-GLX" missing on display ":1".
Xlib: extension "NV-GLX" missing on display ":1".
Floating point exception (core dumped)
However, if I manually set the hwdec value to nvdec-copy, the video plays back flawlessly without any errors:
$ mpv --hwdec=nvdec-copy Videos/GX010323.MP4
Playing: Videos/GX010323.MP4
[ffmpeg/demuxer] mov,mp4,m4a,3gp,3g2,mj2: Using non-standard frame rate 59/1
(+) Video --vid=1 (*) (hevc 3840x2160 59.940fps)
(+) Audio --aid=1 --alang=eng (*) (aac 2ch 48000Hz)
Using hardware decoding (nvdec-copy).
VO: [gpu] 3840x2160 nv12
AO: [pulse] 48000Hz stereo 2ch float
AV: 00:00:25 / 00:00:25 (99%) A-V: 0.000 Dropped: 1
Exiting... (End of file)
My first question: does anyone know why the NVIDIA configuration makes libva crash? Second, is there a better place to set LIBVA_DRIVER_NAME and VDPAU_DRIVER than /etc/environment? The system doesn't seem to notice when those values change: in a terminal I have to source /etc/environment just to pick up the new values, and another terminal still sees the original ones. I would like the system to become aware of changes to those variables immediately, if possible.
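As I understand it, /etc/environment is only read by pam_env at login, which would explain why existing sessions keep the old values. One workaround I am considering is scoping the variables to a single command instead of the whole session (a sketch; with_nvidia_va is just a made-up name):

```shell
# Sketch: run a single command with the NVIDIA VA-API/VDPAU drivers selected,
# without touching /etc/environment. with_nvidia_va is a made-up name.
with_nvidia_va() {
    LIBVA_DRIVER_NAME=vdpau VDPAU_DRIVER=nvidia "$@"
}

# e.g.: with_nvidia_va mpv --hwdec=auto Videos/GX010323.MP4
```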
Thank you for any help!