#1 2017-09-13 15:48:33

kokoko3k
Member
Registered: 2008-11-14
Posts: 2,390

[SOLVED] Use a real X nvidia display after using the same gpu in a VM.

I'm having fun with KVM and GPU passthrough: I'm able to pass my GTX 750 Ti to the VM and even get it back, ready to use via bumblebee/primus, and it works very well.
The main display is driven by the integrated Intel GPU.

Now I'm trying to set up a second X server driven by the nvidia driver; it mostly works, but I get no 3D acceleration there (glxinfo says I'm using LLVMpipe).
(The VM is switched off and primusrun still works.)
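
For reference, a quick way to compare the two rendering paths is to check the OpenGL renderer string on each (the :1 display number is just an example, not necessarily the one used here):

    # Through bumblebee/primus (this path works):
    primusrun glxinfo | grep "OpenGL renderer"

    # On the second, nvidia-driven X server:
    DISPLAY=:1 glxinfo | grep "OpenGL renderer"
    # hardware GL should report the GeForce; "llvmpipe" means software rendering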

Some info about installed packages:

koko@Gozer# pacman -Q |grep 'mesa\|nvidia\|libgl'
lib32-libglvnd 0.2.999+g4ba53457-2
lib32-mesa 17.1.4-1
lib32-nvidia-utils 381.22-1
lib32-opencl-nvidia 381.22-1
libglvnd 0.2.999+g4ba53457-2
mesa 17.1.8-1
mesa-demos 8.3.0-2
mesa-vdpau 17.1.4-1
nvidia-dkms 381.22-4
nvidia-settings 381.22-2
nvidia-utils 381.22-1
opencl-nvidia 381.22-1

The X server configuration I use to start the display driven by the nvidia driver:

#xgame.singlescreen.xorg.conf
Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
    Option         "Xinerama" "0"
    Option         "AutoAddGPU" "false" #serve per non abilitare la scheda intel (se abilitata nel bios)
EndSection

Section "Files"
EndSection

Section "ServerFlags"
    Option         "IgnoreABI" "True"
EndSection

Section "InputDevice"
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Device"

#   Driver      "vesa"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce 9500 GT"
    Option         "Coolbits" "1"
#    Option         "DynamicTwinView" "false"
#       Option                  "TripleBuffer" "1"
        Option                  "UseEvents" "true"
    BusID          "PCI:1:0:0"
    Screen          0
EndSection

Section "Screen"
    Identifier      "Screen0"
    Device          "Device0"
    Monitor         "Monitor0"
    DefaultDepth    24
    Option          "TwinView" "0"
    Option          "Stereo" "0"
    SubSection      "Display"
        Depth       24
    EndSubSection
        Option "ModeValidation" "AllowNon60hzmodesDFPModes, NoEDIDDFPMaxSizeCheck, NoVertRefreshCheck, NoHorizSyncCheck, NoDFPNativeResolutionCheck, NoMaxSizeCheck, NoMaxPClkCheck,  AllowNonEdidModes, NoEdidMaxPClkCheck"
EndSection

I thought that libglvnd would solve the issue of having multiple libgl* packages installed, but it does not.

I'm confident that it is only a matter of not using the right libraries, but how to proceed?

Thanks.

-EDIT-
Indeed, /var/log/Xorg.1.log shows:

[108953.733] 
X.Org X Server 1.19.3
Release Date: 2017-03-15
[108953.733] X Protocol Version 11, Revision 0
[108953.733] Build Operating System: Linux 4.9.11-1-ARCH x86_64 
[108953.733] Current Operating System: Linux Gozer 4.11.9-1-ARCH #1 SMP PREEMPT Wed Jul 5 18:23:08 CEST 2017 x86_64
[108953.733] Kernel command line: BOOT_IMAGE=../vmlinuz-linux root=/dev/disk/by-uuid/4aa28877-74e1-46fd-96ba-34bee1c457cd rw resume=/dev/sdb2 vga=normal intel_iommu=on initrd=../intel-ucode.img,../initramfs-linux.img
[108953.733] Build Date: 07 April 2017  05:42:48PM
[108953.733]  
[108953.733] Current version of pixman: 0.34.0
[108953.733] 	Before reporting problems, check http://wiki.x.org
	to make sure that you have the latest version.
[108953.733] Markers: (--) probed, (**) from config file, (==) default setting,
	(++) from command line, (!!) notice, (II) informational,
	(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[108953.733] (==) Log file: "/var/log/Xorg.1.log", Time: Wed Sep 13 18:04:13 2017
[108953.734] (++) Using config file: "xgame.singlescreen.xorg.conf"
[108953.734] (==) Using config directory: "/etc/X11/xorg.conf.d"
[108953.734] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
[108953.734] (==) ServerLayout "Layout0"
[108953.734] (**) |-->Screen "Screen0" (0)
[108953.734] (**) |   |-->Monitor "<default monitor>"
[108953.734] (**) |   |-->Device "Device0"
[108953.734] (==) No monitor specified for screen "Screen0".
	Using a default monitor configuration.
[108953.734] (**) |-->Input Device "Keyboard0"
[108953.734] (**) |-->Input Device "Mouse0"
[108953.734] (**) Option "Xinerama" "0"
[108953.734] (**) Option "IgnoreABI" "True"
[108953.734] (**) Option "AutoAddGPU" "false"
[108953.734] (**) Ignoring ABI Version
[108953.734] (==) Automatically adding devices
[108953.734] (==) Automatically enabling devices
[108953.734] (**) Not automatically adding GPU devices
[108953.734] (==) Automatically binding GPU devices
[108953.734] (==) Max clients allowed: 256, resource mask: 0x1fffff
[108953.734] (WW) The directory "/usr/share/fonts/cantarell" does not exist.
[108953.734] 	Entry deleted from font path.
[108953.746] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/truetype".
[108953.746] 	Entry deleted from font path.
[108953.746] 	(Run 'mkfontdir' on "/usr/share/fonts/truetype").
[108953.747] (WW) The directory "/usr/share/fonts/zevv-peep" does not exist.
[108953.747] 	Entry deleted from font path.
[108953.747] (**) FontPath set to:
	/usr/share/fonts/100dpi,
	/usr/share/fonts/75dpi,
	/usr/share/fonts/cyrillic,
	/usr/share/fonts/encodings,
	/usr/share/fonts/misc,
	/usr/share/fonts/TTF,
	/usr/share/fonts/util,
	/usr/share/fonts/misc/,
	/usr/share/fonts/TTF/,
	/usr/share/fonts/OTF/,
	/usr/share/fonts/Type1/,
	/usr/share/fonts/100dpi/,
	/usr/share/fonts/75dpi/
[108953.747] (==) ModulePath set to "/usr/lib/xorg/modules"
[108953.747] (WW) Hotplugging is on, devices using drivers 'kbd', 'mouse' or 'vmmouse' will be disabled.
[108953.747] (WW) Disabling Keyboard0
[108953.747] (WW) Disabling Mouse0
[108953.747] (II) Loader magic: 0x822d60
[108953.747] (II) Module ABI versions:
[108953.747] 	X.Org ANSI C Emulation: 0.4
[108953.747] 	X.Org Video Driver: 23.0
[108953.747] 	X.Org XInput driver : 24.1
[108953.747] 	X.Org Server Extension : 10.0
[108953.748] (++) using VT number 1

[108953.748] (II) systemd-logind: logind integration requires -keeptty and -keeptty was not provided, disabling logind integration
[108953.748] (II) xfree86: Adding drm device (/dev/dri/card0)
[108953.748] (EE) /dev/dri/card0: failed to set DRM interface version 1.4: Permission denied
[108953.749] (--) PCI:*(0:0:2:0) 8086:0412:1043:8534 rev 6, Mem @ 0xf7400000/4194304, 0xd0000000/268435456, I/O @ 0x0000f000/64, BIOS @ 0x????????/131072
[108953.749] (--) PCI: (0:1:0:0) 10de:1380:1043:84bb rev 162, Mem @ 0xf6000000/16777216, 0xe0000000/268435456, 0xf0000000/33554432, I/O @ 0x0000e000/128, BIOS @ 0x????????/524288
[108953.749] (WW) Open ACPI failed (/var/run/acpid.socket) (No such file or directory)
[108953.749] (II) LoadModule: "glx"
[108953.750] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
[108953.750] (II) Module glx: vendor="X.Org Foundation"
[108953.750] 	compiled for 1.19.3, module version = 1.0.0
[108953.750] 	ABI class: X.Org Server Extension, version 10.0
[108953.750] (II) LoadModule: "nvidia"
[108953.750] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
[108953.750] (II) Module nvidia: vendor="NVIDIA Corporation"
[108953.750] 	compiled for 4.0.2, module version = 1.0.0
[108953.750] 	Module class: X.Org Video Driver
[108953.750] (II) NVIDIA dlloader X Driver  381.22  Wed May  3 23:53:41 PDT 2017
[108953.750] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[108954.138] (II) Loading sub module "fb"
[108954.138] (II) LoadModule: "fb"
[108954.138] (II) Loading /usr/lib/xorg/modules/libfb.so
[108954.139] (II) Module fb: vendor="X.Org Foundation"
[108954.139] 	compiled for 1.19.3, module version = 1.0.0
[108954.139] 	ABI class: X.Org ANSI C Emulation, version 0.4
[108954.139] (II) Loading sub module "wfb"
[108954.139] (II) LoadModule: "wfb"
[108954.139] (II) Loading /usr/lib/xorg/modules/libwfb.so
[108954.139] (II) Module wfb: vendor="X.Org Foundation"
[108954.139] 	compiled for 1.19.3, module version = 1.0.0
[108954.139] 	ABI class: X.Org ANSI C Emulation, version 0.4
[108954.139] (II) Loading sub module "ramdac"
[108954.139] (II) LoadModule: "ramdac"
[108954.139] (II) Module "ramdac" already built-in
[108954.139] (**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
[108954.139] (==) NVIDIA(0): RGB weight 888
[108954.139] (==) NVIDIA(0): Default visual is TrueColor
[108954.139] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
[108954.139] (**) NVIDIA(0): Option "Stereo" "0"
[108954.139] (**) NVIDIA(0): Option "ModeValidation" "AllowNon60hzmodesDFPModes, NoEDIDDFPMaxSizeCheck, NoVertRefreshCheck, NoHorizSyncCheck, NoDFPNativeResolutionCheck, NoMaxSizeCheck, NoMaxPClkCheck,  AllowNonEdidModes, NoEdidMaxPClkCheck"
[108954.139] (**) NVIDIA(0): Stereo disabled by request
[108954.139] (**) NVIDIA(0): Option "Coolbits" "1"
[108954.140] (**) NVIDIA(0): Enabling 2D acceleration
[108954.140] (EE) NVIDIA(0): Failed to initialize the GLX module; please check in your X
[108954.140] (EE) NVIDIA(0):     log file that the GLX module has been loaded in your X
[108954.140] (EE) NVIDIA(0):     server, and that the module is the NVIDIA GLX module.  If
[108954.140] (EE) NVIDIA(0):     you continue to encounter problems, Please try
[108954.140] (EE) NVIDIA(0):     reinstalling the NVIDIA driver.
[108954.483] (--) NVIDIA(0): Valid display device(s) on GPU-0 at PCI:1:0:0
[108954.483] (--) NVIDIA(0):     CRT-0
[108954.483] (--) NVIDIA(0):     DFP-0 (boot)
[108954.483] (--) NVIDIA(0):     DFP-1
[108954.483] (--) NVIDIA(0):     DFP-2
[108954.484] (II) NVIDIA(0): NVIDIA GPU GeForce GTX 750 Ti (GM107-A) at PCI:1:0:0 (GPU-0)
[108954.484] (--) NVIDIA(0): Memory: 2097152 kBytes
[108954.484] (--) NVIDIA(0): VideoBIOS: 82.07.32.00.20
[108954.484] (II) NVIDIA(0): Detected PCI Express Link width: 16X
[108954.485] (--) NVIDIA(GPU-0): CRT-0: disconnected
[108954.485] (--) NVIDIA(GPU-0): CRT-0: 400.0 MHz maximum pixel clock
[108954.485] (--) NVIDIA(GPU-0): 
[108954.492] (--) NVIDIA(GPU-0): Samsung SyncMaster (DFP-0): connected
[108954.492] (--) NVIDIA(GPU-0): Samsung SyncMaster (DFP-0): Internal TMDS
[108954.492] (--) NVIDIA(GPU-0): Samsung SyncMaster (DFP-0): 330.0 MHz maximum pixel clock
[108954.492] (--) NVIDIA(GPU-0): 
[108954.492] (--) NVIDIA(GPU-0): DFP-1: disconnected
[108954.492] (--) NVIDIA(GPU-0): DFP-1: Internal TMDS
[108954.492] (--) NVIDIA(GPU-0): DFP-1: 165.0 MHz maximum pixel clock
[108954.492] (--) NVIDIA(GPU-0): 
[108954.492] (--) NVIDIA(GPU-0): DFP-2: disconnected
[108954.492] (--) NVIDIA(GPU-0): DFP-2: Internal TMDS
[108954.492] (--) NVIDIA(GPU-0): DFP-2: 330.0 MHz maximum pixel clock
[108954.492] (--) NVIDIA(GPU-0): 
[108954.492] (WW) NVIDIA(GPU-0): Unrecognized ModeValidation token "AllowNon60hzmodesDFPModes";
[108954.492] (WW) NVIDIA(GPU-0):     ignoring.
[108954.492] (WW) NVIDIA(GPU-0): Unrecognized ModeValidation token "NoEDIDDFPMaxSizeCheck";
[108954.492] (WW) NVIDIA(GPU-0):     ignoring.
[108954.492] (WW) NVIDIA(GPU-0): Unrecognized ModeValidation token
[108954.492] (WW) NVIDIA(GPU-0):     "NoDFPNativeResolutionCheck"; ignoring.
[108954.492] (**) NVIDIA(GPU-0): Mode Validation Overrides for Samsung SyncMaster (DFP-0):
[108954.492] (**) NVIDIA(GPU-0):     NoMaxSizeCheck
[108954.492] (**) NVIDIA(GPU-0):     NoMaxPClkCheck
[108954.492] (**) NVIDIA(GPU-0):     NoEdidMaxPClkCheck
[108954.492] (**) NVIDIA(GPU-0):     NoHorizSyncCheck
[108954.492] (**) NVIDIA(GPU-0):     NoVertRefreshCheck
[108954.492] (**) NVIDIA(GPU-0):     AllowNonEdidModes
[108954.498] (==) NVIDIA(0): 
[108954.498] (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
[108954.498] (==) NVIDIA(0):     will be used as the requested mode.
[108954.498] (==) NVIDIA(0): 
[108954.498] (II) NVIDIA(0): Validated MetaModes:
[108954.498] (II) NVIDIA(0):     "DFP-0:nvidia-auto-select"
[108954.498] (II) NVIDIA(0): Virtual screen size determined to be 1280 x 1024
[108954.508] (--) NVIDIA(0): DPI set to (85, 86); computed from "UseEdidDpi" X config
[108954.508] (--) NVIDIA(0):     option
[108954.508] (--) Depth 24 pixmap format is 32 bpp
[108954.509] (II) NVIDIA: Using 12288.00 MB of virtual memory for indirect memory
[108954.509] (II) NVIDIA:     access.
[108954.511] (II) NVIDIA(0): ACPI: failed to connect to the ACPI event daemon; the daemon
[108954.511] (II) NVIDIA(0):     may not be running or the "AcpidSocketPath" X
[108954.511] (II) NVIDIA(0):     configuration option may not be set correctly.  When the
[108954.511] (II) NVIDIA(0):     ACPI event daemon is available, the NVIDIA X driver will
[108954.511] (II) NVIDIA(0):     try to use it to receive ACPI event notifications.  For
[108954.511] (II) NVIDIA(0):     details, please see the "ConnectToAcpid" and
[108954.511] (II) NVIDIA(0):     "AcpidSocketPath" X configuration options in Appendix B: X
[108954.511] (II) NVIDIA(0):     Config Options in the README.
[108954.524] (II) NVIDIA(0): Setting mode "DFP-0:nvidia-auto-select"
[108954.547] (==) NVIDIA(0): Disabling shared memory pixmaps
[108954.547] (==) NVIDIA(0): Backing store enabled
[108954.547] (==) NVIDIA(0): Silken mouse enabled
[108954.547] (==) NVIDIA(0): DPMS enabled
[108954.547] (WW) NVIDIA(0): Option "UseEvents" is not used
[108954.547] (WW) NVIDIA(0): Option "TwinView" is not used
[108954.547] (II) Loading sub module "dri2"
[108954.547] (II) LoadModule: "dri2"
[108954.547] (II) Module "dri2" already built-in
[108954.547] (II) NVIDIA(0): [DRI2] Setup complete
[108954.547] (II) NVIDIA(0): [DRI2]   VDPAU driver: nvidia
[108954.547] (--) RandR disabled
[108954.549] (II) AIGLX: Screen 0 is not DRI2 capable
[108954.549] (EE) AIGLX: reverting to software rendering
[108954.563] (II) IGLX: enabled GLX_MESA_copy_sub_buffer
[108954.563] (II) IGLX: Loaded and initialized swrast
[108954.563] (II) GLX: Initialized DRISWRAST GL provider for screen 0
[108954.679] (II) config/udev: Adding input device Power Button (/dev/input/event3)

...So I added the following to the Xorg configuration:

Section "Files"
    ModulePath   "@LIBDIR@/nvidia/xorg" 
    ModulePath   "@LIBDIR@/xorg/modules"
EndSection

But that way Xorg does not even start anymore:

[..]
[109132.406] (**) ModulePath set to "@LIBDIR@/nvidia/xorg,@LIBDIR@/xorg/modules"
[..]
[109132.406] (EE) /dev/dri/card0: failed to set DRM interface version 1.4: Permission denied
[109132.407] (--) PCI:*(0:0:2:0) 8086:0412:1043:8534 rev 6, Mem @ 0xf7400000/4194304, 0xd0000000/268435456, I/O @ 0x0000f000/64, BIOS @ 0x????????/131072
[109132.407] (--) PCI: (0:1:0:0) 10de:1380:1043:84bb rev 162, Mem @ 0xf6000000/16777216, 0xe0000000/268435456, 0xf0000000/33554432, I/O @ 0x0000e000/128, BIOS @ 0x????????/524288
[109132.408] (WW) Open ACPI failed (/var/run/acpid.socket) (No such file or directory)
[109132.408] (II) LoadModule: "glx"
[109132.408] (II) UnloadModule: "glx"
[109132.408] (II) Unloading glx
[109132.408] (EE) Failed to load module "glx" (invalid argument(s) to LoadModule(), 1)
[109132.408] (II) LoadModule: "nvidia"
[109132.408] (II) UnloadModule: "nvidia"
[109132.408] (II) Unloading nvidia
[109132.408] (EE) Failed to load module "nvidia" (invalid argument(s) to LoadModule(), 1)
[109132.408] (EE) No drivers available.
[109132.408] (EE) 
Fatal server error:
[109132.408] (EE) no screens found(EE) 
[..]

Indeed, I only have /dev/dri/card0 (which I suppose is the working intel i915).

It is not crucial to me, since primusrun works well, but I'd like to understand what's happening.
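
For reference, @LIBDIR@ in that snippet is a placeholder coming from the template shipped with the nvidia packages; copied literally it points nowhere, which is most likely why LoadModule() fails above. With the placeholder expanded by hand (assuming Arch's /usr/lib and this driver generation's nvidia-utils module layout), the section would look like:

    Section "Files"
        ModulePath   "/usr/lib/nvidia/xorg"
        ModulePath   "/usr/lib/xorg/modules"
    EndSection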

Last edited by kokoko3k (2017-09-14 07:43:45)



#2 2017-09-13 16:48:37

R00KIE
Forum Fellow
From: Between a computer and a chair
Registered: 2008-09-14
Posts: 4,734

Re: [SOLVED] Use a real X nvidia display after using the same gpu in a VM.

Doesn't the nvidia driver try to detect when it's being used in a virtualized environment and then refuse to work? Or maybe that is no longer the case, or it is a Windows-only problem [1].

[1] https://www.redhat.com/archives/vfio-us … 00099.html



#3 2017-09-14 07:41:06

kokoko3k
Member
Registered: 2008-11-14
Posts: 2,390

Re: [SOLVED] Use a real X nvidia display after using the same gpu in a VM.

Yes, the nvidia driver fails, but you can hide KVM from the nvidia driver by using:

    <kvm>
      <hidden state='on'/>
    </kvm>

in the libvirt QEMU domain XML file, but that is not my issue.
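
For completeness, that element belongs inside <features> in the libvirt domain definition; a minimal sketch of the relevant part:

    <domain type='kvm'>
      ...
      <features>
        <kvm>
          <hidden state='on'/>
        </kvm>
      </features>
      ...
    </domain>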

Anyway, I've solved it; maybe I was not that clear, so I'll explain.
My Windows 7 VM works well with GPU passthrough, but to do that I enabled both the integrated and the discrete GPU in the "UEFI BIOS" (what's the correct name nowadays?), and set the primary card to be the integrated one.
Next, the system booted with the vfio-pci module configured to "stub" the nvidia device.
As a result, the nvidia modules were not loaded.

Still, primusrun on the Linux host works and I can use my Windows 7 VM AND primusrun (not at the same time, of course); the way I did that was simply to "rmmod nvidia nvidia-modeset" and "modprobe vfio-pci" before starting the VM.
After stopping the VM, I needed to remove vfio-pci to make the nvidia GPU device available to the host OS again.
At that point, primusrun worked, but configuring a second X server display did not.

What was missing was the device node /dev/dri/card1, which the nvidia driver needs in order to drive a real display, but which is not needed for the "virtual" Xorg display used by primusrun.
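
A quick way to check is to list the DRM device nodes (the exact card numbering is an assumption; it depends on probe order):

    ls -l /dev/dri/
    # with only the intel i915 bound:  card0 (plus its renderD* node)
    # after "modprobe nvidia-drm":     card0  card1  and the matching renderD* nodes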

Long story short: after stopping the VM, to use a real nvidia Xorg server I also needed to

"modprobe nvidia-drm"; that module creates the /dev/dri/card1 device node,

allowing me to use a real X display. The whole cycle is sketched below.
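
A rough sketch of the whole cycle as described above (module list and ordering are assumptions based on this setup; adjust as needed):

    # Before starting the VM: free the GPU and hand it over to vfio-pci
    rmmod nvidia-drm nvidia-modeset nvidia    # nvidia-drm only if it is loaded
    modprobe vfio-pci

    # ... run the Windows VM with GPU passthrough ...

    # After stopping the VM: give the GPU back to the host
    rmmod vfio-pci
    modprobe nvidia-drm    # pulls in nvidia and nvidia-modeset, creates /dev/dri/card1

    # primusrun works again, and a real nvidia X server can now be started too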


-EDIT-
Completely changed the topic title to make it clearer, now that I found the issue.

Last edited by kokoko3k (2017-09-14 07:44:15)

