nvidia-xrun, hybrid graphics mode failure with eGPU
Hello, I am trying to set up hybrid mode. I have a NUC with an integrated Intel graphics chip, and I connect an Akitio Node with an Nvidia GTX 1080 card to it via Thunderbolt. The NUC seems to be primus.
Just for background, I tried different configurations without success. With optimus-manager, I managed to boot with the display on the external eGPU, but when Xorg launches, it goes to the internal Intel card and even dies.
I have a couple of custom scripts to enable/disable Nvidia (blacklist nouveau, nvidia, etc., and load nvidia on request). When nvidia is loaded, it is detected by nvidia-smi. lxrandr still sees only the Intel card, but that is to be expected.
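For illustration, such an enable/disable toggle could be sketched roughly like this (a hypothetical sketch, not the poster's actual scripts; module names and the DRY_RUN mechanism are assumptions):

```shell
#!/bin/sh
# toggle_nvidia on|off -- load or unload the proprietary nvidia stack.
# With DRY_RUN=1 it only prints the commands it would run (no root needed).
toggle_nvidia() {
    run() { if [ "${DRY_RUN:-0}" = 1 ]; then echo "$@"; else "$@"; fi; }
    case "$1" in
        on)
            # nouveau must not be bound to the eGPU before nvidia loads
            run modprobe -r nouveau
            run modprobe nvidia nvidia_modeset nvidia_drm
            ;;
        off)
            # unload in reverse dependency order
            run modprobe -r nvidia_drm nvidia_modeset nvidia
            ;;
        *)
            echo "usage: toggle_nvidia on|off" >&2
            return 1
            ;;
    esac
}

DRY_RUN=1 toggle_nvidia on
# prints:
# modprobe -r nouveau
# modprobe nvidia nvidia_modeset nvidia_drm
```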
Now, I thought my best option was to make nvidia-xrun work. I actually solved many issues, but the result is that, instead of starting the secondary Xorg server on the external GPU, it still starts it on the internal Intel GPU. So I don't know how to tell it to use the external monitor. In the log it even detects the external Philips monitor, but for some reason it uses the Nvidia driver with the internal Intel chip and outputs to the monitor connected to the Intel card.
My nvidia-xorg.conf:
Section "Files"
    ModulePath "/usr/lib/nvidia"
    ModulePath "/usr/lib32/nvidia"
    ModulePath "/usr/lib32/nvidia/xorg/modules"
    ModulePath "/usr/lib32/xorg/modules"
    ModulePath "/usr/lib64/nvidia/xorg/modules"
    ModulePath "/usr/lib64/nvidia/xorg"
    ModulePath "/usr/lib64/xorg/modules"
EndSection

Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Inactive "intel"
EndSection

Section "Monitor"
    Identifier "nvidia"
    VendorName "Monitor Name"
    ModelName "Monitor Model"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    BusID "PCI:7:0:0"
    Option "Coolbits" "28"
    Option "AllowExternalGpus"
    Option "ConnectedMonitor" "DFP-1"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
    Option "AllowEmptyInitialConfiguration"
    Monitor "nvidia"
EndSection

Section "Monitor"
    Identifier "intel"
    VendorName "Monitor Name"
    ModelName "Monitor Model"
EndSection

Section "Device"
    Identifier "intel"
    Driver "modesetting"
    BusID "PCI:0:2:0"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
    Monitor "intel"
EndSection
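As an aside on the BusID lines: xorg.conf expects decimal bus/device/function numbers ("PCI:7:0:0"), while lspci prints them in hex ("07:00.0"). A small hypothetical helper to convert between the two (assumes the slot is given without the PCI domain prefix):

```shell
#!/bin/sh
# Convert an lspci slot like "07:00.0" (hex) into the decimal
# "PCI:bus:dev:fn" form that xorg.conf's BusID expects.
to_busid() {
    bus=${1%%:*}       # "07"
    rest=${1#*:}       # "00.0"
    dev=${rest%%.*}    # "00"
    fn=${rest#*.}      # "0"
    # printf %d evaluates 0x-prefixed arguments as hex constants
    printf 'PCI:%d:%d:%d\n' "0x$bus" "0x$dev" "0x$fn"
}

to_busid 07:00.0   # prints: PCI:7:0:0
to_busid 00:02.0   # prints: PCI:0:2:0
```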
After I switch to another console (Ctrl-Alt-F2) and launch the command nvidia-xrun xterm, it opens an xterm on the same monitor, not on the external one. Here is the log.
It doesn't seem possible to paste the entire log here, so I'll just show some snippets.
1766.985 (++) Using config file: "/etc/X11/nvidia-xorg.conf"
1766.986 (==) ServerLayout "layout"
1766.986 (**) |-->Screen "nvidia" (0)
1766.986 (**) | |-->Monitor "nvidia"
1766.986 (**) | |-->Device "nvidia"
1766.986 (**) | |-->GPUDevice "intel"
1766.986 (**) |-->Inactive Device "intel"
1766.986 (==) Automatically adding devices
1766.986 (==) Automatically enabling devices
1766.986 (==) Automatically adding GPU devices
1766.986 (==) Automatically binding GPU devices
HERE IT MESSES UP: IT TAKES THE INTERNAL INTEL CARD AND USES THE NVIDIA DRIVER ON IT:
1767.111 (II) modeset(G0): glamor X acceleration enabled on Mesa DRI Intel(R) HD Graphics (Coffeelake 3x8 GT3)
1767.111 (II) modeset(G0): glamor initialized
1767.156 (II) modeset(G0): Output DP-1-1 using monitor section nvidia
1767.156 (II) modeset(G0): Output DP-1-2 has no monitor section
1767.183 (II) modeset(G0): Output HDMI-1-1 has no monitor section
1767.228 (II) modeset(G0): EDID for output DP-1-1
1767.335 (II) NVIDIA: Using 24576.00 MB of virtual memory for indirect memory
1767.335 (II) NVIDIA: access.
1767.382 (II) NVIDIA(0): Setting mode "DFP-1:nvidia-auto-select"
1767.427 (==) NVIDIA(0): Disabling shared memory pixmaps
1767.427 (==) NVIDIA(0): Backing store enabled
1767.427 (==) NVIDIA(0): Silken mouse disabled
1767.427 (==) NVIDIA(0): DPMS enabled
1767.427 (II) Loading sub module "dri2"
1767.427 (II) LoadModule: "dri2"
1767.427 (II) Module "dri2" already built-in
1767.427 (II) NVIDIA(0): DRI2 Setup complete
1767.427 (II) NVIDIA(0): DRI2 VDPAU driver: nvidia
SIC!
1767.428 (II) GLX: Another vendor is already registered for screen 0
1767.428 (II) Initializing extension XFree86-VidModeExtension
1767.428 (II) Initializing extension XFree86-DGA
1767.428 (II) Initializing extension XFree86-DRI
1767.428 (II) Initializing extension DRI2
1767.428 (II) Initializing extension NV-GLX
1767.428 (II) Initializing extension NV-CONTROL
1767.428 (II) Initializing extension XINERAMA
1767.435 (II) modeset(G0): Damage tracking initialized
1767.494 (II) config/udev: Adding input device Power Button (/dev/input/event2)
1767.494 (**) Power Button: Applying InputClass "evdev keyboard catchall"
1767.494 (II) LoadModule: "evdev"
1767.494 (II) Loading /usr/lib64/xorg/modules/input/evdev_drv.so
1767.494 (II) Module evdev: vendor="X.Org Foundation"
1767.494 compiled for 1.20.0, module version = 2.10.6
1767.494 Module class: X.Org XInput Driver
1767.494 ABI class: X.Org XInput driver, version 24.1
1767.494 (II) Using input driver 'evdev' for 'Power Button'
1767.564 (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=3 (/dev/input/event18)
1767.564 (II) No input driver specified, ignoring this device.
1767.564 (II) This device may have been added with another device file.
1767.565 (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=7 (/dev/input/event19)
1767.565 (II) No input driver specified, ignoring this device.
1767.565 (II) This device may have been added with another device file.
1767.565 (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=8 (/dev/input/event20)
1767.565 (II) No input driver specified, ignoring this device.
1767.565 (II) This device may have been added with another device file.
1767.566 (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=9 (/dev/input/event21)
1767.566 (II) No input driver specified, ignoring this device.
1767.566 (II) This device may have been added with another device file.
1767.566 (II) config/udev: Adding input device HDA Intel PCH HDMI/DP,pcm=3 (/dev/input/event13)
1767.566 (II) No input driver specified, ignoring this device.
1767.566 (II) This device may have been added with another device file.
1767.566 (II) config/udev: Adding input device HDA Intel PCH HDMI/DP,pcm=7 (/dev/input/event14)
1767.566 (II) No input driver specified, ignoring this device.
1767.566 (II) This device may have been added with another device file.
1767.566 (II) config/udev: Adding input device HDA Intel PCH HDMI/DP,pcm=8 (/dev/input/event15)
1767.566 (II) No input driver specified, ignoring this device.
1767.566 (II) This device may have been added with another device file.
1767.566 (II) config/udev: Adding input device HDA Intel PCH HDMI/DP,pcm=9 (/dev/input/event16)
1767.566 (II) No input driver specified, ignoring this device.
1767.566 (II) This device may have been added with another device file.
1767.567 (II) config/udev: Adding input device HDA Intel PCH HDMI/DP,pcm=10 (/dev/input/event17)
1767.567 (II) No input driver specified, ignoring this device.
1767.567 (II) This device may have been added with another device file.
1767.567 (II) config/udev: Adding input device HDA Digital PCBeep (/dev/input/event9)
1767.567 (II) No input driver specified, ignoring this device.
1767.567 (II) This device may have been added with another device file.
1767.567 (II) config/udev: Adding input device HDA Intel PCH Mic (/dev/input/event10)
1767.567 (II) No input driver specified, ignoring this device.
1767.567 (II) This device may have been added with another device file.
1767.567 (II) config/udev: Adding input device HDA Intel PCH Front Headphone (/dev/input/event12)
1767.567 (II) No input driver specified, ignoring this device.
1767.567 (II) This device may have been added with another device file.
1767.567 (II) config/udev: Adding input device PC Speaker (/dev/input/event3)
1767.567 (II) No input driver specified, ignoring this device.
1767.567 (II) This device may have been added with another device file.
1767.574 (--) NVIDIA(GPU-0): DFP-0: disconnected
1767.574 (--) NVIDIA(GPU-0): DFP-0: Internal TMDS
1767.574 (--) NVIDIA(GPU-0): DFP-0: 330.0 MHz maximum pixel clock
1767.574 (--) NVIDIA(GPU-0):
HERE IS THE EXTERNAL MONITOR IT SHOULD USE:
1767.609 (--) NVIDIA(GPU-0): Philips PHL 328P6V (DFP-1): connected
1767.609 (--) NVIDIA(GPU-0): Philips PHL 328P6V (DFP-1): Internal TMDS
1767.609 (--) NVIDIA(GPU-0): Philips PHL 328P6V (DFP-1): 600.0 MHz maximum pixel clock
1767.609 (--) NVIDIA(GPU-0):
1767.609 (--) NVIDIA(GPU-0): DFP-2: disconnected
1767.609 (--) NVIDIA(GPU-0): DFP-2: Internal DisplayPort
1767.609 (--) NVIDIA(GPU-0): DFP-2: 1440.0 MHz maximum pixel clock
1767.609 (--) NVIDIA(GPU-0):
1767.609 (--) NVIDIA(GPU-0): DFP-3: disconnected
1767.609 (--) NVIDIA(GPU-0): DFP-3: Internal TMDS
1767.609 (--) NVIDIA(GPU-0): DFP-3: 165.0 MHz maximum pixel clock
1767.609 (--) NVIDIA(GPU-0):
1767.609 (--) NVIDIA(GPU-0): DFP-4: disconnected
1767.609 (--) NVIDIA(GPU-0): DFP-4: Internal DisplayPort
1767.609 (--) NVIDIA(GPU-0): DFP-4: 1440.0 MHz maximum pixel clock
1767.609 (--) NVIDIA(GPU-0):
1767.609 (--) NVIDIA(GPU-0): DFP-5: disconnected
1767.609 (--) NVIDIA(GPU-0): DFP-5: Internal TMDS
1767.609 (--) NVIDIA(GPU-0): DFP-5: 165.0 MHz maximum pixel clock
1767.609 (--) NVIDIA(GPU-0):
1767.610 (--) NVIDIA(GPU-0): DFP-6: disconnected
1767.610 (--) NVIDIA(GPU-0): DFP-6: Internal DisplayPort
1767.610 (--) NVIDIA(GPU-0): DFP-6: 1440.0 MHz maximum pixel clock
1767.610 (--) NVIDIA(GPU-0):
1767.610 (--) NVIDIA(GPU-0): DFP-7: disconnected
1767.610 (--) NVIDIA(GPU-0): DFP-7: Internal TMDS
1767.610 (--) NVIDIA(GPU-0): DFP-7: 165.0 MHz maximum pixel clock
1767.610 (--) NVIDIA(GPU-0):
Can somebody point me in the right direction?
Thanks!
Last edited by aquilarubra (2019-12-18 05:30:23)
That last excerpt looks like a working setup. How do you determine that "it still goes over the intel GPU"? The entire point of all Optimus-based solutions is to keep the Intel GPU active while the dedicated GPU does the heavy render lifting. What's the output of
glxinfo -B
? I'm not entirely familiar with how the external-GPU shtick is supposed to work, but have you tried not using any Optimus-specific configuration? Or does that only come into play on the external monitor?
Here is the output. I determine that it goes to the Intel GPU because the xterm I launch opens on the same monitor I am already using, which is connected to the Intel GPU. The external GPU's monitor stays black.
name of display: :1
display: :1 screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
Vendor: Intel Open Source Technology Center (0x8086)
Device: Mesa DRI Intel(R) HD Graphics (Coffeelake 3x8 GT3) (0x3ea5)
Version: 19.2.2
Accelerated: yes
Video memory: 3072MB
Unified memory: yes
Preferred profile: core (0x1)
Max core profile version: 4.5
Max compat profile version: 3.0
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.2
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics (Coffeelake 3x8 GT3)
OpenGL core profile version string: 4.5 (Core Profile) Mesa 19.2.2
OpenGL core profile shading language version string: 4.50
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 3.0 Mesa 19.2.2
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL ES profile version string: OpenGL ES 3.2 Mesa 19.2.2
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
display: :1 screen: 1
direct rendering: Yes
Memory info (GL_NVX_gpu_memory_info):
Dedicated video memory: 8192 MB
Total available memory: 8192 MB
Currently available dedicated video memory: 8077 MB
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 1080/PCIe/SSE2
OpenGL core profile version string: 4.6.0 NVIDIA 435.21
OpenGL core profile shading language version string: 4.60 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 4.6.0 NVIDIA 435.21
OpenGL shading language version string: 4.60 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 435.21
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
Actually, a second Xorg server is launched on the same Intel GPU, instead of on the external Nvidia GPU.
We can see from that excerpt that you have a distinct X screen. I very much doubt that any of the Optimus solutions work here like they do on actual Optimus systems, and they are likely to be unnecessary on this setup.
I suggest you start from a more baseline config and ignore anything Optimus-related, as you have a different setup here; look into this wiki and this recent thread on a similar requirement.
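For reference, a more baseline starting point might look something like the sketch below (an assumption, not the poster's actual file; it keeps the eGPU's BusID from the config above and drives only the Nvidia card):

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    BusID      "PCI:7:0:0"
    Option     "AllowExternalGpus" "true"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device     "nvidia"
    Option     "AllowEmptyInitialConfiguration" "true"
EndSection
```

The idea is to let the server auto-detect outputs on the eGPU first, then add layout and monitor sections only if that works.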
UPDATE:
With the following nvidia-xorg.conf, it finally worked.
Section "Files"
    ModulePath "/usr/lib/nvidia"
    ModulePath "/usr/lib32/nvidia"
    ModulePath "/usr/lib32/nvidia/xorg/modules"
    ModulePath "/usr/lib32/xorg/modules"
    ModulePath "/usr/lib64/nvidia/xorg/modules"
    ModulePath "/usr/lib64/nvidia/xorg"
    ModulePath "/usr/lib64/xorg/modules"
EndSection

Section "ServerLayout"
    Identifier "layout"
    Screen 1 "intel"
    Screen 0 "nvidia"
    Inactive "intel"
EndSection

Section "Monitor"
    Identifier "nvidia"
    VendorName "Monitor Name"
    ModelName "Monitor Model"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    BusID "PCI:7:0:0"
    Option "Coolbits" "28"
    Option "AllowExternalGpus"
    Option "ConnectedMonitor" "DFP-1"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
    Option "AllowEmptyInitialConfiguration"
    Monitor "nvidia"
EndSection

Section "Monitor"
    Identifier "intel"
    VendorName "Monitor Name"
    ModelName "Monitor Model"
EndSection

Section "Device"
    Identifier "intel"
    Driver "modesetting"
    BusID "PCI:0:2:0"
    Option "Monitor-eDP-1-1" "intel"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
    Monitor "intel"
EndSection
Basically, I just swapped the screens, since it uses Screen 0 by default:
Screen 1 "intel"
Screen 0 "nvidia"
Now, I know that one should be able to run one Xorg server on the Intel GPU and another Xorg server on the Nvidia GPU. However, when I launch the nvidia-xrun command on the Nvidia GPU, the Intel card shows a static image, while everything comes up on the Nvidia GPU, and I can work on the external GPU.
However, if I switch consoles and go back to the first Xorg server with Ctrl-Alt-F1, the external GPU goes black and is disconnected. If I go to the console of the second Xorg server with Ctrl-Alt-F2, then after a little while the image comes back on the external monitor.
Any idea how to make them both live together?