
#1 2017-05-14 19:11:18

wnayes
Member
Registered: 2014-01-17
Posts: 3

Multihead on Dell XPS 15 L502X with NVIDIA Optimus

In the days of xorg-server 1.17 - 1.18, I was able to output to two external monitors from a Dell XPS 15 L502X. This has always been tricky because the laptop has an NVIDIA Optimus setup, and I'm pretty sure one of the external outputs is wired to the NVIDIA chip.

Around the xorg-server 1.19 release, or generally over the past few months, this has stopped working.

The setup I currently have, and the one that used to work, uses nvidia, xf86-video-intel, and bumblebee. I had originally documented this on the wiki here. The overall strategy was to:

  • Start a pretty vanilla configuration of X, display :0

  • Run optirun true to start a second X server with bumblebee, display :8

  • Run intel-virtual-output, which would create a VIRTUAL2 entry in xrandr. From there, I could display using both HDMI1 and VIRTUAL2 (rough command sequence below)
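
Putting those steps together, the sequence was roughly the following (a sketch from memory; the output names HDMI1/VIRTUAL2 and the placement are specific to my setup):

# :0 already running with a fairly vanilla Intel configuration
optirun true            # bumblebee starts the second (nvidia) X server on :8
intel-virtual-output    # bridges the nvidia-wired outputs into :0 as VIRTUAL*
xrandr                  # VIRTUAL2 should now list the external monitor's modes
xrandr --output HDMI1 --auto --output VIRTUAL2 --auto --right-of HDMI1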

The problem I see now is that running intel-virtual-output causes the bumblebee X server to crash. In the Xorg.8.log, the following entries occur right after intel-virtual-output is called:

[   279.816] (--) NVIDIA(GPU-0): CRT-0: 400.0 MHz maximum pixel clock
[   279.816] (--) NVIDIA(GPU-0): 
[   279.816] (--) NVIDIA(GPU-0): DFP-0: connected
[   279.816] (--) NVIDIA(GPU-0): DFP-0: Internal TMDS
[   279.816] (--) NVIDIA(GPU-0): DFP-0: 165.0 MHz maximum pixel clock
[   279.816] (--) NVIDIA(GPU-0): 
[   280.092] (II) NVIDIA(0): Setting mode "VGA-0: nvidia-auto-select @1024x768 +0+0 {ViewPortIn=1024x768, ViewPortOut=1024x768+0+0}, HDMI-0: nvidia-auto-select @1024x768 +1024+0 {ViewPortIn=1024x768, ViewPortOut=1024x768+0+0}"
[   280.285] (II) NVIDIA(0): Setting mode "HDMI-0: nvidia-auto-select @1024x768 +1024+0 {ViewPortIn=1024x768, ViewPortOut=1024x768+0+0}"
[   280.436] (II) NVIDIA(0): Setting mode "HDMI-0: nvidia-auto-select @1024x768 +1024+0 {ViewPortIn=1024x768, ViewPortOut=1024x768+0+0}"
[   280.471] (EE) 
[   280.471] (EE) Backtrace:
[   280.513] (EE) 0: /usr/lib/xorg-server/Xorg (OsLookupColor+0x139) [0x59c209]
[   280.513] (EE) 1: /usr/lib/libpthread.so.0 (funlockfile+0x50) [0x7feba43b502f]
[   280.520] (EE) 2: /usr/lib/xorg-server/Xorg (xf86nameCompare+0x69) [0x4a68b9]
[   280.521] (EE) 3: /usr/lib/xorg/modules/drivers/nvidia_drv.so (nvidiaAddDrawableHandler+0x2020c) [0x7feb9e36daec]
[   280.521] (EE) 4: /usr/lib/xorg/modules/drivers/nvidia_drv.so (nvidiaAddDrawableHandler+0x176d9) [0x7feb9e35bb09]
[   280.521] (EE) 5: /usr/lib/xorg/modules/drivers/nvidia_drv.so (nvidiaAddDrawableHandler+0x19f0b) [0x7feb9e36156b]
[   280.521] (EE) 6: /usr/lib/xorg/modules/drivers/nvidia_drv.so (nvidiaAddDrawableHandler+0x1b6f0) [0x7feb9e364130]
[   280.521] (EE) 7: /usr/lib/xorg/modules/drivers/nvidia_drv.so (nvidiaAddDrawableHandler+0x1be59) [0x7feb9e364fe9]
[   280.521] (EE) 8: /usr/lib/libnvidia-glcore.so.381.22 (nvidiaAddDrawableHandler+0x56d063) [0x7feb9ee07846]
[   280.521] (EE) 
[   280.521] (EE) Segmentation fault at address 0x0
[   280.521] (EE) 
Fatal server error:
[   280.521] (EE) Caught signal 11 (Segmentation fault). Server aborting

I've been trying all sorts of different settings without luck (UseDisplayDevices=none, VirtualHeads=1, AllowEmptyInitialConfiguration).

Does anyone have any suggestions? Or, if this seems to be a bug, does the stack trace suggest a particular package that is at fault? The segfault seems to happen in the nvidia driver, but intel-virtual-output is clearly what triggers it.


#2 2017-05-14 22:01:56

Nando_210
Member
From: TX, USA
Registered: 2015-05-04
Posts: 29

Re: Multihead on Dell XPS 15 L502X with NVIDIA Optimus

What happens if you run:

optirun true && intel-virtual-output && xrandr --output LVDS1 --auto --pos 0x0  --output VIRTUAL[X] --mode VIRTUAL[X].643-1920x1080 --$DIRECTION-of LVDS1

Modify the command to match your own outputs and layout.
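
For instance, if the virtual output comes up as VIRTUAL1 and you want it to the right of the panel, that would be something like this (output names and the mode string are placeholders, take the real ones from xrandr):

optirun true && intel-virtual-output && \
    xrandr --output LVDS1 --auto --pos 0x0 \
           --output VIRTUAL1 --mode VIRTUAL1.643-1920x1080 --right-of LVDS1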

Also, for comparison, here is my bumblebee.conf:

# Configuration file for Bumblebee. Values should **not** be put between quotes

## Server options. Any change made in this section will need a server restart
# to take effect.
[bumblebeed]
# The secondary Xorg server DISPLAY number
VirtualDisplay=:8
# Should the unused Xorg server be kept running? Set this to true if waiting
# for X to be ready is too long and you don't need power management at all.
KeepUnusedXServer=false
# The name of the Bumblebee server group (GID name)
ServerGroup=bumblebee
# Card power state at exit. Set to false if the card should be ON when Bumblebee
# server exits.
TurnCardOffAtExit=false
# The default behavior of '-f' option on optirun. If set to "true", '-f' will
# be ignored.
NoEcoModeOverride=false
# The Driver used by Bumblebee server. If this value is not set (or empty),
# auto-detection is performed. The available drivers are nvidia and nouveau
# (See also the driver-specific sections below)
Driver=nvidia
# Directory with a dummy config file to pass as a -configdir to secondary X
# XorgConfDir=/etc/bumblebee/xorg.conf.d

## Client options. Will take effect on the next optirun executed.
[optirun]
# Acceleration/ rendering bridge, possible values are auto, virtualgl and
# primus.
Bridge=auto
# The method used for VirtualGL to transport frames between X servers.
# Possible values are proxy, jpeg, rgb, xv and yuv.
VGLTransport=proxy
# Read back method
VGL_READBACK=pbo
# List of paths which are searched for the primus libGL.so.1 when using
# the primus bridge
PrimusLibraryPath=/usr/lib/primus:/usr/lib32/primus
# Should the program run under optirun even if Bumblebee server or nvidia card
# is not available?
AllowFallbackToIGC=false

# Driver-specific settings are grouped under [driver-NAME]. The sections are
# parsed if the Driver setting in [bumblebeed] is set to NAME (or if auto-
# detection resolves to NAME).
# PMMethod: method to use for saving power by disabling the nvidia card, valid
# values are: auto - automatically detect which PM method to use
#         bbswitch - new in BB 3, recommended if available
#       switcheroo - vga_switcheroo method, use at your own risk
#             none - disable PM completely
# https://github.com/Bumblebee-Project/Bumblebee/wiki/Comparison-of-PM-methods

## Section with nvidia driver specific options, only parsed if Driver=nvidia
[driver-nvidia]
# Module name to load, defaults to Driver if empty or unset
KernelDriver=nvidia
PMMethod=bbswitch
# colon-separated path to the nvidia libraries
LibraryPath=/usr/lib/nvidia:/usr/lib32/nvidia:/usr/lib:/usr/lib32
# comma-separated path of the directory containing nvidia_drv.so and the
# default Xorg modules path
XorgModulePath=/usr/lib/nvidia/xorg/,/usr/lib/xorg/modules
XorgConfFile=/etc/bumblebee/xorg.conf.nvidia

## Section with nouveau driver specific options, only parsed if Driver=nouveau
[driver-nouveau]
KernelDriver=nouveau
PMMethod=none
XorgConfFile=/etc/bumblebee/xorg.conf.nouveau
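
For completeness, the xorg.conf.nvidia that XorgConfFile points at is typically close to the packaged default, something like this (a sketch, not necessarily my exact file; the commented-out UseDisplayDevice line and AllowEmptyInitialConfiguration are the options usually touched when using intel-virtual-output):

Section "ServerLayout"
    Identifier  "Layout0"
    Option      "AutoAddDevices" "false"
    Option      "AutoAddGPU" "false"
EndSection

Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    VendorName  "NVIDIA Corporation"
    Option      "ProbeAllGpus" "false"
    Option      "NoLogo" "true"
#   Option      "UseEDID" "false"
#   Option      "UseDisplayDevice" "none"
    Option      "AllowEmptyInitialConfiguration" "true"
EndSection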

Last edited by Nando_210 (2017-05-14 22:06:54)



#3 2017-05-16 02:15:43

wnayes
Member
Registered: 2014-01-17
Posts: 3

Re: Multihead on Dell XPS 15 L502X with NVIDIA Optimus

I tried a couple of the configuration options above that differ from mine (VGL_READBACK, PMMethod), but the problem is the same. Since intel-virtual-output causes the second X server (spawned from optirun) to crash, there are no virtual outputs available to configure.

I might give nouveau a try, since there's a chance it won't segfault... or maybe I just need a new computer. :)

Edit:

I was able to get multihead working, but it is somewhat convoluted. I installed and configured X to use nouveau. I then start X and run Bumblebee (optirun true); at this point X crashes, but Bumblebee does not. Then I start X again without a configuration file, and it now shows two modesetting providers (xrandr --listproviders), so I can configure both external monitors with xrandr.
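
Roughly, the xrandr side of it is the usual two-provider sequence; treat this as a sketch, since the provider numbers and output names here are just examples of what xrandr might report:

xrandr --listproviders                  # should list two "modesetting" providers
xrandr --setprovideroutputsource 1 0    # let provider 1's outputs display provider 0's screen
xrandr --output HDMI-1 --auto --pos 0x0
xrandr --output DP-1 --auto --right-of HDMI-1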

Last edited by wnayes (2017-09-11 23:32:06)

