
#1 2017-05-28 10:50:34

SmallAndSimple
Member
Registered: 2015-11-25
Posts: 50

subtle performance issues using bumblebee and intel-virtual-output

I have an HP ZBook 15 with both an Intel integrated graphics card, connected to the VGA port and the laptop screen, and an Nvidia card, connected to two DisplayPorts.

I run bumblebee and intel-virtual-output to use one of the DisplayPorts of my laptop, and it is working, strictly speaking. I have two main issues:

1. On the monitor connected to the DisplayPort, the mouse lags slightly. I tried to fix this by changing /etc/bumblebee/bumblebee.conf:

from

VGLTransport=proxy

to

VGLTransport=rgb
VGL_READBACK=pbo

But this did not change anything.
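
As far as I understand, VGL_READBACK is an environment variable read by VirtualGL itself rather than a bumblebee.conf key, so it would presumably have to be set per invocation instead. A rough, untested sketch (glxgears just as a test client):

# transport selected via optirun's -c/--vgl-compress flag,
# readback method via the VirtualGL environment variable
VGL_READBACK=pbo optirun -c rgb glxgears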

2. The second Xorg server, the one driving the DisplayPort, constantly consumes 20%-25% CPU. This hogging prevents my CPU from dropping its clock speed and thus causes extra heat and power consumption.
I am afraid at this point that this is simply how my setup behaves, but maybe you guys know how to fix this.
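
For anyone who wants to reproduce the measurement, something like this should show the secondary server's CPU usage (a sketch; the server runs on display :8 as configured below):

# find the secondary X server (display :8) and watch its CPU usage
pgrep -af 'X.*:8'
top -p "$(pgrep -f 'X.*:8' | head -n 1)"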

For reference:
xorg.conf

Section "ServerLayout"
    Identifier "Layout0"
EndSection

Section "Device"
    Identifier "Device1"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    Option "NoLogo" "true"
    Option "ConnectedMonitor" "DFP"
EndSection

# For the configuration with bumblebee installed.
# The section that follows comes from Arch Linux; adapt it for your distro if necessary.
Section "Files"
  ModulePath   "/usr/lib/nvidia/xorg/"
  ModulePath   "/usr/lib/xorg/modules/"
EndSection

/etc/bumblebee/bumblebee.conf

# Configuration file for Bumblebee. Values should **not** be put between quotes

## Server options. Any change made in this section will need a server restart
# to take effect.
[bumblebeed]
# The secondary Xorg server DISPLAY number
VirtualDisplay=:8
# Should the unused Xorg server be kept running? Set this to true if waiting
# for X to be ready takes too long and you don't need power management at all.
KeepUnusedXServer=true
# The name of the Bumblebee server group (GID name)
ServerGroup=bumblebee
# Card power state at exit. Set to false if the card should be ON when Bumblebee
# server exits.
TurnCardOffAtExit=true
# The default behavior of '-f' option on optirun. If set to "true", '-f' will
# be ignored.
NoEcoModeOverride=false
# The Driver used by Bumblebee server. If this value is not set (or empty),
# auto-detection is performed. The available drivers are nvidia and nouveau
# (See also the driver-specific sections below)
Driver=nvidia
# Directory with a dummy config file to pass as a -configdir to secondary X
XorgConfDir=/etc/bumblebee/xorg.conf.d

## Client options. Will take effect on the next optirun executed.
[optirun]
# Acceleration/ rendering bridge, possible values are auto, virtualgl and
# primus.
Bridge=auto
# The method used for VirtualGL to transport frames between X servers.
# Possible values are proxy, jpeg, rgb, xv and yuv.
VGLTransport=proxy
# List of paths which are searched for the primus libGL.so.1 when using
# the primus bridge
PrimusLibraryPath=/usr/lib/primus:/usr/lib32/primus
# Should the program run under optirun even if Bumblebee server or nvidia card
# is not available?
AllowFallbackToIGC=false
# Specifies the method used by VirtualGL to read back the 3D pixels from the 3D graphics hardware
VGL_READBACK=pbo


# Driver-specific settings are grouped under [driver-NAME]. The sections are
# parsed if the Driver setting in [bumblebeed] is set to NAME (or if auto-
# detection resolves to NAME).
# PMMethod: method to use for saving power by disabling the nvidia card, valid
# values are: auto - automatically detect which PM method to use
#         bbswitch - new in BB 3, recommended if available
#       switcheroo - vga_switcheroo method, use at your own risk
#             none - disable PM completely
# https://github.com/Bumblebee-Project/Bumblebee/wiki/Comparison-of-PM-methods

## Section with nvidia driver specific options, only parsed if Driver=nvidia
[driver-nvidia]
# Module name to load, defaults to Driver if empty or unset
KernelDriver=nvidia
PMMethod=none
# colon-separated path to the nvidia libraries
LibraryPath=/usr/lib/nvidia:/usr/lib32/nvidia:/usr/lib:/usr/lib32:/usr/lib/nvidia/xorg
# comma-separated path of the directory containing nvidia_drv.so and the
# default Xorg modules path
XorgModulePath=/usr/lib/nvidia/xorg,/usr/lib/xorg/modules,/usr/lib/xorg/modules/drivers
XorgConfFile=/etc/bumblebee/xorg.conf.nvidia

## Section with nouveau driver specific options, only parsed if Driver=nouveau
[driver-nouveau]
KernelDriver=nouveau
PMMethod=none
XorgConfFile=/etc/bumblebee/xorg.conf.nouveau

/etc/bumblebee/xorg.conf.nvidia

Section "ServerLayout"
    Identifier  "Layout0"
    Option      "AutoAddDevices" "false"
    Option      "AutoAddGPU" "false"
EndSection

Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    VendorName  "NVIDIA Corporation"

#   If the X server does not automatically detect your VGA device,
#   you can manually set it here.
#   To get the BusID prop, run `lspci | egrep 'VGA|3D'` and input the data
#   as you see in the commented example.
#   This setting may be needed on some platforms with more than one
#   nvidia card, which may confuse the proprietary driver (e.g.,
#   trying to take ownership of the wrong device). Also needed on Ubuntu 13.04.
#   BusID "PCI:01:00:0"

#   Setting ProbeAllGpus to false prevents the newly spawned proprietary
#   driver instance from trying to control the integrated graphics card,
#   which is already being managed outside bumblebee.
#   This option doesn't hurt, and it is required on platforms running
#   more than one nvidia graphics card with the proprietary driver.
#   (E.g. Macbook Pro pre-2010 with nVidia 9400M + 9600M GT).
#   If this option is not set, the new Xorg may blacken the screen and
#   render it unusable (unless you have some way to run killall Xorg).
    Option "ProbeAllGpus" "false"

    Option "NoLogo" "true"
#    Option "UseEDID" "false"
#    Option "UseDisplayDevice" "none"
    Option "AllowEmptyInitialConfiguration"
EndSection

Section "Screen"
   Identifier "Screen0"
   Device "DiscreteNVidia"
EndSection
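
For context, bringing the external display up with this setup is essentially the standard bumblebee + intel-virtual-output sequence, roughly (display :8 as set in bumblebee.conf above):

# start bumblebeed's secondary X server on :8 (any optirun invocation will do)
optirun true
# expose the discrete card's outputs as virtual outputs on the Intel X server
intel-virtual-output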

Last edited by SmallAndSimple (2017-05-28 10:52:24)


#2 2017-10-17 10:15:02

dam
Member
Registered: 2015-04-03
Posts: 2

Re: subtle performance issues using bumblebee and intel-virtual-output

SmallAndSimple wrote:
2. The second Xorg server, the one driving the DisplayPort, constantly consumes 20%-25% CPU. This hogging prevents my CPU from dropping its clock speed and thus causes extra heat and power consumption.
I am afraid at this point that this is simply how my setup behaves, but maybe you guys know how to fix this.

Hi, I have the same problem with CPU usage (40-50%) by the second Xorg server on a Dell E6530.

00:02.0 VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)
01:00.0 VGA compatible controller: NVIDIA Corporation GF108GLM [NVS 5200M] (rev a1)
  PID USER      PR  NI    VIRT    RES  %CPU %MEM     TIME+ S COMMAND
21699 root      20   0  263,9m  63,6m  44,4  0,5 194:15.98 R Xorg

@SmallAndSimple, did you solve this problem?

