Hi
I have a few problems with my hybrid video cards (Intel HD 4000 and NVIDIA GT 630M). I followed the steps in this wiki https://wiki.archlinux.org/index.php/Bumblebee and tried to install everything with: pacman -S nvidia nvidia-utils nvidia-libgl (everything related to my Intel video card is already on my system). I have also installed primus-git, bbswitch-git and bumblebee-git.
The first problem I had when I installed the NVIDIA driver is that it shows me a conflict between mesa-libgl and nvidia-libgl. I "solved" this (for now) by running pacman -Sdd nvidia-libgl, but that removes mesa-libgl, and without it I can't use my Intel video card with GLX acceleration, which I need. How can I avoid this conflict?
The second thing is my NVIDIA card and Bumblebee. After all that, when I run optirun glxgears -info I get this error:
Xlib: extension "GLX" missing on display ":0.0".
Xlib: extension "GLX" missing on display ":0.0".
Error: couldn't get an RGB, Double-buffered visual
I get the same thing running primusrun.
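In case it helps, here is a rough checklist I went through (paths and unit names assume the stock Arch packages; the log file number follows the VirtualDisplay=:8 setting below):

```shell
# All paths/units here assume a default Arch/Bumblebee setup; adjust to taste.
systemctl status bumblebeed                  # is the Bumblebee daemon running?
grep -iE "glx|nvidia" /var/log/Xorg.8.log    # did the :8 server load glx and the nvidia driver?
ls /usr/lib/nvidia/xorg/                     # is the driver's Xorg module directory present?
```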
My bumblebee.conf is this:
# Configuration file for Bumblebee. Values should **not** be put between quotes
## Server options. Any change made in this section will need a server restart
# to take effect.
[bumblebeed]
# The secondary Xorg server DISPLAY number
VirtualDisplay=:8
# Should the unused Xorg server be kept running? Set this to true if waiting
# for X to be ready takes too long and you don't need power management at all.
KeepUnusedXServer=false
# The name of the Bumblebee server group (GID name)
ServerGroup=bumblebee
# Card power state at exit. Set to false if the card should be ON when the
# Bumblebee server exits.
TurnCardOffAtExit=false
# The default behavior of '-f' option on optirun. If set to "true", '-f' will
# be ignored.
NoEcoModeOverride=false
# The Driver used by Bumblebee server. If this value is not set (or empty),
# auto-detection is performed. The available drivers are nvidia and nouveau
# (See also the driver-specific sections below)
Driver=nvidia
## Client options. Will take effect on the next optirun executed.
[optirun]
# Acceleration/rendering bridge, possible values are auto, virtualgl and
# primus.
Bridge=auto
# The method used for VirtualGL to transport frames between X servers.
# Possible values are proxy, jpeg, rgb, xv and yuv.
VGLTransport=proxy
# Should the program run under optirun even if Bumblebee server or nvidia card
# is not available?
AllowFallbackToIGC=false
# Driver-specific settings are grouped under [driver-NAME]. The sections are
# parsed if the Driver setting in [bumblebeed] is set to NAME (or if auto-
# detection resolves to NAME).
# PMMethod: method to use for saving power by disabling the nvidia card, valid
# values are: auto - automatically detect which PM method to use
# bbswitch - new in BB 3, recommended if available
# switcheroo - vga_switcheroo method, use at your own risk
# none - disable PM completely
# https://github.com/Bumblebee-Project/Bumblebee/wiki/Comparison-of-PM-methods
## Section with nvidia driver specific options, only parsed if Driver=nvidia
[driver-nvidia]
# Module name to load, defaults to Driver if empty or unset
KernelDriver=nvidia
PMMethod=auto
# colon-separated path to the nvidia libraries
LibraryPath=/usr/lib/nvidia:/usr/lib32/nvidia
# comma-separated path of the directory containing nvidia_drv.so and the
# default Xorg modules path
XorgModulePath=/usr/lib/nvidia/xorg/,/usr/lib/xorg/modules
XorgConfFile=/etc/bumblebee/xorg.conf.nvidia
## Section with nouveau driver specific options, only parsed if Driver=nouveau
[driver-nouveau]
KernelDriver=nouveau
PMMethod=auto
XorgConfFile=/etc/bumblebee/xorg.conf.nouveau
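After editing /etc/bumblebee/bumblebee.conf the daemon has to be restarted before any change is picked up; a minimal sketch, assuming the systemd unit name shipped by the Arch package:

```shell
systemctl restart bumblebeed   # reload the server-side config (needs root)
optirun glxgears -info         # then re-test
```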
lspci | grep VGA:
00:02.0 VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)
01:00.0 VGA compatible controller: NVIDIA Corporation GF108M [GeForce GT 630M] (rev ff)
My /etc/bumblebee/xorg.conf.nvidia is this:
Section "ServerLayout"
Identifier "Layout0"
Option "AutoAddDevices" "false"
EndSection
Section "Device"
Identifier "Device1"
Driver "nvidia"
VendorName "NVIDIA Corporation"
# If the X server does not automatically detect your VGA device,
# you can manually set it here.
# To get the BusID prop, run `lspci | grep VGA` and input the data
# as you see in the commented example.
# This Setting may be needed in some platforms with more than one
# nvidia card, which may confuse the proprietary driver (e.g.,
# trying to take ownership of the wrong device).
BusID "PCI:01:00:0"
# Setting ProbeAllGpus to false prevents the new proprietary driver
# instance spawned to try to control the integrated graphics card,
# which is already being managed outside bumblebee.
# This option doesn't hurt and it is required on platforms running
# more than one nvidia graphics card with the proprietary driver.
# (E.g. Macbook Pro pre-2010 with nVidia 9400M + 9600M GT).
# If this option is not set, the new Xorg may blacken the screen and
# render it unusable (unless you have some way to run killall Xorg).
Option "ProbeAllGpus" "false"
Option "NoLogo" "true"
Option "UseEDID" "false"
Option "UseDisplayDevice" "none"
Option "ConnectedMonitor" "DFP"
EndSection
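On the BusID line: lspci prints the address in hex as bus:device.function, while xorg.conf conventionally takes decimal "PCI:bus:device:function". For a card at 01:00.0 the two happen to coincide, but a sketch of the conversion (the variable names are my own, and the lspci line is hard-coded here for illustration):

```shell
# Hypothetical helper: turn an lspci address (hex BB:DD.F) into the
# decimal "PCI:bus:device:function" form used in xorg.conf.
line="01:00.0 VGA compatible controller: NVIDIA Corporation GF108M [GeForce GT 630M] (rev ff)"
addr=${line%% *}                      # 01:00.0
bus=$(printf '%d' "0x${addr%%:*}")    # hex bus -> decimal
devfn=${addr#*:}
dev=$(printf '%d' "0x${devfn%%.*}")   # hex device -> decimal
fn=$(printf '%d' "0x${devfn#*.}")     # hex function -> decimal
busid="PCI:$bus:$dev:$fn"
echo "BusID \"$busid\""               # prints: BusID "PCI:1:0:0"
```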
ls /usr/lib/nvidia
libGL.so libGL.so.1 libGL.so.319.17 xorg
ls /usr/lib | grep libGL.so
libGL.so
libGL.so.1
libGL.so.319.17
Excuse my English; it is not my native language.
English is not my native language. So please, tell me if I am writing it wrong.
The first problem I had when I installed the NVIDIA driver is that it shows me a conflict between mesa-libgl and nvidia-libgl. I "solved" this (for now) by running pacman -Sdd nvidia-libgl, but that removes mesa-libgl, and without it I can't use my Intel video card with GLX acceleration, which I need. How can I avoid this conflict?
There is no conflict. Install mesa-libgl and everything should work fine.
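A minimal sketch of the switch back, assuming the repository package names above (the two packages conflict on purpose, so pacman removes nvidia-libgl when mesa-libgl is installed as the GL provider):

```shell
pacman -S mesa-libgl lib32-mesa-libgl   # run as root; confirm removal of nvidia-libgl when prompted
```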
Thanks for the answer, sano. I installed mesa-libgl and lib32-mesa-libgl (I forgot to mention that my system is x86_64) with the command pacman -Sdd mesa-libgl lib32-mesa-libgl, and that automatically removed lib32-nvidia-libgl and nvidia-libgl without problems.
Now when I run glxgears -info it shows the GLX info and gives me about 60 FPS on the Intel. If I run primusrun glxgears -info it gives me the same 60 FPS (that's strange, I think; it should be more). Same thing for glxspheres and/or using optirun instead of primusrun.
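For what it's worth, a steady 60 FPS from glxgears usually just means the driver is syncing to a 60 Hz display, so it is not a real benchmark; a quick sketch to compare throughput without the vsync cap (vblank_mode is a Mesa environment variable, and primus is documented to honor it too):

```shell
vblank_mode=0 glxgears             # Intel, uncapped
vblank_mode=0 primusrun glxgears   # NVIDIA through primus, uncapped
```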
I ran some games, mostly from Steam, and the difference between the Intel and the NVIDIA is huge (in terms of performance). So I think Bumblebee is working fine and using the NVIDIA card.
Problem Solved. Thanks.