Hi everyone.
This is my first post here. I already have some experience with Linux, but I find this problem quite challenging and would appreciate any help.
So I've got a Lenovo P50 laptop which has:
Intel HD Graphics 530
Nvidia Quadro M1000M
a built-in 4K (3840x2160) screen driven by the Intel GPU
two external Full HD (1920x1080) screens driven by the Nvidia GPU (all external video ports are hardwired to the discrete card)
I'm running:
Kernel 4.8.11
the "intel" video driver (not modesetting)
the "nvidia" video driver (not nouveau)
Xorg (Wayland disabled), Bumblebee, GNOME, GDM
What does work (when no external monitors are connected):
2D/3D acceleration runs fine on the Intel GPU
the built-in screen works at its native resolution
Bumblebee works perfectly, rendering 3D graphics on the Nvidia GPU and transferring the frames to the Intel GPU for display (without compression)
the system is nice and stable
Now what happens when I plug in external monitors...
Because my ports are wired to the Nvidia card, I have to run two X servers, so I'm using:
intel-virtual-output -f
Two external monitors successfully start up and are working just fine.
But!! The built-in display drops its resolution from 4K to 1080p (matching the external monitors) and stays there.
The native resolution (3840x2160) is still visible in the settings screen, but if I switch the built-in display back to 4K, the system completely freezes. (Visually, when it's frozen, I see a collage of all three screens on my built-in display.)
Another issue: when I'm done with the external monitors and interrupt intel-virtual-output (with Ctrl+C), my built-in screen stays at the low resolution, and if I dare to change it back, the system freezes again. So I have to restart every time I disconnect the external monitors.
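In case it helps with debugging: after intel-virtual-output exits, the VIRTUAL outputs it created on the Intel X server may be left enabled, which can pin the screen configuration at 1080p. A sketch of a command-line recovery attempt, instead of rebooting — the output names here (VIRTUAL1, VIRTUAL2, eDP1) are typical for the intel driver but are assumptions; check the actual names with xrandr first:

```shell
# List outputs and modes to find the real names on your system
# (the built-in panel is often eDP1 with the intel driver).
xrandr --query

# Turn off the leftover virtual outputs that intel-virtual-output used,
# then try restoring the panel's native mode.
xrandr --output VIRTUAL1 --off
xrandr --output VIRTUAL2 --off
xrandr --output eDP1 --mode 3840x2160
```

This doesn't address the freeze itself, but it may show whether the stuck mode is a leftover-output problem or a deeper driver issue.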
I think I've followed pretty much every wiki article I could find and have already gotten quite far, but this last bit is driving me crazy!
Can I somehow make this configuration work and keep the native 4K resolution on the built-in display and native 1080p on the two external monitors at the same time?
Oh yes, a couple of config changes I've made:
/etc/bumblebee/xorg.conf.nvidia:
Option "AutoAddDevices" "true"
...
Option "UseEDID" "true"
#Option "UseDisplayDevice" "none" (commented out)
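For context, here is roughly where those options sit in the stock Bumblebee config as I understand it — treat this as a sketch, not a drop-in file; the identifiers, BusID, and the exact option set vary between installs:

```
Section "ServerLayout"
    Identifier  "Layout0"
    # Changed from the shipped "false" so hotplugged outputs are picked up
    Option      "AutoAddDevices" "true"
EndSection

Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    # BusID must match your card; check with lspci
    Option      "UseEDID" "true"
    # Option    "UseDisplayDevice" "none"
EndSection
```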
Thanks!
Do you really need the dGPU or would you just like to get the external screens working? In that case, consider using PRIME output slaving instead of Bumblebee.
https://wiki.archlinux.org/index.php/PRIME
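With output slaving, the idea is that the Intel GPU keeps rendering everything while the discrete GPU only scans out to its hardwired ports, so all outputs appear on one X server. A rough sketch of the setup — the provider names ("nouveau", "Intel") and output names are examples and will differ depending on driver and hardware:

```shell
# Show the available providers and their names.
xrandr --listproviders

# Let the discrete GPU's outputs display content rendered by the Intel GPU
# (here assuming the nouveau driver; names come from --listproviders).
xrandr --setprovideroutputsource nouveau Intel

# The dGPU's ports now show up as ordinary outputs and can be arranged
# next to the 4K panel like any other screen.
xrandr --output eDP1 --mode 3840x2160 --output HDMI-1-1 --auto --right-of eDP1
```

This avoids running a second X server entirely, which is why it may sidestep the resolution/freeze problems you're seeing with intel-virtual-output.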
I am using it with nouveau, see
https://github.com/Bumblebee-Project/Bu … -252312116