Hi, I'm trying to get my laptop running with bumblebee. I have:
- AMD CPU with iGPU
- dedicated GPU RTX 3050
❯ lspci | grep VGA
01:00.0 VGA compatible controller: NVIDIA Corporation GA107M [GeForce RTX 3050 Mobile] (rev a1)
06:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Cezanne [Radeon Vega Series / Radeon Vega Mobile Series] (rev c6)
❯ sudo systemctl status bumblebeed
● bumblebeed.service - Bumblebee C Daemon
Loaded: loaded (/usr/lib/systemd/system/bumblebeed.service; enabled; preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Tue 2024-01-23 21:46:19 CET; 21s ago
Process: 11927 ExecStart=/usr/bin/bumblebeed (code=exited, status=1/FAILURE)
Main PID: 11927 (code=exited, status=1/FAILURE)
CPU: 2ms
❯ sudo bumblebeed
[ 2223.596014] [ERROR]No integrated video card found, quitting.
I have nvidia-open installed, and (I think) all the required drivers installed.
Is it even possible to run bumblebee with an AMD iGPU + NVIDIA dGPU?
My laptop has an HDMI port that is wired to the NVIDIA GPU and a DisplayPort that is wired to the CPU, and I want to use both, since I have two external displays. I don't do gaming, so I don't care if the tool I'd use would underperform; I just need the two displays (preferably under Wayland).
Last edited by dywersja (2024-01-23 20:49:46)
You don't want to use bumblebee on a system this new, ever; remove it.
But also, generally speaking, you probably don't want to use nvidia-open, it's still considered beta. If you want Wayland support you need to enable KMS on the nvidia card, https://wiki.archlinux.org/title/NVIDIA … de_setting (only the modeset parameter, don't set up fbdev).
Other than that you shouldn't need to do anything and this should just work.
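For reference, a minimal sketch of what the linked wiki section describes, setting only the modeset parameter (the file name nvidia-kms.conf is just a convention; check the wiki for your exact setup):

```shell
# Enable kernel mode setting for the nvidia driver (needed for Wayland).
# Note: only modeset, do NOT add fbdev here.
echo "options nvidia-drm modeset=1" | sudo tee /etc/modprobe.d/nvidia-kms.conf

# If the nvidia modules are included in your initramfs, rebuild it:
sudo mkinitcpio -P

# After rebooting, this should print Y if KMS is active:
cat /sys/module/nvidia_drm/parameters/modeset
```

Alternatively, nvidia-drm.modeset=1 can be passed as a kernel parameter instead of a modprobe option; both routes are covered on the wiki page.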
Okay, I did everything you suggested, but the display wired to the GPU is 4K 60Hz, and every other display works smoothly except for this one. It feels like 15Hz ;C
Last edited by dywersja (2024-01-25 10:57:32)
Which "wayland"? Which compositor? Generally speaking all of this is still quite in flux, and e.g. on KDE there have been big improvements to this usecase in Plasma 6, which would mean waiting a month for that to get released, or dabbling in testing and kde-unstable land. There have also been a bunch of fixes for this in the current nvidia-beta driver, which might be worth a look: https://aur.archlinux.org/packages/nvidia-beta-dkms
Or Wayland at all? Do you get this (as well) w/ X11?
In that case please post your Xorg log, https://wiki.archlinux.org/title/Xorg#General
Which "wayland"? Which compositor? Generally speaking all of this is still quite in flux, and e.g. on KDE there have been big improvements to this usecase in Plasma 6, which would mean waiting a month for that to get released, or dabbling in testing and kde-unstable land. There have also been a bunch of fixes for this in the current nvidia-beta driver, which might be worth a look: https://aur.archlinux.org/packages/nvidia-beta-dkms
I use KDE with Wayland and SDDM, since GDM requires some additional steps, and I must have done something wrong during the GDM setup, because it didn't want to launch anything under Wayland.
When I change the resolution from 4K to 1080p it works great and doesn't drop any frames. Anything above 1080p gets worse the higher the resolution.
This is persistent across all nvidia (open/beta/beta-dkms) drivers. You might start to wonder whether my monitor or cable is faulty, but it isn't, since everything works well on Windows with three displays.
Or Wayland at all? Do you get this (as well) w/ X11?
In that case please post your Xorg log, https://wiki.archlinux.org/title/Xorg#General
If I switch from Wayland to X11, the experience is quite buggy (I'm testing on nvidia-beta-dkms), but there aren't any dropped frames: a stable 4K60. I can't say it's perfect, because there is some tearing... unless I disable all the other monitors. Then the experience is fantastic, but then I can't use the other monitors.
Here are the xorg logs: https://pastebin.com/7pBYXf1T
It might be important to note that I have not generated an xorg.conf with nvidia-xconfig. When I generate one, it doesn't recognize any monitors except the one that goes to the GPU.
Your experience isn't "buggy", that's how xorg works. You should be able to configure the nvidia screen as primary so everything gets synced to its refresh rate, but all your other screens might "act up" in that case.
FWIW, from the xorg log: try getting rid of xf86-video-amdgpu and check how it behaves then. That nvidia-xconfig breaks this is normal and expected; don't use it.
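If setting "primary" in the desktop environment's display settings isn't enough, a sketch of doing it at the X level with xrandr (the output name HDMI-1-0 is a placeholder; take the real name of the nvidia-driven output from the listing commands):

```shell
# See which providers and outputs X knows about, to find
# the name of the output driven by the nvidia GPU
xrandr --listproviders
xrandr --listmonitors

# Mark that output as the X primary screen
# (replace HDMI-1-0 with the actual output name on your system)
xrandr --output HDMI-1-0 --primary
```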
...You should be able to configure the nvidia screen as primary ...
By this, do you mean simply in the display settings of the desktop environment? It's been set as primary the whole time.
[ 14.049] (II) AMDGPU(0): Output eDP connected
[ 14.049] (II) AMDGPU(0): Output DisplayPort-0 connected
[ 14.123] (--) NVIDIA(GPU-0): Philips PHL 288E2 (DFP-0): connected
There are three outputs: one on the nvidia GPU, the other two on the AMD GPU.
You're running a regular PRIME setup on the AMD chip, with the nvidia GPU used for PRIME render offloading and reverse PRIME.
[ 14.022] (==) AMDGPU(0): TearFree property default: auto
https://wiki.archlinux.org/title/AMDGPU … _rendering
And then you might need https://wiki.archlinux.org/title/PRIME# … ronization
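The first link covers the AMDGPU TearFree output property shown in the log above ("TearFree property default: auto"); as a sketch, it can be forced per output with xrandr (output names eDP and DisplayPort-0 are the AMD-driven ones from your Xorg log):

```shell
# Force TearFree on the AMD-driven outputs instead of leaving it on "auto"
xrandr --output eDP --set TearFree on
xrandr --output DisplayPort-0 --set TearFree on
```

The PRIME synchronization steps for the nvidia-driven output are in the second link.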