I am trying to build a desktop setup around my laptop, which runs Arch Linux with dwm. The plan is to connect all peripherals, close the lid, and use it like a PC. The output on the external monitors looked fine at first; however, as soon as I set an external monitor as primary and turn off the internal laptop display with xrandr, the external monitor begins to lag. A lot. Notably, dwm itself works just fine, but every application within it runs at 1 fps and becomes unusable, even if I restart it. Does anyone know what might be causing this?
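For reference, the xrandr commands I use to switch to the external display look roughly like this (the output names HDMI-1 and eDP-1 are examples only and vary per machine; check xrandr -q for yours):

```shell
# Query connected outputs first; names below are examples only.
xrandr -q

# Make the external monitor primary and turn the internal panel off.
xrandr --output HDMI-1 --primary --auto \
       --output eDP-1 --off
```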
===SOLUTION===
1. Install optimus-manager
2. Switch to Nvidia
Using the Nvidia GPU as primary fixes all of the lag on the external monitor and also removes the odd Xorg CPU usage that appeared whenever an external monitor was plugged in.
Last edited by DFOwl (2021-10-27 20:15:34)
Offline
Optimus? See the note:
https://wiki.archlinux.org/title/PRIME#Reverse_PRIME
Offline
- Currently when only external display is enabled, you will only get 1 FPS. See [3] for more details.
Oh. This is a big bummer. They seem to be linking to a web page where a similar issue is described along with an Xorg patch.
EDIT:
I have no idea how to apply the patch in order to fix this problem.
Last edited by DFOwl (2021-10-22 13:36:04)
Offline
Did you see and try https://gitlab.freedesktop.org/xorg/xse … te_1067790 ?
Otherwise see https://wiki.archlinux.org/title/Arch_Build_System & https://wiki.archlinux.org/title/Patching_packages
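For the second route, applying a patch via the Arch Build System looks roughly like this (a sketch only; the patch file name is a placeholder, and the PKGBUILD edits are described in the Patching_packages wiki page):

```shell
# Check out the packaging files for xorg-server (asp is in the repos).
asp export xorg-server
cd xorg-server

# Copy your patch next to the PKGBUILD, then edit the PKGBUILD:
# add it to the source=() array and apply it in prepare() with
#   patch -Np1 -i ../my-fix.patch   # (placeholder name)
# Then build and install the patched package:
makepkg -si
```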
Offline
I have seen the first one, actually. I haven't applied it yet, so I will post an update once I try LIBGL_DRI3_DISABLE=true.
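For anyone following along, that workaround goes into /etc/environment (note this disables DRI3 for all GL clients system-wide, so it is a blunt instrument, and it only takes effect after logging back in):

```
# /etc/environment -- applies to every session after re-login
LIBGL_DRI3_DISABLE=true
```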
Offline
Alright, I have edited /etc/environment, plugged in the monitor, and disabled the internal screen. The lag has mostly gone away! It no longer runs at 1 fps as before; however, there is still a very slight lag, and an entire CPU thread is consumed while the external monitor is plugged in. Still, I am glad that at least the severe lag is gone. I think I should open a separate thread for the remaining problems. If you are wondering what I am talking about, it's this: https://forums.developer.nvidia.com/t/h … -in/169173
Offline
The monitor gets detected over and over again, please check/replace cable.
True?
=> Try to use
Option "UseHotplugEvents" "False"
Offline
The monitor gets detected over and over again, please check/replace cable.
True?
=> Try to use
Option "UseHotplugEvents" "False"
Where do I put that option?
Offline
https://wiki.archlinux.org/title/Xorg#Using_.conf_files
/etc/X11/xorg.conf.d/20-nvidia.conf
Section "Device"
    Identifier "Default nvidia Device"
    Driver "nvidia"
    Option "CoolBits" "24"
    Option "TripleBuffer" "True"
    Option "UseHotplugEvents" "False"
EndSection
Offline
https://wiki.archlinux.org/title/Xorg#Using_.conf_files
/etc/X11/xorg.conf.d/20-nvidia.conf
Section "Device"
    Identifier "Default nvidia Device"
    Driver "nvidia"
    Option "CoolBits" "24"
    Option "TripleBuffer" "True"
    Option "UseHotplugEvents" "False"
EndSection
When I create the file you quoted, the system fails to boot into X11. To test whether the Options might be causing it, I commented all of the Option lines out. It still fails to boot, so this appears to be a problem with the Driver line or perhaps the file itself. Removing the file entirely allows the system to boot again.
EDIT:
My laptop has two GPUs, an Intel one and an Nvidia one. Despite reading the wiki, I remain somewhat confused as to how to configure these graphics cards (or whether manual configuration is required at all).
Last edited by DFOwl (2021-10-27 17:26:51)
Offline
Post the file you actually created as well as the log of the failing xorg server, https://wiki.archlinux.org/title/Xorg#General
Offline
Ok here:
https://pastebin.com/76bE8ZGw -- Nvidia config, direct copy of what you pasted
https://pastebin.com/BHZN22BG -- the Xorg log
Although, I have to note: I have no idea whether it actually "failed" or not. All I know is that it got stuck on the systemd boot screen and nothing would get it past that.
Offline
Server starts but there's initially no output attached (and later attachments are ignored)
Let's come back to
The monitor gets detected over and over again, please check/replace cable.
True?
Please post an xorg log from a running server (after a couple of minutes) so we can see whether, and which, output gets redetected constantly.
Offline
Oh, perhaps I should note that when I tried to boot with the file present, the laptop was not plugged into any external monitor. What puzzles me is why systemd got stuck on activating graphics. Is there a way to fix that?
Please post an xorg log from a running server (after a couple of minutes) so we can see whether, and which, output gets redetected constantly.
Sure.
Offline
It didn't - the graphical target starts, but the nvidia driver has nothing to output, so it would require PRIME offload, and that's spoiled by the config file.
You'd need a more complete config like https://wiki.archlinux.org/title/NVIDIA … phics_only and inject the option there.
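For the record, the kind of config the wiki describes for using the nvidia card as primary looks roughly like this (a sketch based on the wiki's "NVIDIA graphics only" section, with the hotplug option injected; the ModulePath lines depend on how your driver is packaged):

```
Section "OutputClass"
    Identifier "intel"
    MatchDriver "i915"
    Driver "modesetting"
EndSection

Section "OutputClass"
    Identifier "nvidia"
    MatchDriver "nvidia-drm"
    Driver "nvidia"
    Option "AllowEmptyInitialConfiguration"
    Option "PrimaryGPU" "yes"
    Option "UseHotplugEvents" "False"
    ModulePath "/usr/lib/nvidia/xorg"
    ModulePath "/usr/lib/xorg/modules"
EndSection
```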
Offline
Oh, quick news. With the file present and all of the options inside it, I booted again, but this time with the HDMI cable plugged in. And what do you know: the built-in laptop screen stays stuck on the systemd screen, while the external monitor comes up with absolutely no problem. Browsing on the external monitor is now super smooth, and there is no extra CPU use from Xorg. While doing this, I also checked KDE System Settings > About this System, and instead of reporting the Mesa driver for the integrated Intel graphics, it now says it is using NVIDIA. In conclusion, the fix effectively disables the integrated Intel graphics and makes NVIDIA the primary output; however, this breaks the internal screen, which seems to rely on the Intel graphics.
Offline
Is there a way to turn an X11 configuration on and off on the fly? This method is perfect for docking, but the biggest downside is that I would have to edit system files and reboot before docking, which is a hassle. Or maybe there is a way to load nvidia as primary and intel as secondary, so I can still use the built-in screen when I am not docked?
I am not quite sure how any of this works, so sorry if my suggestions appear ignorant.
Offline
https://wiki.archlinux.org/title/NVIDIA_Optimus
optimus-manager will do that, but it requires restarting the X11 server.
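Short of that, swapping the config in and out can be scripted; a minimal sketch (file names and paths here are assumptions, not part of any tool, and you still have to restart X afterwards):

```shell
#!/bin/sh
# dock.sh -- toggle an nvidia-primary Xorg snippet before restarting X.
# CONF_DIR, DOCK_CONF and the snippet name are assumptions; adjust them.

CONF_DIR="${CONF_DIR:-/etc/X11/xorg.conf.d}"
DOCK_CONF="${DOCK_CONF:-/etc/X11/nvidia-dock.conf}"

dock() {
    # Symlink the nvidia-primary snippet into xorg.conf.d.
    ln -sf "$DOCK_CONF" "$CONF_DIR/20-nvidia.conf"
}

undock() {
    # Drop the snippet so the next X start uses the default (offload) setup.
    rm -f "$CONF_DIR/20-nvidia.conf"
}

case "${1:-}" in
    dock)   dock ;;
    undock) undock ;;
    *)      echo "usage: $0 dock|undock" ;;
esac
# After either step, restart the X server (log out of dwm, or
# restart your display manager) for the change to take effect.
```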
You could still try whether https://wiki.archlinux.org/title/PRIME#Reverse_PRIME along the hotplug options works as well.
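For reference, reverse PRIME is set up at runtime with provider commands along these lines (the provider and output names are examples; list yours with xrandr --listproviders):

```shell
# List the render/display providers X knows about.
xrandr --listproviders

# Route the intel (modesetting) framebuffer to the nvidia-driven outputs.
xrandr --setprovideroutputsource NVIDIA-0 modesetting

# The nvidia-driven outputs then appear in plain xrandr and can be enabled.
xrandr --output HDMI-1-0 --auto --right-of eDP-1
```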
Offline
I have just tried reverse PRIME and there were absolutely no behaviour changes, so that does not appear to be working.
Offline
I installed optimus-manager and played around with it. In discrete graphics mode and hybrid graphics mode, plugging in the external monitor produces the same problem as always. However, when I switch to nvidia graphics mode, I have no problem with the external monitor whatsoever. Are there any downsides to using the nvidia card besides power drain? This seems like an incomplete fix, but a fix nonetheless.
EDIT:
Of course, it would be nice if there were a way to configure the "hybrid" PRIME mode, so if anyone knows how to do this, please add to the thread.
Last edited by DFOwl (2021-10-27 19:46:52)
Offline
Are there any downsides to using the nvidia card besides power drain?
Hardly. You have the nvidia GPU powered on anyway when using the external output (since it has to act as a CRTC hop for the intel chip).
For now, sidestepping reverse prime seems the only way to avoid the bug.
You did add the
Option "UseHotplugEvents" "False"
to the reverse prime configlet, didn't you? And with LIBGL_DRI3_DISABLE=true, did the server still hog an entire core?
Offline
Oh, I added the Option to the prime config, but I did not enable LIBGL_DRI3_DISABLE. However, I am mostly satisfied with the fix of switching over to the NVIDIA card (besides certain nvidia graphics bugs, but I am sure those can be fixed with the links you've posted in other threads).
I can try the reverse PRIME config plus the DRI3 disable, but I am not sure whether it conflicts with the config file generated by optimus-manager. This problem seems to appear primarily in hybrid graphics mode, so it must have something to do with how that mode handles the two GPUs.
Offline