I had updated to nvidia-dkms 440.44.8 on Jan 2, 2020. After rebooting, the external display attached to my laptop is no longer detected by xrandr or by the display configuration settings in KDE and GNOME. When I open nvidia-settings, the external display's name is shown there, but there is no output.
For what it's worth I'm having exact same problem.
Similar Issue opened here: https://bbs.archlinux.org/viewtopic.php?pid=1881048
cd /var/cache/pacman/pkg
check for previous versions of nvidia-utils
sudo pacman -U nvidia-utils-440.44-1-x86_64.pkg.tar.xz to downgrade
refresh the initramfs
edit /etc/pacman.conf and add nvidia-utils to IgnorePkg until the issue is resolved.
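For reference, the IgnorePkg change is a one-line edit; a sketch of the relevant part of /etc/pacman.conf (holding back just nvidia-utils, as suggested above):

```
# /etc/pacman.conf (excerpt)
[options]
# Hold the package back until the regression is resolved,
# then remove this line and update normally.
IgnorePkg = nvidia-utils
```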
Already did this; I have downgraded to nvidia 440.44.7, but this isn't a permanent fix.
Consider the case when someone does a fresh install of Arch: a permanent fix is needed.
This isn't a "bug" but a deliberate change in the configuration file, made to fix the more modern nvidia PRIME render offload setup, with the side effect that it seemingly breaks the older offloading method. See: https://bugs.archlinux.org/task/64805
As I don't think it should, have you perhaps incidentally installed xf86-video-intel? What happens if you remove it? Otherwise, just restore the original contents of /usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf into /etc/X11/xorg.conf.d.
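For reference, the pre-change contents of that file looked roughly like the sketch below; this is reconstructed from the discussion and may differ between driver versions, so restore it from your package cache or the linked bug report rather than copying this verbatim:

```
Section "OutputClass"
    Identifier "intel"
    MatchDriver "i915"
    Driver "modesetting"
EndSection

Section "OutputClass"
    Identifier "nvidia"
    MatchDriver "nvidia-drm"
    Driver "nvidia"
    Option "AllowEmptyInitialConfiguration"
    Option "PrimaryGPU" "yes"
    ModulePath "/usr/lib/nvidia/xorg"
    ModulePath "/usr/lib/xorg/modules"
EndSection
```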
@V1del Nice find.
Here are my two cents:
- I kept a copy of /usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf before updating to the latest nvidia-utils. The new copy contains the following changes:
- It deletes the OutputClass for modesetting. According to the linked bug report https://bugs.archlinux.org/task/64805 that's fine, as modesetting is the default.
- It removes the Option "PrimaryGPU" "yes" from the nvidia OutputClass.
I initially copied that file over to my /etc/X11/xorg.conf.d/. This configuration didn't work for me; my screens were not identified on reboot.
I then set just the PrimaryGPU option, et voilà, it worked!
I don't know much about PRIME render offload, but it seems to me that I don't use it, even though I have two graphics cards.
I don't have xf86-video-intel installed.
Can you please detail the fix step by step?
Thanks
Well, here's what worked for me:
- Install the latest nvidia-utils (important to do this first, because it will update the file to be copied below)
- cp /usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf /etc/X11/xorg.conf.d
- (for me, maybe not for other people) Edit /etc/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf and add the line 'Option "PrimaryGPU" "yes"' to it.
- sudo mkinitcpio -P (assuming that you've enabled the nvidia modules, as per the Arch Linux NVIDIA guide)
- reboot
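If you prefer to script the edit instead of doing it by hand, the sed call below shows the idea; it is run here against a local sample file (a trimmed-down stand-in for the real config), since the actual target in /etc/X11/xorg.conf.d would need sudo:

```shell
# Local sample standing in for 10-nvidia-drm-outputclass.conf
cat > sample.conf <<'EOF'
Section "OutputClass"
    Identifier "nvidia"
    MatchDriver "nvidia-drm"
    Driver "nvidia"
EndSection
EOF

# Insert the PrimaryGPU option right after the Driver line
sed -i '/Driver "nvidia"/a\    Option "PrimaryGPU" "yes"' sample.conf

grep 'PrimaryGPU' sample.conf
```

The same sed line pointed at /etc/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf (with sudo) should have the same effect as editing the file manually.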
Thanks, this fixed the issue.
'Option "PrimaryGPU" "yes"' in /etc/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf was required for me to fix it
I think this is a large enough breaking change that it should be announced on Arch News.
I'm the author of the other (now closed) thread.
I have a discrete video card. Should I try these steps to fix the issue? Thank you very much!
Yes.
Thanks for the replies. This did the trick for me:
1) cp /usr/share/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf /etc/X11/xorg.conf.d
2) Add this to /etc/X11/xorg.conf.d/10-nvidia-drm-outputclass.conf:
Option "PrimaryGPU" "yes"
3) mkinitcpio -P
4) Reboot
But now I have two HDMI outputs showing up, and the laptop only has one!
$ xrandr
Screen 0: minimum 8 x 8, current 3840 x 1080, maximum 32767 x 32767
HDMI-0 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 698mm x 393mm
1920x1080 60.00*+ 59.94 50.00 50.00 60.05 60.00 50.04
1680x1050 59.95
1440x900 74.98 59.89
....
640x480 75.00 72.81 59.94 59.93
eDP-1-1 connected 1920x1080+1920+0 (normal left inverted right x axis y axis) 344mm x 194mm
1920x1080 60.02*+ 60.01 59.97 59.96 59.93 47.99
1680x1050 59.95 59.88
1400x1050 59.98
...
320x180 59.84 59.32
HDMI-1-1 disconnected (normal left inverted right x axis y axis)
1680x1050 (0x1e9) 146.250MHz -HSync +VSync
h: width 1680 start 1784 end 1960 total 2240 skew 0 clock 65.29KHz
v: height 1050 start 1053 end 1059 total 1089 clock 59.95Hz
...
HDMI-1-1 shouldn't exist. Any ideas?
Last edited by skualos (2020-01-06 09:08:05)
Hello, people.
I'm having the exact same issue as you.
After doing `pacman -Syu` on 2020/01/04, my GTX 1070 stopped recognizing the two monitors it was attached to.
Note that I am using a desktop computer, not a laptop.
I've tried doing it your way, but it did not fix it for me.
Any ideas on where I should look for what went wrong? I'm still not familiar with debugging on Linux in general.
However, using nvidia-utils-beta from the AUR worked.
So for anyone having issues with the fix above, using the beta version would be a good temporary solution.
This worked perfectly for me as well. I can't believe this hasn't been announced as a breaking change; I would think everyone with a laptop and an NVIDIA GPU would be affected.
Desktop computers should not be affected by this, and they aren't what this thread is about; don't hijack threads:
@Afader Make sure GDM is set to start xorg and not even attempt wayland on nvidia. It's likely to be generally helpful.
FWIW I don't experience any of these issues, but I use KDE
I had this issue although I am on KDE.
I am still having an issue (even though it has nothing to do with the issue in this thread): with the nvidia drivers, the SDDM login screen is only displayed on the external display, not on the laptop screen. If no external display is connected, just a black screen appears on the laptop screen after boot. Typing in the password blindly and pressing Enter logs me in, but the DPI and scaling will be messed up.
Last edited by atomixhawk (2020-01-07 15:37:14)
I had this issue although I am on KDE.
No, you didn't. Afader faces a segfault in gnome-shell through libmutter.
What you describe seems far more like the original topic, just in reverse: you added the PrimaryGPU key, the server now runs on the nvidia chip, and there's no output sink configured to feed the intel IGP.
I don't know how you intend to run the system, but see https://wiki.archlinux.org/index.php/NV … phics_only
Edit: context because V1del ninja'd me…
Last edited by seth (2020-01-07 16:14:01)
Sorry, but I have no idea. I just followed what I read here.
There's nothing the beta package does differently for desktops; the fix provided here must work for you, or you didn't apply it correctly.
The PrimaryGPU option was removed for several reasons:
1) According to the xorg.conf man page, the files in /usr/share/X11 are read last, and the configuration there is *merged* into any other.
2) That prevented xf86-video-intel from working just by having either nvidia or nvidia-390xx *installed*.
3) That option also prevented PRIME render offload from working, causing the nvidia card to always be used instead of the intel card.
The best way to make outputs work again on desktops (non-Optimus) is not to add the option back, but to actually write an xorg.conf. You don't need a PrimaryGPU, but having an xorg.conf file should solve the issue. It might be necessary to add the BusID for each card, but that's not always needed.
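A minimal sketch of such a snippet for a desktop, dropped into /etc/X11/xorg.conf.d/ (the Identifier is arbitrary and the BusID value is a placeholder; take yours from lspci, noting that lspci prints hex while BusID expects decimal):

```
Section "Device"
    Identifier "NvidiaCard"
    Driver "nvidia"
    BusID "PCI:1:0:0"
EndSection
```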
Just having the files installed should not provide an nvidia-drm driver match…?
Anyway.
Making the nvidia GPU primary will make it (drumroll) the primary GPU (unless it's deactivated and the kernel module not loaded, e.g. through bbswitch, which, I assume, simply covered this usecase before)
If you've a multi-GPU system, you'll have to tell the server which GPU to use. One way or another. It's just that different groups of users will tend to have different preferences.
(Which is why I'd tend to agree that no package should implicitly select the primary GPU for the user)
You'll also have to define a sink for the output provider if you want to use an output that's not wired to the used GPU.
Just *having* an (empty) /etc/xorg.conf won't do anything and while you will have to write any configuration that is not covered by an installed package, PLEASE DON'T WRITE A STATIC XORG.CONF! Ever.
Especially not with a server or screen section (ok, unless you really know what you do and it's a very special server layout)
And VERY MOST ESPECIALLY not using "X -configure" or "nvidia-xconfig" (no, also not through nvidia-settings)
Instead add the preferred alterations into /etc/X11/xorg.conf.d
Maybe it'd be feasible to have mutually exclusive multi-GPU/optimus packages that provide the different configs, but otoh this is archlinux, so just "rtfw" ;-)
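To illustrate the snippet approach mentioned above: each concern gets its own small file under /etc/X11/xorg.conf.d, for example a keyboard snippet along these lines (file name and layout are hypothetical examples):

```
Section "InputClass"
    Identifier "system-keyboard"
    MatchIsKeyboard "on"
    Option "XkbLayout" "us"
EndSection
```

Saved as, e.g., /etc/X11/xorg.conf.d/00-keyboard.conf, it is merged into the server configuration without pinning a full static layout.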
Just having the files installed should not provide an nvidia-drm driver match…?
The problem is that the files matched on the intel driver and made them use modesetting. So, just having the nvidia package installed would prevent one from using the intel DDX driver, even if you made an /etc/X11/xorg.conf.d snippet saying you wanted to use that.
Making the nvidia GPU primary will make it (drumroll) the primary GPU (unless it's deactivated and the kernel module not loaded, e.g. through bbswitch, which, I assume, simply covered this usecase before)
Yes, it was OK for Optimus setups to have that option, because a) when using bumblebee it simply didn't matter, and b) using nvidia-xrun and optimus-manager is the same as reverse PRIME, so it didn't matter either. With the new PRIME render offload method, it does matter, because it forces the nvidia card to be the primary, effectively using reverse PRIME.
If you've a multi-GPU system, you'll have to tell the server which GPU to use. One way or another. It's just that different groups of users will tend to have different preferences.
(Which is why I'd tend to agree that no package should implicitly select the primary GPU for the user)
You'll also have to define a sink for the output provider if you want to use an output that's not wired to the used GPU.
You don't need to have a PrimaryGPU setting though. You use reverse PRIME to make the outputs that aren't wired to the displaying card available, and xrandr can use all of them in one single Screen, so all is good. It's the same case when you have an MXM Optimus setup, where one of the outputs is wired to the nvidia card.
Just *having* an (empty) /etc/xorg.conf won't do anything and while you will have to write any configuration that is not covered by an installed package, PLEASE DON'T WRITE A STATIC XORG.CONF! Ever.
I used to think like that, but given how X configuration works, it doesn't matter if you use snippets or a xorg.conf. It's all merged into one single, let's say, meta conf and used. So, avoid the all caps; it's perfectly fine to have a xorg.conf file.
And VERY MOST ESPECIALLY not using "X -configure" or "nvidia-xconfig" (no, also not through nvidia-settings)
I'd say that X -configure or nvidia-settings can give you a foundation file to work on. Don't use them blindly, but you can get a basis to work from.
Instead add the preferred alterations into /etc/X11/xorg.conf.d
Doesn't matter these days (or did it ever?)
Maybe it'd be feasible to have mutually exclusive multi-GPU/optimus packages that provide the different configs, but otoh this is archlinux, so just "rtfw" ;-)
Not sure how that would work. Also, I think nvidia is going to make the AllowNVIDIAGPUScreens option the default in the driver in the future, so I don't even think the nvidia-prime package will have utility. You're perfectly right that users should RTFM.
matched on the intel driver and made them use modesetting
Ok, but that's an adjacent problem.
You don't need to have a PrimaryGPU setting though.
If you have two VGA devices, this would turn the X11 server layout into a lottery?
What controls the active GPU?
it doesn't matter if you use snippets or a xorg.conf. It's all merged into one single
The point is about the flexibility of the setup. Technically you could have all your (local) adaptive sections in one xorg.conf, but it's of course easier to shuffle them around when they're in locally separate files (e.g. one for the keyboard, one for the GPU, one for the monitors, one for the mouse…)
Also, traditionally™ the xorg.conf holds a complete server layout description, and *that* is what should definitely be avoided, which is why…
I'd say that X -configure or nvidia-settings can give you a foundation file
… I will strictly disagree here.
For one, because esp. nvidia-xconfig also gets you a lot of cruft (e.g. kbd and mouse drivers, incl. Core<Device>…) in the config, and both methods will write complete server definitions; by no means do you want a static server layout that will then cause you headaches with every hardware change.
The autodetection is reliable and fast, so you should really only ever write those config adjustments that are required. Unless you're running a zaphod server, this is rarely a server layout or screen.
Please don't advertise this.