Ever since I installed Arch Linux about 1.5 years ago, I have had a problem with my NVIDIA graphics card. I have tried a few times to fix it, but I gave up every time because I never really needed extra performance out of my laptop. Recently, however, I have started using more graphics-intensive software, and I also want to set up dual monitors: the built-in laptop display and an external monitor connected to the laptop via HDMI.
In the past few days I have been tinkering with my xorg.conf, unable to fix my problem. But I have come across a phenomenon that I haven't seen mentioned in any forum I have checked.
It is unlikely that this is due to a recent system update because I noticed this last year while I was messing with xorg.conf, but I lost track of what I did and I couldn't reproduce it until now.
Here's my xorg.conf. The commented parts are my attempts to fix the issue and are taken from various forums. I included them in case they can give additional insight and reduce duplication of effort. I did not necessarily include all of the commented lines at once in my attempts.
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
    Option     "AllowEmptyInitialConfiguration"
EndSection

Section "Device"
    Identifier "intel"
    Driver     "modesetting"
    BusID      "PCI:0:2:0"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "intel"
    Option     "AllowEmptyInitialConfiguration"
EndSection

#Section "Module"
#    Load "nv-control"
#EndSection

#Section "ServerLayout"
#    Identifier "layout"
#    Option     "AllowNVIDIAGPUScreens"
#EndSection

#Section "Screen"
#    Identifier "Screen0"
#    Device     "intel"
#    GPUDevice  "nvidia"
#    Option     "GlxVendorLibrary" "nvidia-libgl"
#    Monitor    "Monitor0"
#    Option     "AllowEmptyInitialConfiguration"
#EndSection

#Section "Screen"
#    Identifier "Screen1"
#    Device     "intel"
#    Monitor    "Monitor1"
#    Option     "AllowEmptyInitialConfiguration"
#EndSection

#Section "Monitor"
#    Identifier "HDMI-0"
#    Option     "Primary" "true"
#EndSection

#Section "Monitor"
#    Identifier "eDP-1"
#EndSection

#Section "OutputClass"
#    Identifier  "nvidia"
#    MatchDriver "nvidia"
#    Driver      "nvidia"
#    Option      "AllowEmptyInitialConfiguration"
#    ModulePath  "/usr/lib/nvidia/xorg"
#    ModulePath  "/usr/lib/xorg/modules"
#EndSection
When I run startx with this config everything gets displayed on my laptop screen. 'xrandr --listmonitors' displays the following:
Monitors: 1
0: +*eDP-1 1920/344x1080/194+0+0 eDP-1
However, if I change the "Screen" section like this (change entry Device from "intel" to "nvidia"):
Section "Screen"
    Identifier "Screen0"
    Device     "nvidia"   # This line has changed
    Option     "AllowEmptyInitialConfiguration"
EndSection
and then run startx, everything gets displayed on the external monitor. When I run 'xrandr --listmonitors' I get this:
Monitors: 1
0: +*HDMI-0 1680/433x1050/270+0+0 HDMI-0
In either case, xrandr shows only the monitor that the output actually goes to. This is the one behaviour I haven't seen mentioned anywhere else on the internet.
There are other issues as well. For the remainder of this post, I will refer to the former configuration as the INTEL case and to the latter as the NVIDIA case.
1. I can't run 'nvidia-settings' in the INTEL case. It doesn't open and I get the following output (with --verbose):
In the NVIDIA case, nvidia-settings opens fine.
2. 'prime-run glxinfo' displays the following errors:
a) In the INTEL case:
name of display: :0
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 152 (GLX)
Minor opcode of failed request: 24 (X_GLXCreateNewContext)
Value in failed request: 0x0
Serial number of failed request: 50
Current serial number in output stream: 51
b) In the NVIDIA case:
name of display: :1
X Error of failed request: BadWindow (invalid Window parameter)
Major opcode of failed request: 156 (NV-GLX)
Minor opcode of failed request: 4 ()
Resource id in failed request: 0x1a00003
Serial number of failed request: 41
Current serial number in output stream: 41
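For context, prime-run (from the nvidia-prime package) is only a thin wrapper that sets the render-offload environment variables before running its arguments. A rough sketch of what it is equivalent to — the exact variable set may differ between package versions, so treat this as an approximation, not the packaged script:

```shell
#!/bin/sh
# Approximation of the nvidia-prime wrapper: ask the GLX/Vulkan stacks
# to render on the NVIDIA GPU, then exec the requested program.
__NV_PRIME_RENDER_OFFLOAD=1 \
__VK_LAYER_NV_optimus=NVIDIA_only \
__GLX_VENDOR_LIBRARY_NAME=nvidia \
exec "$@"
```

If that is accurate, the BadValue/BadWindow errors above would be coming from the server-side GLX/NV-GLX setup rejecting the offloaded context, not from the wrapper itself.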
3. 'gputest' always shows this, whatever demo I run:
https://i.imgur.com/qZtLgHI.png
4. 'prime-run gputest' -- there seems to be a problem with OpenGL?
libEGL warning: DRI3: Screen seems not DRI3 capable
libEGL warning: DRI2: failed to authenticate
libEGL warning: DRI3: Screen seems not DRI3 capable
libEGL warning: DRI2: failed to authenticate
ERROR: Error while querying attribute 'OpenGLVersion' on haris-gusic:1.0 (Unknown Error).
libEGL warning: DRI3: Screen seems not DRI3 capable
libEGL warning: DRI2: failed to authenticate
ERROR: Error while querying attribute 'OpenGLVersion' on haris-gusic:1.0 (Unknown Error).
libEGL warning: DRI3: Screen seems not DRI3 capable
libEGL warning: DRI2: failed to authenticate
ERROR: Error while querying attribute 'OpenGLVersion' on haris-gusic:1.0 (Unknown Error).
libEGL warning: DRI3: Screen seems not DRI3 capable
libEGL warning: DRI2: failed to authenticate
X Error of failed request: BadWindow (invalid Window parameter)
Major opcode of failed request: 156 (NV-GLX)
Minor opcode of failed request: 4 ()
Resource id in failed request: 0x1e00002
Serial number of failed request: 39
Current serial number in output stream: 39
libEGL warning: DRI3: Screen seems not DRI3 capable
libEGL warning: DRI2: failed to authenticate
5. Output of 'nvidia-bug-report.sh'
You can see it here: https://pastebin.com/NUqCsugK
I have noticed the following lines:
[ 267.158] (EE) No devices detected.
[ 267.158] (EE)
Fatal server error:
[ 267.158] (EE) no screens found(EE)
[ 267.158] (EE)
Please consult the The X.Org Foundation support
at http://wiki.x.org
for help.
[ 267.158] (EE) Please also check the log file at "/var/log/Xorg.0.log" for additional information.
[ 267.158] (EE)
[ 267.180] (EE) Server terminated with error (1). Closing log file.
And here's Xorg.log.0: https://pastebin.com/8n62SnTW
The only thing here that seems relevant is this:
[ 1738.626] (EE) Screen 1 deleted because of no matching config section.
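As I understand it, that message means the server assembled a second screen candidate but no ServerLayout section claims it, so it gets dropped. Purely as a sketch (identifiers assumed from the commented config above, untested), an explicit layout that binds two screens would look something like:

```
Section "ServerLayout"
    Identifier "layout"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
EndSection
```

Note that two separate X screens behave differently from one screen spanning both monitors; on Optimus laptops, PRIME output routing is usually the simpler approach.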
Output of 'lspci -v | grep -A 10 NVIDIA':
01:00.0 VGA compatible controller: NVIDIA Corporation GP107M [GeForce GTX 1050 Mobile] (rev a1) (prog-if 00 [VGA controller])
Subsystem: Acer Incorporated [ALI] Device 1197
Flags: bus master, fast devsel, latency 0, IRQ 133
Memory at a3000000 (32-bit, non-prefetchable) [size=16M]
Memory at 90000000 (64-bit, prefetchable) [size=256M]
Memory at a0000000 (64-bit, prefetchable) [size=32M]
I/O ports at 4000 [size=128]
Expansion ROM at a4080000 [virtual] [disabled] [size=512K]
Capabilities: <access denied>
Kernel driver in use: nvidia
Kernel modules: nouveau
--
01:00.1 Audio device: NVIDIA Corporation GP107GL High Definition Audio Controller (rev a1)
Subsystem: Acer Incorporated [ALI] Device 1197
Flags: bus master, fast devsel, latency 0, IRQ 17
Memory at a4000000 (32-bit, non-prefetchable) [size=16K]
Capabilities: <access denied>
Kernel driver in use: snd_hda_intel
Kernel modules: snd_hda_intel
Output of 'lshw -C video'
*-display
description: VGA compatible controller
product: GP107M [GeForce GTX 1050 Mobile]
vendor: NVIDIA Corporation
physical id: 0
bus info: pci@0000:01:00.0
version: a1
width: 64 bits
clock: 33MHz
capabilities: pm msi pciexpress vga_controller bus_master cap_list rom
configuration: driver=nvidia latency=0
resources: irq:16 memory:a3000000-a3ffffff memory:90000000-9fffffff memory:a0000000-a1ffffff ioport:4000(size=128) memory:a4080000-a40fffff
*-display
description: VGA compatible controller
product: HD Graphics 630
vendor: Intel Corporation
physical id: 2
bus info: pci@0000:00:02.0
version: 04
width: 64 bits
clock: 33MHz
capabilities: pciexpress msi pm vga_controller bus_master cap_list rom
configuration: driver=i915 latency=0
resources: irq:130 memory:a2000000-a2ffffff memory:b0000000-bfffffff ioport:5000(size=64) memory:c0000-dffff
Installed packages (pacman -Qs nvidia):
local/egl-wayland 1.1.6-1
EGLStream-based Wayland external platform
local/lib32-nvidia-utils 465.27-1
NVIDIA drivers utilities (32-bit)
local/libvdpau 1.4-1
Nvidia VDPAU library
local/libxnvctrl 465.27-1
NVIDIA NV-CONTROL X extension
local/nvidia 465.27-2
NVIDIA drivers for linux
local/nvidia-prime 1.0-4
NVIDIA Prime Render Offload configuration and utilities
local/nvidia-settings 465.27-1
Tool for configuring the NVIDIA graphics driver
local/nvidia-utils 465.27-1
NVIDIA drivers utilities
local/nvtop 1.1.0-2
An htop like monitoring tool for NVIDIA GPUs
local/opencl-nvidia 465.24.02-1
OpenCL implemention for NVIDIA
Some more system information (from neofetch):
OS: Arch Linux x86_64
Host: Aspire A715-71G V1.09
Kernel: 5.11.15-arch1-2
Packages: 1832 (pacman)
Shell: dash
Resolution: 1920x1080
WM: i3
Terminal: alacritty
CPU: Intel i5-7300HQ (4) @ 3.500GHz
GPU: Intel HD Graphics 630
GPU: NVIDIA GeForce GTX 1050 Mobile
I would like to thank you in advance for taking the time to read this and for your help. I hope I haven't bombarded you with too much unnecessary information.
Last edited by HarisGusic (2021-05-17 20:49:12)
https://wiki.archlinux.org/title/PRIME#Reverse_PRIME
The important bit is the output redirection w/ xrandr, check "xrandr --listproviders" to see yours for accurate indexes/symbols
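Concretely, the output-redirection step amounts to something like the following. The provider names here are assumptions (typical for modesetting + nvidia setups); substitute the ones `xrandr --listproviders` actually reports on your machine:

```shell
# List providers first; "NVIDIA-G0" and "modesetting" are typical
# names, but yours may differ.
xrandr --listproviders

# Tell the NVIDIA provider (the sink, which owns the HDMI port) to
# take its image from the modesetting provider (the source that
# renders), then enable the newly available outputs.
xrandr --setprovideroutputsource NVIDIA-G0 modesetting
xrandr --auto
```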
Not sure about the prime-run situation, though.
Please post a complete Xorg log on your current/updated xorg config.
https://wiki.archlinux.org/title/PRIME#Reverse_PRIME
The important bit is the output redirection w/ xrandr, check "xrandr --listproviders" to see yours for accurate indexes/symbols
Not sure about the prime-run situation, though.
Please post a complete Xorg log on your current/updated xorg config.
I have copied the configuration and done all the steps from the section you linked. Another problem was that the nvidia modules were not loaded for some reason, but after adding them to mkinitcpio.conf, everything works like a charm. Both monitors work, and prime-run also seems to work. The Xorg logs do not report any errors.
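For anyone who finds this later, the module fix was roughly the following. I'm assuming the standard module names shipped with nvidia-utils; verify with lsmod on a working boot:

```shell
# /etc/mkinitcpio.conf -- early-load the NVIDIA kernel modules
MODULES=(nvidia nvidia_modeset nvidia_uvm nvidia_drm)
```

then regenerate the initramfs with 'mkinitcpio -P' and reboot.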
Thank you. You have saved me hours of work!