#26 2023-01-18 03:50:13

headkase
Member
Registered: 2011-12-06
Posts: 1,975

Re: [SOLVED] Forcing X server to only support monitor's native resolution

hematinik wrote:

Trust me, it sticks in my memory for good, labeled as "The Great Headkase's Advice in the Hall of Linux Gods"

lol, right on man.  And now back to your regularly scheduled programming. wink

#27 2023-01-18 06:33:43

seth
Member
Registered: 2012-09-03
Posts: 49,953

Re: [SOLVED] Forcing X server to only support monitor's native resolution

headkase wrote:

If this works you can use it for everything.  The minimum nVidia driver is R515 and the official repos are carrying R525.  So if you're in sync with the official repos then you should have what you need.

Jan 16 01:35:48 mbp /usr/lib/gdm-x-session[618]: (II) NVIDIA GLX Module  340.108  Wed Dec 11 14:26:50 PST 2019

I don't understand why the config isn't even parsed; IIRC nvidia has read across the Device and Screen sections ever since I tried to make it run Quake3…

Let's see:

Section "Monitor"
    Identifier     "DP-0"
    Modeline       "1920x1080_60"  173.00  1920 2048 2248 2576  1080 1083 1088 1120 -hsync +vsync
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GT 330M"
    Option         "UseEdidDpi" "False"
    Option         "DPI" "96x96"
    Option         "Gnarf" "False"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Option         "ModeValidation" "DP-0: NoEdidModes, NoPredefinedModes, NoXServerModes, NoVesaModes"
    Option         "metamodes" "DP-0: 1920x1080_60 +0+0"
    Option         "Grumpf" "False"
EndSection

You might have noticed that there are two invalid options in there, and I'd expect them to show up in the xorg log in one way or another.

#28 2023-01-18 07:01:04

hematinik
Member
Registered: 2022-03-23
Posts: 34

Re: [SOLVED] Forcing X server to only support monitor's native resolution

Looks like the "Screen" section is essential.

The log is reversed (newest first):

Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) LoadModule: "dri2"
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) Loading sub module "dri2"
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (WW) NVIDIA(0): Option "Grumpf" is not used
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (WW) NVIDIA(0): Option "Gnarf" is not used
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (WW) NVIDIA(0): Option "PrimaryGPU" is not used
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (==) NVIDIA(0): DPMS enabled
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (==) NVIDIA(0): Silken mouse enabled
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (==) NVIDIA(0): Backing store enabled
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (==) NVIDIA(0): Disabling shared memory pixmaps
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0): Setting mode "DFP-1:nvidia-auto-select"
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0):     Config Options in the README.
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0):     "AcpidSocketPath" X configuration options in Appendix B: X
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0):     details, please see the "ConnectToAcpid" and
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0):     try to use it to receive ACPI event notifications.  For
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0):     ACPI event daemon is available, the NVIDIA X driver will
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0):     configuration option may not be set correctly.  When the
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0):     may not be running or the "AcpidSocketPath" X
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0): ACPI: failed to connect to the ACPI event daemon; the daemon
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA: Using 768.00 MB of virtual memory for indirect memory access.
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0): DPI set to (96, 96); computed from "DPI" X config option
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0): Virtual screen size determined to be 800 x 600
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0):     "DFP-1:nvidia-auto-select"
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0): Validated MetaModes:
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (WW) NVIDIA(0):
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (WW) NVIDIA(0):     "nvidia-auto-select".
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (WW) NVIDIA(0): Unable to validate any modes; falling back to the default mode
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (WW) NVIDIA(0):
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (WW) NVIDIA(0): No valid modes for "DP-0:1920x1080_60+0+0"; removing.
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(GPU-0):     NoPredefinedModes
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(GPU-0):     NoXServerModes
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(GPU-0):     NoEdidModes
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(GPU-0):     NoVesaModes
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(GPU-0): Mode Validation Overrides for SAMSUNG (DFP-1):
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0):     enabled on all display devices.)
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0):     device SAMSUNG (DFP-1) (Using EDID frequencies has been
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0):     been enabled on all display devices.)
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0):     device Apple Color LCD (DFP-0) (Using EDID frequencies has
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(GPU-0): DFP-2: 480.0 MHz maximum pixel clock
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(0): DFP-2: Internal DisplayPort
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(GPU-0): SAMSUNG (DFP-1): 165.0 MHz maximum pixel clock
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(0): SAMSUNG (DFP-1): Internal TMDS
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(GPU-0): Apple Color LCD (DFP-0): 330.0 MHz maximum pixel clock
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(0): Apple Color LCD (DFP-0): Internal LVDS
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(0):     DFP-2
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(0):     SAMSUNG (DFP-1) (connected)
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(0):     Apple Color LCD (DFP-0) (boot, connected)
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(0): Valid display device(s) on GeForce GT 330M at PCI:1:0:0
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0): Detected PCI Express Link width: 16X
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(0): VideoBIOS: 70.16.58.0a.00
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (--) NVIDIA(0): Memory: 524288 kBytes
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0): NVIDIA GPU GeForce GT 330M (GT216) at PCI:1:0:0 (GPU-0)
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0):     stereo.
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0): Display (SAMSUNG (DFP-1)) does not support NVIDIA 3D Vision
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0):     Vision stereo.
Jan 18 10:25:19 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0): Display (Apple Color LCD (DFP-0)) does not support NVIDIA 3D
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0): Enabling 2D acceleration
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0): Option "MetaModes" "DP-0: 1920x1080_60 +0+0"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0): Option "AllowEmptyInitialConfiguration"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0): Option "ModeValidation" "DP-0: NoEdidModes, NoPredefinedModes, NoXServerModes, NoVesaModes"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0): Option "DPI" "96x96"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (**) NVIDIA(0): Option "UseEdidDpi" "False"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) Applying OutputClass "nvidia" options to /dev/dri/card0
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (==) NVIDIA(0): Default visual is TrueColor
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (==) NVIDIA(0): RGB weight 888
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]:         "Screen0" for depth/fbbpp 24/32
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA(0): Creating default Display subsection in Screen section
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) Module "ramdac" already built-in
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) LoadModule: "ramdac"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) Loading sub module "ramdac"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]:         ABI class: X.Org ANSI C Emulation, version 0.4
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]:         compiled for 1.21.1.4, module version = 1.0.0
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) Module wfb: vendor="X.Org Foundation"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) Loading /usr/lib/xorg/modules/libwfb.so
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) LoadModule: "wfb"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) Loading sub module "wfb"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (WW) Unresolved symbol: fbGetGCPrivateKey
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) Module "fb" already built-in
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) LoadModule: "fb"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) Loading sub module "fb"
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (II) NVIDIA dlloader X Driver  340.108  Wed Dec 11 14:06:00 PST 2019
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (WW) NVIDIA: This driver was compiled against the X.Org server SDK from commit e6ef2b12404dfec7f23592a3524d2a63d9d25802 and may not be compatible with the final version of this SDK.
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: (WW) NVIDIA: The driver will continue to load, but may behave strangely.
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: =================================================================
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: server with a supported driver ABI.
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: http://www.nvidia.com/ for driver updates or downgrade to an X
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: driver does not officially support.  Please check
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: This server has a video driver ABI version of 25.2 that this
Jan 18 10:25:18 mbp /usr/lib/gdm-x-session[34484]: ================ WARNING WARNING WARNING WARNING ================

Last edited by hematinik (2023-01-21 15:46:48)

#29 2023-01-18 11:01:31

seth
Member
Registered: 2012-09-03
Posts: 49,953

Re: [SOLVED] Forcing X server to only support monitor's native resolution

Option         "ModeValidation" "DP-0: AllowNonEdidModes, NoEdidModes, NoPredefinedModes, NoXServerModes, NoVesaModes"

#30 2023-01-18 11:39:15

hematinik
Member
Registered: 2022-03-23
Posts: 34

Re: [SOLVED] Forcing X server to only support monitor's native resolution

She's tough neutral

Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) LoadModule: "dri2"
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) Loading sub module "dri2"
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (WW) NVIDIA(0): Option "Grumpf" is not used
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (WW) NVIDIA(0): Option "Gnarf" is not used
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (WW) NVIDIA(0): Option "PrimaryGPU" is not used
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (==) NVIDIA(0): DPMS enabled
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (==) NVIDIA(0): Silken mouse enabled
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (==) NVIDIA(0): Backing store enabled
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (==) NVIDIA(0): Disabling shared memory pixmaps
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0): Setting mode "DFP-1:nvidia-auto-select"
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0):     Config Options in the README.
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0):     "AcpidSocketPath" X configuration options in Appendix B: X
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0):     details, please see the "ConnectToAcpid" and
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0):     try to use it to receive ACPI event notifications.  For
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0):     ACPI event daemon is available, the NVIDIA X driver will
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0):     configuration option may not be set correctly.  When the
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0):     may not be running or the "AcpidSocketPath" X
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0): ACPI: failed to connect to the ACPI event daemon; the daemon
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA: Using 768.00 MB of virtual memory for indirect memory access.
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0): DPI set to (96, 96); computed from "DPI" X config option
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0): Virtual screen size determined to be 800 x 600
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0):     "DFP-1:nvidia-auto-select"
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0): Validated MetaModes:
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (WW) NVIDIA(0):
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (WW) NVIDIA(0):     "nvidia-auto-select".
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (WW) NVIDIA(0): Unable to validate any modes; falling back to the default mode
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (WW) NVIDIA(0):
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (WW) NVIDIA(0): No valid modes for "DP-0:1920x1080_60+0+0"; removing.
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(GPU-0):     AllowNonEdidModes
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(GPU-0):     NoPredefinedModes
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(GPU-0):     NoXServerModes
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(GPU-0):     NoEdidModes
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(GPU-0):     NoVesaModes
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(GPU-0): Mode Validation Overrides for SAMSUNG (DFP-1):
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0):     enabled on all display devices.)
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0):     device SAMSUNG (DFP-1) (Using EDID frequencies has been
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0):     been enabled on all display devices.)
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0):     device Apple Color LCD (DFP-0) (Using EDID frequencies has
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0): Using HorizSync/VertRefresh ranges from the EDID for display
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(GPU-0): DFP-2: 480.0 MHz maximum pixel clock
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(0): DFP-2: Internal DisplayPort
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(GPU-0): SAMSUNG (DFP-1): 165.0 MHz maximum pixel clock
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(0): SAMSUNG (DFP-1): Internal TMDS
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(GPU-0): Apple Color LCD (DFP-0): 330.0 MHz maximum pixel clock
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(0): Apple Color LCD (DFP-0): Internal LVDS
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(0):     DFP-2
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(0):     SAMSUNG (DFP-1) (connected)
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(0):     Apple Color LCD (DFP-0) (boot, connected)
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(0): Valid display device(s) on GeForce GT 330M at PCI:1:0:0
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0): Detected PCI Express Link width: 16X
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(0): VideoBIOS: 70.16.58.0a.00
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (--) NVIDIA(0): Memory: 524288 kBytes
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0): NVIDIA GPU GeForce GT 330M (GT216) at PCI:1:0:0 (GPU-0)
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0):     stereo.
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0): Display (SAMSUNG (DFP-1)) does not support NVIDIA 3D Vision
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0):     Vision stereo.
Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0): Display (Apple Color LCD (DFP-0)) does not support NVIDIA 3D
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0): Enabling 2D acceleration
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0): Option "MetaModes" "DP-0: 1920x1080_60 +0+0"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0): Option "AllowEmptyInitialConfiguration"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0): Option "ModeValidation" "DP-0: AllowNonEdidModes, NoEdidModes, NoPredefinedModes, NoXServerModes, NoVesaModes"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0): Option "DPI" "96x96"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (**) NVIDIA(0): Option "UseEdidDpi" "False"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) Applying OutputClass "nvidia" options to /dev/dri/card0
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (==) NVIDIA(0): Default visual is TrueColor
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (==) NVIDIA(0): RGB weight 888
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]:         "Screen0" for depth/fbbpp 24/32
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA(0): Creating default Display subsection in Screen section
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) Module "ramdac" already built-in
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) LoadModule: "ramdac"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) Loading sub module "ramdac"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]:         ABI class: X.Org ANSI C Emulation, version 0.4
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]:         compiled for 1.21.1.4, module version = 1.0.0
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) Module wfb: vendor="X.Org Foundation"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) Loading /usr/lib/xorg/modules/libwfb.so
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) LoadModule: "wfb"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) Loading sub module "wfb"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (WW) Unresolved symbol: fbGetGCPrivateKey
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) Module "fb" already built-in
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) LoadModule: "fb"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) Loading sub module "fb"
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (II) NVIDIA dlloader X Driver  340.108  Wed Dec 11 14:06:00 PST 2019
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (WW) NVIDIA: This driver was compiled against the X.Org server SDK from commit e6ef2b12404dfec7f23592a3524d2a63d9d25802 and may not be compatible with the final version of this SDK.
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: (WW) NVIDIA: The driver will continue to load, but may behave strangely.
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: =================================================================
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: server with a supported driver ABI.
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: http://www.nvidia.com/ for driver updates or downgrade to an X
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: driver does not officially support.  Please check
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: This server has a video driver ABI version of 25.2 that this
Jan 18 15:05:41 mbp /usr/lib/gdm-x-session[16556]: ================ WARNING WARNING WARNING WARNING ================

Last edited by hematinik (2023-01-21 15:49:18)

#31 2023-01-18 11:48:33

hematinik
Member
Registered: 2022-03-23
Posts: 34

Re: [SOLVED] Forcing X server to only support monitor's native resolution

Is it possible to override modes after the server starts, like what you do with xrandr script files? This is what I use on another install which uses Intel graphics. Can we do something similar with NVIDIA?

#!/bin/sh
xrandr --output LVDS-1 --off --output VGA-1 --off --output HDMI-1 --primary --mode 1920x1080 --pos 0x0 --rotate normal --output DP-1 --off

EDIT: I mean doing more than just turning off the other display and setting resolution!

Last edited by hematinik (2023-01-18 11:49:56)

#32 2023-01-18 11:57:42

seth
Member
Registered: 2012-09-03
Posts: 49,953

Re: [SOLVED] Forcing X server to only support monitor's native resolution

This changes the active mode and should™ work w/ the nvidia drivers as well, but it doesn't prevent anything from trying to select a different (available) mode.

This here is the problem

Jan 18 15:05:42 mbp /usr/lib/gdm-x-session[16556]: (WW) NVIDIA(0): Unable to validate any modes; falling back to the default mode

and for some reason the

Modeline       "1920x1080_60"  173.00  1920 2048 2248 2576  1080 1083 1088 1120 -hsync +vsync

doesn't show up in the xorg log.

You can however try to use xrandr --addmode/rmmode/delmode to edit the modes at runtime.

#33 2023-01-20 17:42:00

hematinik
Member
Registered: 2022-03-23
Posts: 34

Re: [SOLVED] Forcing X server to only support monitor's native resolution

So as it turns out, this problem is more than just an edge case. I cannot perform the simple task of adding a new mode via xrandr, and after some googling I found similar cases with no certain solution. No matter what's inside xorg.conf, the results are always the same:

$ xrandr --newmode "1024x768_50.00"   52.00  1024 1072 1168 1312  768 771 775 793 -hsync +vsync
$ xrandr --addmode DP-0 1024x768_50.00

X Error of failed request:  BadMatch (invalid parameter attributes)
  Major opcode of failed request:  140 (RANDR)
  Minor opcode of failed request:  18 (RRAddOutputMode)
  Serial number of failed request:  27
  Current serial number in output stream:  28

I tried the solution suggested here to no avail:
https://bbs.archlinux.org/viewtopic.php?id=255287

I also tried some from here:
https://askubuntu.com/questions/235507/ … r-badmatch

The longer I'm unable to solve this, the stronger my OCD gets. I might need to visit a shrink after this…

Last edited by hematinik (2023-01-20 17:51:32)

#34 2023-01-20 19:14:13

seth
Member
Registered: 2012-09-03
Posts: 49,953

Re: [SOLVED] Forcing X server to only support monitor's native resolution

https://bbs.archlinux.org/viewtopic.php?id=255287 resulted in using a viewport transformation; it doesn't alter the available modes.

X Error of failed request:  BadMatch (invalid parameter attributes)

You'll have to add "AllowNonEdidModes" and perhaps some or all of the "No*Check" options to the ModeValidation.
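Following up on that suggestion, a sketch of what the extended option could look like, assuming the "No*Check" tokens documented in Appendix B of the nvidia driver README (NoMaxPClkCheck, NoHorizSyncCheck, NoVertRefreshCheck); which of them are actually needed here is a guess:

```
Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    # Assumption: the pixel-clock / sync-range checks are what reject the mode
    Option         "ModeValidation" "DP-0: AllowNonEdidModes, NoEdidModes, NoPredefinedModes, NoXServerModes, NoVesaModes, NoMaxPClkCheck, NoHorizSyncCheck, NoVertRefreshCheck"
EndSection
```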

#35 2023-01-21 10:57:44

hematinik
Member
Registered: 2022-03-23
Posts: 34

Re: [SOLVED] Forcing X server to only support monitor's native resolution

I found the pesky bug that did not allow us to define modelines.
"AllowNonEdidModes" is necessary but not enough: the modelines produced by "cvt" or "gtf" were all 60 Hz, and while my monitor DOES support 1920x1080 at 60 Hz, that mode could not be expressed in a modeline! Strangely, 50 Hz modelines with the "_50.00" suffix are OK. I used the output from "xvidtune -show" for the modeline, and it worked after I added "AllowNonEdidModes".

The ultimate xorg.conf:

# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings:  version 340.108  (buildmeister@swio-display-x64-rhel04-01)  Wed Dec 11 15:13:22 PST 2019

Section "Monitor"
    # HorizSync source: edid, VertRefresh source: edid
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "SAMSUNG"

    # THIS MODELINE DOES NOT WORK!
    # Modeline      "1920x1080_60.00"  173.00  1920 2048 2248 2576  1080 1083 1088 1120 -hsync +vsync

    # THIS MODELINE WORKS
    # Modeline      "1920x1080_50.00"  141.50  1920 2032 2232 2544  1080 1083 1088 1114 -hsync +vsync

    # THIS IS THE DEFAULT MODELINE XORG USES WHEN NO MODELINE IS PROVIDED BY USER
    Modeline       "1920x1080"   148.50   1920 2008 2052 2200   1080 1084 1089 1125 +hsync +vsync

    HorizSync       15.0 - 135.0
    VertRefresh     24.0 - 75.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GT 330M"
    Option         "UseEDID" "false"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Stereo" "0"

    Option         "ModeValidation" "DP-0: NoEdidModes, NoPredefinedModes, NoXServerModes, NoVesaModes, AllowNonEdidModes"
    Option         "metamodes" "DP-0: 1920x1080_60 +0+0"

    Option         "SLI" "Off"
    Option         "MultiGPU" "Off"
    Option         "BaseMosaic" "off"
    SubSection     "Display"
        Depth       24
        Option      "UseEdidDpi" "False"
        Option      "DPI" "96x96"
    EndSubSection
EndSection

Now I can add/remove modes using xrandr, and this is what I have after the X server starts:

$ xrandr -q

Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 8192 x 8192
LVDS-0 connected primary (normal left inverted right x axis y axis)
   800x600       60.32 +
DP-0 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 0mm x 0mm
   1920x1080     60.00*+
DP-1 disconnected (normal left inverted right x axis y axis)

There is a big BUT here (no pun intended): xrandr 1.1 still shows the same result as before, which is not the output that Wine demands.

$ xrandr --q1

 SZ:    Pixels          Physical       Refresh
*0   1920 x 1080   ( 508mm x 285mm )  *50  
 1   1680 x 1050   ( 444mm x 277mm )   51  
 2   1440 x 900    ( 381mm x 238mm )   52  
 3   1366 x 768    ( 361mm x 203mm )   53  
 4   1280 x 1024   ( 338mm x 270mm )   54  
 5   1280 x 800    ( 338mm x 211mm )   55  
 6   1280 x 720    ( 338mm x 190mm )   56  
 7   1024 x 768    ( 270mm x 203mm )   57  
 8    800 x 600    ( 211mm x 158mm )   58  
 9    640 x 480    ( 169mm x 127mm )   59  
Current rotation - normal
Current reflection - none
Rotations possible - normal left inverted right 
Reflections possible - X Axis Y Axis

Last edited by hematinik (2023-01-21 11:24:09)

#36 2023-01-21 15:58:48

seth
Member
Registered: 2012-09-03
Posts: 49,953

Re: [SOLVED] Forcing X server to only support monitor's native resolution

Strangely, 50 Hz modelines with the "_50.00" suffix are OK

The modeline label is irrelevant, you added a 50Hz modeline.
Also

cvt12 1920 1080 50
# 1920x1080 @ 50.000 Hz (CVT) field rate 49.929 Hz; hsync: 55.621 kHz; pclk: 141.50 MHz
Modeline "1920x1080_50.00"  141.50  1920 2032 2232 2544  1080 1083 1088 1114 -hsync +vsync

The modeline that doesn't work has a pclk of 173.00, the 50 Hz one 141.50, and the default one 148.50.
Either of

cvt12 1920 1080 60 -r
# 1920x1080 @ 60.000 Hz Reduced Blank (CVT) field rate 59.934 Hz; hsync: 66.587 kHz; pclk: 138.50 MHz
Modeline "1920x1080_60.00_rb1"  138.50  1920 1968 2000 2080  1080 1083 1088 1111 +hsync -vsync
cvt12 1920 1080 60 -b
# 1920x1080 @ 60.000 Hz Reduced Blank (CVT) field rate 60.000 Hz; hsync: 66.660 kHz; pclk: 133.32 MHz
Modeline "1920x1080_60.00_rb2"  133.32  1920 1928 1960 2000  1080 1097 1105 1111 +hsync -vsync

will likely work as well.
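The pixel clock is what sets the refresh rate: refresh = pclk / (htotal × vtotal), so you can check a modeline before feeding it to X. A quick sketch, plugging in the modelines from this thread:

```shell
# Refresh rate implied by a modeline: pixel clock / (htotal * vtotal)
refresh() {  # args: pclk_in_MHz htotal vtotal
    awk -v p="$1" -v h="$2" -v v="$3" 'BEGIN { printf "%.2f\n", p * 1e6 / (h * v) }'
}
refresh 173.00 2576 1120   # rejected modeline  -> 59.96 (a true 60 Hz mode)
refresh 141.50 2544 1114   # working 50 Hz one  -> 49.93
refresh 148.50 2200 1125   # default modeline   -> 60.00
```

This makes seth's point concrete: the label after the mode name is cosmetic; only the timing numbers decide what the monitor is asked to do.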

xrandr 1.1 still shows the same result as before, which is not the output that Wine demands

Why does wine care about the randr 1.1 protocol? It didn't - back in 2001 - support output management, so that's no surprise.
Did this actually work w/o the external output?
What if you extend modevalidation and metamode selection to the internal display?
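For that last suggestion, a sketch of what extending both options to the internal panel might look like, using the device names from the log (DFP-0 is the Apple panel here); per-device ModeValidation lists are semicolon-separated, and "NULL" disabling a display within a metamode is nvidia metamode syntax - whether any of this changes what Wine sees is an open question:

```
Option "ModeValidation" "DP-0: NoEdidModes, NoPredefinedModes, NoXServerModes, NoVesaModes, AllowNonEdidModes; DFP-0: NoEdidModes, NoPredefinedModes, NoXServerModes, NoVesaModes, AllowNonEdidModes"
Option "metamodes" "DP-0: 1920x1080_60 +0+0, DFP-0: NULL"
```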

#37 2023-01-21 19:56:39

hematinik
Member
Registered: 2022-03-23
Posts: 34

Re: [SOLVED] Forcing X server to only support monitor's native resolution

seth wrote:

The modeline that doesn't work has a pclk of 173.00, the 50 Hz one 141.50, and the default one 148.50.

Oh, now I see! Thanks indeed smile

seth wrote:

Why does wine care about the randr 1.1 protocol?

Actually it is nVidia that's the culprit (sorry, I wasn't accurate). I am not an expert, so directly from the Wine FAQ:

"nVidia's proprietary drivers for GNU/Linux intentionally do not properly implement newer versions of RandR, which Wine normally relies upon. This could present problems, particularly in software that attempts to change resolution or output to multiple monitors"

Wine works fine without the external monitor plugged in. I tried what you said (mode validation etc. with the internal monitor) and everything remained normal. The funny thing is that I have a long list of modes when I use "xrandr --q1", but that doesn't seem to affect Wine. Maybe it is not an indicator after all and I'm staring at a dead gauge, lol…
