I'm trying to follow these suggestions: Find a joint, consistent setup with inconsistently-HiDPI monitors for xorg / X11.
As part of this, I want to use my low-DPI monitor (1920x1080) to display a scaled-down "virtual" 4k HiDPI screen.
I'm attempting to do this via:
xrandr --fb 3840x2160 --output eDP-1-1 --scale 2 --panning 3840x2160
which returns
X Error of failed request: BadMatch (invalid parameter attributes)
Major opcode of failed request: 140 (RANDR)
Minor opcode of failed request: 29 (RRSetPanning)
Serial number of failed request: 48
Current serial number in output stream: 48
I'm expecting a "tiny" version of my regular desktop on my screen. But I can only see the top-left corner of a seemingly 4k desktop.
How can I fix the xrandr errors?
How can I scale and display my desktop as intended?
How can I find out what is actually going wrong?
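For context, a sketch of the full mixed-DPI layout I'm ultimately aiming for (illustrative only; DP-0 as the external 4k output and the positions are assumptions, since nothing external is connected yet):

```shell
#!/bin/sh
# Internal 1080p panel: have X render a 3840x2160 region and scale it
# down onto the 1920x1080 panel, so logical sizes match the 4k monitor.
xrandr --output eDP-1-1 --mode 1920x1080 --scale 2x2 --pos 0x0
# Hypothetical external 4k monitor at native resolution, to the right.
xrandr --output DP-0 --mode 3840x2160 --pos 3840x0
```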
For reference, further details of my setup:
I'm using a NVIDIA Quadro T2000 with the proprietary nvidia drivers, lightdm and i3.
My usual display setup is created using
xrandr --auto
indirectly via lightdm.
I tried the scaled xrandr parameters above directly (particularly before i3 is launched), but without any effect (same behavior as described above).
Xorg / X11 configuration
Section "Module"
Load "modesetting"
EndSection
Section "Device"
Identifier "nvidia"
Driver "nvidia"
BusId "PCI:1:0:0"
Option "AllowEmptyInitialConfiguration"
EndSection
xrandr output (before attempting the above)
Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767
DP-0 disconnected primary (normal left inverted right x axis y axis)
DP-1 disconnected (normal left inverted right x axis y axis)
HDMI-0 disconnected (normal left inverted right x axis y axis)
eDP-1-1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
1920x1080 60.00*+ 59.97 59.96 59.93
1680x1050 59.95 59.88
1400x1050 59.98
1600x900 59.99 59.94 59.95 59.82
1280x1024 60.02
1400x900 59.96 59.88
1280x960 60.00
1440x810 60.00 59.97
1368x768 59.88 59.85
1280x800 59.99 59.97 59.81 59.91
1280x720 60.00 59.99 59.86 59.74
1024x768 60.04 60.00
960x720 60.00
928x696 60.05
896x672 60.01
1024x576 59.95 59.96 59.90 59.82
960x600 59.93 60.00
960x540 59.96 59.99 59.63 59.82
800x600 60.00 60.32 56.25
840x525 60.01 59.88
864x486 59.92 59.57
700x525 59.98
800x450 59.95 59.82
640x512 60.02
700x450 59.96 59.88
640x480 60.00 59.94
720x405 59.51 58.99
684x384 59.88 59.85
640x400 59.88 59.98
640x360 59.86 59.83 59.84 59.32
512x384 60.00
512x288 60.00 59.92
480x270 59.63 59.82
400x300 60.32 56.34
432x243 59.92 59.57
320x240 60.05
360x202 59.51 59.13
320x180 59.84 59.32
Last edited by m8mble (2019-12-28 14:02:15)
Offline
xrandr --output eDP-1-1 --scale 2x2
You don't have to worry about panning and fb size in this context (but maybe if your setup indeed becomes more complex)
Offline
Thanks for your help.
Your suggestion results in the exact same behavior (seeing the top-left fraction of a seemingly 4k display). Admittedly, the error messages vanish though (and the command is shorter to reproduce the problem).
xrandr (without arguments) reports a 4k screen connected to eDP-1-1, but apparently the scaling doesn't "work" -- at least it visually appears this way.
Any hints or suggestions how to debug this further?
Offline
What's the actual xrandr output after the change?
Since you're using the nvidia blob, there's a viewport manager in nvidia-settings, (X Server Display Config, push the "Advanced…" button) - does that work better?
Offline
xrandr output (after calling xrandr with the arguments you suggested)
Screen 0: minimum 8 x 8, current 3840 x 2160, maximum 32767 x 32767
DP-0 disconnected primary (normal left inverted right x axis y axis)
DP-1 disconnected (normal left inverted right x axis y axis)
HDMI-0 disconnected (normal left inverted right x axis y axis)
eDP-1-1 connected 3840x2160+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
1920x1080 60.00*+ 59.97 59.96 59.93
1680x1050 59.95 59.88
1400x1050 59.98
1600x900 59.99 59.94 59.95 59.82
1280x1024 60.02
1400x900 59.96 59.88
1280x960 60.00
1440x810 60.00 59.97
1368x768 59.88 59.85
1280x800 59.99 59.97 59.81 59.91
1280x720 60.00 59.99 59.86 59.74
1024x768 60.04 60.00
960x720 60.00
928x696 60.05
896x672 60.01
1024x576 59.95 59.96 59.90 59.82
960x600 59.93 60.00
960x540 59.96 59.99 59.63 59.82
800x600 60.00 60.32 56.25
840x525 60.01 59.88
864x486 59.92 59.57
700x525 59.98
800x450 59.95 59.82
640x512 60.02
700x450 59.96 59.88
640x480 60.00 59.94
720x405 59.51 58.99
684x384 59.88 59.85
640x400 59.88 59.98
640x360 59.86 59.83 59.84 59.32
512x384 60.00
512x288 60.00 59.92
480x270 59.63 59.82
400x300 60.32 56.34
432x243 59.92 59.57
320x240 60.05
360x202 59.51 59.13
320x180 59.84 59.32
nvidia-settings complains that PRIME displays are not controlled by nvidia and should be configured by an external RandR-capable tool. (Paraphrasing, since I can't copy from said "tool".)
Offline
So this is an optimus system… which probably explains what you see.
What's the actual configuration (how do you use the optimus system? eDP-1-1 is probably wired to the intel chip?)
Offline
Yes, it's an optimus environment -- however you figured that out.
I use a Lenovo P1 with the nvidia card only (at least, that's what I intended). My config is basically the suggested setup from here: https://wiki.archlinux.org/index.php/NV … phics_only.
I don't know for sure about eDP-1-1 being hard wired to the intel chip, but I presume it to be: eDP-1-1 is the internal monitor.
Do you think scaling eDP-1-1 is not possible at all? I was assuming it would be, since all the xorg / RandR configuration is agnostic of the underlying hardware.
Last edited by m8mble (2019-12-30 08:04:07)
Offline
Probably not in this configuration, https://devtalk.nvidia.com/default/topi … lue-error/
The problem is that you're talking to the nvidia chip (and it scales the output) but the intel chip doesn't notice that (hence the corner-only display)
It should work fine when scaling the intel output (e.g. in a bumblebee setup) - and *may* work if scaling the output before setting the output source (as well)
Offline
Thanks for the clarification.
and *may* work if scaling the output before setting the output source (as well)
How do I try that?
Offline
Scale the output before calling "xrandr --setprovideroutputsource modesetting NVIDIA-0"
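i.e. roughly this ordering early in session startup (untested in this configuration; output and provider names taken from your earlier posts):

```shell
#!/bin/sh
# Scale the modesetting output first ...
xrandr --output eDP-1-1 --scale 2x2
# ... and only then attach the nvidia provider as its output source.
xrandr --setprovideroutputsource modesetting NVIDIA-0
```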
Offline
This sadly is a no: doing anything with xrandr before setprovideroutputsource fails (it crashes lightdm, resulting in an endless, uninterruptible restart loop -- great fun). Omitting setprovideroutputsource entirely also doesn't scale as hoped for.
To summarize: I've learned that scaling via optimus isn't possible. Thank you for your help figuring this out.
Do you have any suggestions on my mixed-DPI setup (internal low-DPI, external 4k)?
Last edited by m8mble (2019-12-30 11:17:27)
Offline
For the experimentation stage, I'd take lightdm out of the equation and just use startx.
Can you just "xrandr --fb 3840x2160" before setting the provider output source? (And does it help)?
If the specific optimus usage isn't mandatory, I'd just run on the intel chip and optirun games etc. on the nvidia one.
In that case scaling the output should™ not be any kind of issue.
Offline
First, I don't think it's an i3 issue
With the following .xinitrc, and startx I just get a black screen:
#!/usr/bin/env bash
xrandr --fb 3840x2160
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --output eDP-1-1 --scale 2
i3
With the alternative below, the exact same happens:
#!/usr/bin/env bash
xrandr --output eDP-1-1 --scale 2
i3
Running Intel-only would be fine, as long as I get the external 4k monitor to work at its native resolution. Is there an easier way to test this other than powering the NVIDIA chip down using bbswitch? Considering that scaling has never worked so far, do you expect it to actually work in this case?
Offline
I have the same issue as the OP and have been unable to figure this out either. I have a Thinkpad P52 with a 1080p display.
Using the Intel driver and Nvidia driver in a Bumblebee setup allows me to scale the laptop's display and run a 4K monitor at 3840x2160 using the Nvidia dGPU using intel-virtual-output. Previously I was using an xrandr command like the following after starting intel-virtual-output:
xrandr --output VIRTUAL4 --mode VIRTUAL4.545-3840x2160 --pos 2880x0 --primary \
--output eDP1 --auto --scale 1.5 --pos 0x0
This works OK, but I'd prefer to get away from Bumblebee, as I have some weird issues with it. For one, it's annoying to set modes when connected to monitors on various ports, since xrandr displays a huge list of virtual modes, most of which are not valid and do not work. Plus, starting another X server on display :8 seems like a hack.
Another reason I prefer to stay away from the Intel X driver is that it is buggy. See:
https://bugs.archlinux.org/task/64725
https://gitlab.freedesktop.org/drm/intel/issues/673
There's still no fix backported to Linux 5.4.x "stable". The modesetting driver seems much more stable and better supported in my experience at least.
Switching to Nvidia PRIME offload seems to be the officially supported option. It works nicely with the modesetting driver and a single X server on display :0. xrandr displays only the supported modes and makes it easy to switch modes. The only thing that doesn't work as I would like is scaling. I'm able to get almost the equivalent of the Bumblebee setup by running:
xrandr --output DP-0.3 --auto --primary \
--output eDP-1-1 --auto --left-of DP-0.3
Adding a --scale 1.5 option along with the pos options I had before does nothing except grow my laptop's display past the screen boundary. I've tried various --panning and --fb options but nothing seems to help. Until I figure this out, I'm just living with my laptop displaying huge fonts. I did notice that playing with the Xft.dpi property affects scaling but it affects both outputs so it does not suffice.
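For completeness, the kind of command I've been trying (a sketch; the panning geometry is one of several variants I attempted, and the positions come from my layout):

```shell
#!/bin/sh
# Attempted PRIME mixed-DPI layout: 4k external at native resolution,
# laptop panel scaled 1.5x to its right. The scale grows eDP-1-1's
# logical size to 2880x1620, but only the top-left 1920x1080 shows.
xrandr --fb 6720x2160 \
       --output DP-0.3 --mode 3840x2160 --primary --pos 0x0 \
       --output eDP-1-1 --mode 1920x1080 --scale 1.5x1.5 --pos 3840x0 \
       --panning 2880x1620+3840+0
```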
Offline
You will not be able to "stay away from the Intel" kernel module *ever* (because it drives the output, and that's what the linked bug is about; i915 is the kernel module), and xf86-video-intel (the X11 driver) is not required for bumblebee. It should™ work with the modesetting driver just fine (though you might have some xorg configlet lying around that specifically defines the intel driver and thus impedes the X11 server start; just remove that config in case)
Because of the devtalk thread linked in comment #8, and the subsequent tests here, I doubt scaling is currently possible when only using the intel chip as sink for the nvidia output
Maybe scaling the output ahead of the provider output source definition works.
Offline
Yeah, I realize I still am using the i915 kernel module no matter what, but for whatever reason the modesetting driver is more stable than the xf86-video-intel driver. I haven't hit any GPU hangs with the modesetting driver.
I did come across the devtalk thread previously but I was hoping for some kind of workaround. Looks like I'm stuck. I can use either:
1) Bumblebee w/ modesetting driver or xf86-video-intel and put up with Bumblebee weirdness and possibly GPU hangs w/ xf86-video-intel but working scale option
2) Nvidia PRIME w/ modesetting driver using the Intel chip as the sink for Nvidia output with no working scale
For now, I'm sticking with 2) and just lowering font sizes manually for terminals and programs that I tend to run on my laptop's 1080p screen.
Thanks for your help!
Offline
You could possibly use "xrandr --dpi 144" to pick a compromise DPI between the low- and high-DPI outputs (assuming here one is 96 and the other 192, but you can try different DPI values until you find a good trade-off)
NB: the change will only affect newly started processes.
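e.g. (144 is just a starting point; adjust to taste):

```shell
#!/bin/sh
# Announce a compromise DPI to newly started clients; 144 sits halfway
# between the common 96 (low-DPI) and 192 (HiDPI) values.
xrandr --dpi 144
# To make it persistent, put "Xft.dpi: 144" into ~/.Xresources
# and reload it:
xrdb -merge ~/.Xresources
```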
Offline