#1 2023-08-31 17:23:04

Boux
Member
Registered: 2023-08-31
Posts: 3

[SOLVED] Strange behavior in games with "--filter nearest" with xrandr

I have been using this command to render my games at a lower resolution while keeping big, crisp, non-blurry pixels when it's scaled up to my monitor's native resolution. It gives a cool look to retro 3D games or low-poly stuff like Minecraft or Deep Rock Galactic:

xrandr --output DP-0 --scale 0.5 --filter nearest
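
A minimal sketch of how this kind of command can be wrapped around a game launch (hypothetical helper script, assuming the same DP-0 output and 0.5 scale as above; it restores the native scale when the game exits):

#!/bin/sh
# scale down with nearest-neighbour filtering, run whatever command was passed in, then restore the native scale
xrandr --output DP-0 --scale 0.5 --filter nearest
"$@"
xrandr --output DP-0 --scale 1

Saved as e.g. scale-run.sh, something like "/path/to/scale-run.sh %command%" could then be used as a Steam launch option.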

But recently it hasn't been working: it either freezes my games or makes them so slow that they're unresponsive for multiple seconds at a time; even tabbing out of my game takes 15-20 seconds. Some games just crash instantly when I go fullscreen.

I am using KDE + i3wm, and here's the weird part: it only happens when my game is fullscreen (but not exactly).

  • If the game is running in windowed mode and I hit my keyboard shortcut to make it fullscreen (using i3wm), it freezes

  • If the game thinks it's running fullscreen (in the game options), but I manually make it a tiled or floating window before scaling down my resolution, it DOESN'T freeze

  • If for some reason the game is running fullscreen but not exactly at my monitor's native resolution (this happens if I execute the xrandr command while the game is already fullscreen: the game stays fullscreen but overflows my monitor's boundaries), THE GAME DOES NOT FREEZE. When I hit my keyboard shortcut to toggle fullscreen, it goes out of fullscreen into a tiled window (the game options still say it's fullscreen) and there is still no freeze; if I hit it again, it goes back to fullscreen, this time fitting my monitor's boundaries, and now the freeze happens


I recently reinstalled Arch (about 2 months ago). I had originally installed it with the archinstall script, but this time I installed it manually. I've had the issue since reinstalling, though that may not be what triggered it: I hadn't tried running a game like this in 5-6 months. Back then I was doing it a lot and it worked great (both on Arch installed with archinstall and on Manjaro a long time ago).

I don't know how to start diagnosing this issue, or whether there's a package I forgot to install, but here's what I've tried or noted so far:

  • I tried different DEs; I'm currently running KDE with i3 replacing kwin. I tried vanilla i3 and vanilla KDE, and the exact same issue happens.

  • I tried with both Windows games running on Proton and native games such as CSGO, same issue

  • I've used different scaling values, such as 0.333 and 0.25, same issue

  • I disabled my compositor (picom) before running the game, same issue

  • If I scale down my resolution, WITHOUT nearest filtering, there is no freeze or any lag whatsoever, but it's blurry as all hell

    xrandr --output DP-0 --scale 0.5
  • Going back to my native resolution fixes the issue; the game even becomes responsive again without needing a restart

    xrandr --output DP-0 --scale 1
  • CPU, GPU or RAM usage does not spike at all during the freeze

  • I tried replicating the issue with a non-game (GOverlay's vkBasalt test program, the spinning cube), and it's a different issue there: the program is usually vsynced, but when I fullscreen it, it runs at 3000 fps (only with --filter nearest; it stays vsynced when fullscreen without the filter).

  • I tried with emulators (N64, PS2 and Switch), and there is no freeze, games run perfectly.

EDIT: I forgot to mention
This seems to only affect games running through Steam; I have a few games on Lutris and none of them seem to freeze. I did try disabling the Steam overlay and it still froze.

I am using Nvidia, which could be part of the issue. Here's my full hardware info just in case; let me know if you need more info or the output of some other command, though I might take a while to respond because I'm heading to work. :)

> inxi -Fazy

System:
  Kernel: 6.4.12-arch1-1 arch: x86_64 bits: 64 compiler: gcc v: 13.2.1
    clocksource: tsc available: hpet,acpi_pm
    parameters: BOOT_IMAGE=/boot/vmlinuz-linux
    root=UUID=fc8f6751-36f9-4d69-85bc-82fa28fa76d3 rw nvidia-drm.modeset=1
    udev.log_priority=3
  Desktop: KDE Plasma v: 5.27.7 tk: Qt v: 5.15.10 wm: i3 v: 4.22 vt: 2
    dm: SDDM Distro: Arch Linux
Machine:
  Type: Desktop Mobo: ASUSTeK model: PRIME X570-P v: Rev X.0x
    serial: <superuser required> UEFI: American Megatrends v: 1201
    date: 09/09/2019
CPU:
  Info: model: AMD Ryzen 7 3800X bits: 64 type: MT MCP arch: Zen 2 gen: 3
    level: v3 note: check built: 2020-22 process: TSMC n7 (7nm) family: 0x17 (23)
    model-id: 0x71 (113) stepping: 0 microcode: 0x8701013
  Topology: cpus: 1x cores: 8 tpc: 2 threads: 16 smt: enabled cache:
    L1: 512 KiB desc: d-8x32 KiB; i-8x32 KiB L2: 4 MiB desc: 8x512 KiB L3: 32 MiB
    desc: 2x16 MiB
  Speed (MHz): avg: 2281 high: 2795 min/max: 2200/4559 boost: enabled
    scaling: driver: acpi-cpufreq governor: ondemand cores: 1: 2196 2: 2195
    3: 2196 4: 2200 5: 2195 6: 2795 7: 2195 8: 2196 9: 2196 10: 2200 11: 2364
    12: 2200 13: 2196 14: 2794 15: 2196 16: 2195 bogomips: 124621
  Flags: avx avx2 ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 sse4a ssse3 svm
  Vulnerabilities:
  Type: gather_data_sampling status: Not affected
  Type: itlb_multihit status: Not affected
  Type: l1tf status: Not affected
  Type: mds status: Not affected
  Type: meltdown status: Not affected
  Type: mmio_stale_data status: Not affected
  Type: retbleed mitigation: untrained return thunk; SMT enabled with STIBP
    protection
  Type: spec_rstack_overflow mitigation: safe RET
  Type: spec_store_bypass mitigation: Speculative Store Bypass disabled via
    prctl
  Type: spectre_v1 mitigation: usercopy/swapgs barriers and __user pointer
    sanitization
  Type: spectre_v2 mitigation: Retpolines, IBPB: conditional, STIBP:
    always-on, RSB filling, PBRSB-eIBRS: Not affected
  Type: srbds status: Not affected
  Type: tsx_async_abort status: Not affected
Graphics:
  Device-1: NVIDIA TU104 [GeForce RTX 2070 SUPER] vendor: eVga.com.
    driver: nvidia v: 535.104.05 alternate: nouveau,nvidia_drm non-free: 535.xx+
    status: current (as of 2023-08) arch: Turing code: TUxxx
    process: TSMC 12nm FF built: 2018-22 pcie: gen: 2 speed: 5 GT/s lanes: 16
    link-max: gen: 3 speed: 8 GT/s ports: active: none off: DP-1 empty: DP-2,
    DP-3, HDMI-A-1, Unknown-1 bus-ID: 08:00.0 chip-ID: 10de:1e84 class-ID: 0300
  Display: x11 server: X.Org v: 21.1.8 with: Xwayland v: 23.2.0
    compositor: Picom v: git-b700a driver: X: loaded: nvidia
    gpu: nvidia,nvidia-nvswitch display-ID: :0 screens: 1
  Screen-1: 0 s-res: 1920x1080 s-dpi: 91 s-size: 533x300mm (20.98x11.81")
    s-diag: 612mm (24.08")
  Monitor-1: DP-1 mapped: DP-0 note: disabled model: XG248Q serial: <filter>
    built: 2019 res: 1920x1080 dpi: 93 gamma: 1.2 size: 527x296mm (20.75x11.65")
    diag: 604mm (23.8") ratio: 16:9 modes: max: 1920x1080 min: 640x480
  API: OpenGL v: 4.6.0 NVIDIA 535.104.05 renderer: NVIDIA GeForce RTX 2070
    SUPER/PCIe/SSE2 direct-render: Yes
Audio:
  Device-1: NVIDIA TU104 HD Audio vendor: eVga.com. driver: snd_hda_intel
    v: kernel pcie: gen: 3 speed: 8 GT/s lanes: 16 bus-ID: 08:00.1
    chip-ID: 10de:10f8 class-ID: 0403
  Device-2: AMD Starship/Matisse HD Audio vendor: ASUSTeK
    driver: snd_hda_intel v: kernel pcie: gen: 4 speed: 16 GT/s lanes: 16
    bus-ID: 0a:00.4 chip-ID: 1022:1487 class-ID: 0403
  API: ALSA v: k6.4.12-arch1-1 status: kernel-api tools: N/A
  Server-1: sndiod v: N/A status: off tools: aucat,midicat,sndioctl
  Server-2: PipeWire v: 0.3.78 status: active with: 1: pipewire-pulse
    status: active 2: wireplumber status: active 3: pipewire-alsa type: plugin
    4: pw-jack type: plugin tools: pactl,pw-cat,pw-cli,wpctl
Network:
  Device-1: Realtek RTL8111/8168/8411 PCI Express Gigabit Ethernet
    vendor: ASUSTeK PRIME B450M-A driver: r8169 v: kernel pcie: gen: 1
    speed: 2.5 GT/s lanes: 1 port: f000 bus-ID: 04:00.0 chip-ID: 10ec:8168
    class-ID: 0200
  IF: enp4s0 state: up speed: 1000 Mbps duplex: full mac: <filter>
  IF-ID-1: wg-mullvad state: unknown speed: N/A duplex: N/A mac: N/A
Drives:
  Local Storage: total: 953.87 GiB used: 662.66 GiB (69.5%)
  SMART Message: Unable to run smartctl. Root privileges required.
  ID-1: /dev/nvme0n1 maj-min: 259:0 vendor: Intel model: SSDPEKNW010T8
    size: 953.87 GiB block-size: physical: 512 B logical: 512 B speed: 31.6 Gb/s
    lanes: 4 tech: SSD serial: <filter> fw-rev: 002C temp: 37.9 C scheme: GPT
Partition:
  ID-1: / raw-size: 940.27 GiB size: 924.44 GiB (98.32%)
    used: 662.57 GiB (71.7%) fs: ext4 dev: /dev/nvme0n1p4 maj-min: 259:4
  ID-2: /boot/efi raw-size: 99 MiB size: 95 MiB (95.96%)
    used: 95 MiB (100.0%) fs: vfat dev: /dev/nvme0n1p2 maj-min: 259:2
Swap:
  Kernel: swappiness: 60 (default) cache-pressure: 100 (default) zswap: yes
    compressor: zstd max-pool: 20%
  ID-1: swap-1 type: partition size: 12.97 GiB used: 576 KiB (0.0%)
    priority: -2 dev: /dev/nvme0n1p6 maj-min: 259:5
Sensors:
  System Temperatures: cpu: 38.0 C mobo: 35.0 C gpu: nvidia temp: 40 C
  Fan Speeds (rpm): fan-1: 857 fan-2: 1173 fan-3: 837 fan-4: 884 fan-5: 2842
    fan-6: 1706 fan-7: 0 gpu: nvidia fan: 0%
Info:
  Processes: 329 Uptime: 10h 47m wakeups: 2 Memory: total: 16 GiB
  available: 15.53 GiB used: 3.86 GiB (24.9%) Init: systemd v: 254
  default: graphical tool: systemctl Compilers: gcc: 13.2.1 Packages: 1294
  pm: pacman pkgs: 1260 libs: 376 tools: yay pm: flatpak pkgs: 34 Shell: Zsh
  v: 5.9 default: Bash v: 5.1.16 running-in: konsole inxi: 3.3.29

Last edited by Boux (2023-08-31 20:41:23)

#2 2023-08-31 18:18:59

seth
Member
Registered: 2012-09-03
Posts: 60,683

Re: [SOLVED] Strange behavior in games with "--filter nearest" with xrandr

https://wiki.archlinux.org/title/NVIDIA … en_tearing
Ignore the context; see whether the performance is better when enforcing the composition pipeline.
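
For reference, the wiki's approach boils down to assigning a MetaMode with the pipeline forced on, roughly like this (a sketch; the exact output name and mode string depend on the setup):

nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"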

#3 2023-08-31 19:08:21

Boux
Member
Registered: 2023-08-31
Posts: 3

Re: [SOLVED] Strange behavior in games with "--filter nearest" with xrandr

seth wrote:

https://wiki.archlinux.org/title/NVIDIA … en_tearing
Ignore the context; see whether the performance is better when enforcing the composition pipeline.

I have tried launching a game (Elden Ring) with this setting and it froze instantly, even without the scaling shenanigans:

nvidia-settings -a CurrentMetaMode="DP-0: 1920x1080_240 {ForceCompositionPipeline=On, ForceFullCompositionPipeline=On}"

I put it back to what it was before and launched the same game and it ran perfectly

nvidia-settings -a CurrentMetaMode="DP-0: 1920x1080_240 {ForceCompositionPipeline=Off, ForceFullCompositionPipeline=Off}"

It freezes in the exact same way that it did with --scale 0.5 and --filter nearest, so I'm guessing it's the same issue

Last edited by Boux (2023-08-31 19:09:40)

#4 2023-08-31 19:37:11

seth
Member
Registered: 2012-09-03
Posts: 60,683

Re: [SOLVED] Strange behavior in games with "--filter nearest" with xrandr

Purely speculative:
/etc/X11/xorg.conf.d/20-nvidia.conf

Section "Device"
    Identifier "My Nvidia GPU"
    Driver  "nvidia"
    Option  "TripleBuffer"          "True"
    Option  "UseNvKmsCompositionPipeline" "false" # https://devtalk.nvidia.com/default/topic/1029484/-various-all-distros-numerous-performance-amp-rendering-issues-on-390-25/?offset=110#reply
EndSection

The critical value is "UseNvKmsCompositionPipeline", but I don't know whether the key is still supported or will have an impact.

Edit: you'll obviously have to restart the X11 server and check the log to see whether the option is accepted at all.
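
For example, with a display manager the server log usually ends up in /var/log/Xorg.0.log (a rootless server may instead log under ~/.local/share/xorg/), so something along these lines should show whether the option was picked up:

grep -i "UseNvKmsCompositionPipeline" /var/log/Xorg.0.log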

Last edited by seth (2023-08-31 19:37:43)

#5 2023-08-31 20:39:14

Boux
Member
Registered: 2023-08-31
Posts: 3

Re: [SOLVED] Strange behavior in games with "--filter nearest" with xrandr

seth wrote:

The critical value is "UseNvKmsCompositionPipeline", but I don't know whether the key is still supported or will have an impact.

Oh my god it works. Thanks man, glad I asked
