#1 2025-06-18 20:13:11

lsacienne
Member
Registered: 2025-06-18
Posts: 3

[SOLVED] Hybrid GPU on Arch Linux not working as expected

Hi everyone,

After several years using Manjaro Linux, I finally decided to switch to Arch Linux. I chose Arch because I love the package manager and the DIY experience, which helps me understand operating system theory and Linux in particular. To be honest, I really enjoy spending time debugging my machine, and in most cases I find a solution within a few days.

However, I installed Arch Linux a month ago and I still cannot figure out how to make my hybrid GPU configuration work. I am facing a really strange problem: in some cases I can run vkcube normally, but in other cases it just hangs without starting. Most of the time, when the OS boots, I can use vkcube, but after a while it simply stops working. The same issue affects launching vscode or zeditor. I have read all of the NVIDIA, NVIDIA Optimus, PRIME, Hybrid graphics, and even Bumblebee wiki pages, but I still cannot find a way to use my two GPUs normally in a hybrid configuration.
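To be concrete about what "using vkcube" means here, this is the kind of test I run (a minimal sketch; it assumes vkcube from vulkan-tools and the prime-run wrapper shipped by the nvidia-prime package, both of which are installed):

```shell
# Render on the default (integrated) GPU
vkcube

# Offload rendering to the NVIDIA dGPU via PRIME render offload;
# prime-run is a small wrapper script from the nvidia-prime package
prime-run vkcube

# prime-run is roughly equivalent to setting these variables by hand
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia \
__VK_LAYER_NV_optimus=NVIDIA_only vkcube
```

When the problem occurs, these commands hang instead of opening the spinning-cube window.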

Here is my configuration:

  • PC Model: LAPTOP Predator Triton 300 SE (Acer Predator PT314-51s)

  • Integrated GPU: TigerLake-LP GT2 [Iris Xe Graphics]

  • Dedicated GPU: GA106M [GeForce RTX 3060 Mobile / Max-Q]

  • Kernel: Linux 6.15.2-arch1-1

  • DE: Gnome with Wayland (Mutter compositor)
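For context, both GPUs are visible to the system; a quick way to confirm which devices the kernel and Vulkan can see (standard commands, nothing exotic):

```shell
# List the VGA/3D controllers on the PCI bus
lspci | grep -Ei 'vga|3d'

# Summarise the Vulkan-visible devices (from vulkan-tools)
vulkaninfo --summary
```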

Here is the output of pacman -Qs intel:

local/intel-compute-runtime 25.18.33578.6-1
    Intel(R) Graphics Compute Runtime for oneAPI Level Zero and OpenCL(TM)
    Driver
local/intel-gmmlib 22.7.2-1
    Intel Graphics Memory Management Library
local/intel-graphics-compiler 1:2.11.7-1
    Intel Graphics Compiler for OpenCL
local/intel-media-driver 25.2.3-1
    Intel Media Driver for VAAPI — Broadwell+ iGPUs
local/intel-oneapi-common 2024.1.0-1
    Intel's oneAPI common variables and licensing
local/intel-oneapi-compiler-dpcpp-cpp-runtime-libs 2025.0.4-1
    Intel oneAPI Data Parallel C++ compiler: Minimal runtime libraries
local/intel-oneapi-compiler-shared-runtime-libs 2025.0.4-2
    Intel oneAPI compiler runtime libraries: Minimal compiler libraries
local/intel-oneapi-openmp 2025.0.4-1
    Intel oneAPI OpenMP runtime library
local/intel-oneapi-tbb 2021.12.0-2
    Intel oneAPI Threading Building Blocks
local/intel-oneapi-tcm 1.2.0-3
    Thread Composability Manager
local/intel-ucode 20250512-1
    Microcode update files for Intel CPUs
local/libmfx 23.2.2-4
    Intel Media SDK dispatcher library
local/libvpl 2.15.0-1
    Intel Video Processing Library
local/nvtop 3.2.0-1
    GPUs process monitoring for AMD, Intel and NVIDIA
local/onetbb 2022.1.0-1
    oneAPI Threading Building Blocks - a high level abstract threading library
local/openimagedenoise 2.3.3-2
    Intel(R) Open Image Denoise library
local/openpgl 0.7.0-1
    Intel Open Path Guiding Library
local/vulkan-intel 1:25.1.3-3
    Open-source Vulkan driver for Intel GPUs

Here is the output of pacman -Qs nvidia:

local/egl-gbm 1.1.2.1-1
    The GBM EGL external platform library
local/egl-wayland 4:1.1.19-1
    EGLStream-based Wayland external platform
local/egl-x11 1.0.2-1
    NVIDIA XLib and XCB EGL Platform Library
local/envycontrol 3.5.2-1
    CLI tool for Nvidia Optimus graphics mode switching on Linux
local/lib32-nvidia-utils 575.57.08-1
    NVIDIA drivers utilities (32-bit)
local/libvdpau 1.5-3
    Nvidia VDPAU library
local/libxnvctrl 575.57.08-1
    NVIDIA NV-CONTROL X extension
local/nvidia-open 575.57.08-5
    NVIDIA open kernel modules
local/nvidia-prime 1.0-5
    NVIDIA Prime Render Offload configuration and utilities
local/nvidia-settings 575.57.08-1
    Tool for configuring the NVIDIA graphics driver
local/nvidia-utils 575.57.08-3
    NVIDIA drivers utilities
local/nvtop 3.2.0-1
    GPUs process monitoring for AMD, Intel and NVIDIA

Here are some of my logs from around the moment vkcube stopped launching:

juin 18 21:20:02 predator systemd[1250]: Started Application launched by gnome-shell.
juin 18 21:20:03 predator systemd[1]: Starting Time & Date Service...
juin 18 21:20:03 predator systemd[1]: Started Time & Date Service.
juin 18 21:20:03 predator rtkit-daemon[832]: Supervising 7 threads of 4 processes of 1 users.
juin 18 21:20:03 predator rtkit-daemon[832]: Supervising 7 threads of 4 processes of 1 users.
juin 18 21:20:03 predator rtkit-daemon[832]: Supervising 7 threads of 4 processes of 1 users.
juin 18 21:20:03 predator rtkit-daemon[832]: Supervising 7 threads of 4 processes of 1 users.
juin 18 21:20:03 predator rtkit-daemon[832]: Successfully made thread 2202 of process 2066 owned by '1000' RT at priority 10.
juin 18 21:20:03 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:03 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:03 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:03 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:03 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:03 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:03 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:04 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:04 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:04 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:04 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:04 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:04 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:05 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:05 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:20:05 predator rtkit-daemon[832]: Successfully made thread 2492 of process 2432 owned by '1000' RT at priority 10.
juin 18 21:20:05 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:20:08 predator systemd[1]: systemd-localed.service: Deactivated successfully.
juin 18 21:20:08 predator systemd[1]: systemd-hostnamed.service: Deactivated successfully.
juin 18 21:20:23 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:20:23 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:20:24 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:20:24 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:20:25 predator wpa_supplicant[696]: wlan0: CTRL-EVENT-SIGNAL-CHANGE above=1 signal=-54 noise=9999 txrate=866700
juin 18 21:20:32 predator geoclue[1098]: Service not used for 60 seconds. Shutting down..
juin 18 21:20:32 predator systemd[1]: geoclue.service: Deactivated successfully.
juin 18 21:20:33 predator systemd[1]: systemd-timedated.service: Deactivated successfully.
juin 18 21:20:45 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:20:45 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:20:45 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:20:45 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:20:50 predator firefox.desktop[2066]: (as)
juin 18 21:20:50 predator gnome-shell[1350]: Received error from D-Bus search provider org.gnome.Boxes.desktop: Gio.IOErrorEnum: Impossible d’appeler la méthode ; le serveur est mandataire d’un nom connu org.gn>
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.Calculator.SearchProvider.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.Calendar.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.Characters.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.Contacts.SearchProvider.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.Epiphany.SearchProvider.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.Games.SearchProvider.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.Nautilus.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.Notes.SearchProvider.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.Photos.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.Recipes.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.Settings.SearchProvider.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.clocks.
juin 18 21:20:50 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.seahorse.Application.
juin 18 21:20:50 predator systemd[1250]: Started dbus-:1.2-org.gnome.Calculator.SearchProvider@0.service.
juin 18 21:20:50 predator systemd[1250]: Started dbus-:1.2-org.gnome.Calendar@0.service.
juin 18 21:20:50 predator systemd[1250]: Started dbus-:1.2-org.gnome.Characters@0.service.
juin 18 21:20:50 predator systemd[1250]: Started dbus-:1.2-org.gnome.Contacts.SearchProvider@0.service.
juin 18 21:20:50 predator systemd[1250]: Started dbus-:1.2-org.gnome.Epiphany.SearchProvider@0.service.
juin 18 21:20:51 predator systemd[1250]: Started dbus-:1.2-org.gnome.Games.SearchProvider@0.service.
juin 18 21:20:51 predator systemd[1250]: Started dbus-:1.2-org.gnome.Nautilus@0.service.
juin 18 21:20:51 predator systemd[1250]: Started dbus-:1.2-org.gnome.Notes.SearchProvider@0.service.
juin 18 21:20:51 predator systemd[1250]: Started dbus-:1.2-org.gnome.Photos@0.service.
juin 18 21:20:51 predator systemd[1250]: Started dbus-:1.2-org.gnome.Recipes@0.service.
juin 18 21:20:51 predator systemd[1250]: Started dbus-:1.2-org.gnome.Settings.SearchProvider@0.service.
juin 18 21:20:51 predator systemd[1250]: Started dbus-:1.2-org.gnome.clocks@0.service.
juin 18 21:20:51 predator systemd[1250]: Started dbus-:1.2-org.gnome.seahorse.Application@0.service.
juin 18 21:20:51 predator gnome-games-sea[2808]: search-provider.vala:213: Couldn’t open the database for /home/lsacienne/.local/share/gnome-games/database.sqlite3
juin 18 21:20:51 predator gnome-shell[1350]: Received error from D-Bus search provider org.gnome.Boxes.desktop: Gio.IOErrorEnum: Impossible d’appeler la méthode ; le serveur est mandataire d’un nom connu org.gn>
juin 18 21:20:51 predator firefox.desktop[2066]: (asas)
juin 18 21:20:51 predator systemd[1]: Starting Time & Date Service...
juin 18 21:20:51 predator nautilus[2809]: Connecting to org.freedesktop.Tracker3.Miner.Files
juin 18 21:20:51 predator systemd[1250]: Starting Bluetooth OBEX service...
juin 18 21:20:51 predator obexd[2902]: OBEX daemon 5.83
juin 18 21:20:51 predator systemd[1250]: Started Bluetooth OBEX service.
juin 18 21:20:51 predator systemd[1]: Started Time & Date Service.
juin 18 21:20:51 predator firefox.desktop[2066]: (asas)
juin 18 21:20:51 predator gnome-shell[1350]: Received error from D-Bus search provider org.gnome.Boxes.desktop: Gio.IOErrorEnum: Impossible d’appeler la méthode ; le serveur est mandataire d’un nom connu org.gn>
juin 18 21:20:51 predator systemd[1250]: Started dbus-:1.2-org.gnome.Recipes@1.service.
juin 18 21:20:51 predator gnome-character[2805]: JS LOG: Characters Application started
juin 18 21:20:51 predator gnome-calendar[2804]: Running Calendar as a service
juin 18 21:20:51 predator systemd[1250]: Created slice User Background Tasks Slice.
juin 18 21:20:51 predator systemd[1250]: Starting Tracker file system data miner...
juin 18 21:20:51 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-org.gnome.NautilusPreviewer.
juin 18 21:20:51 predator systemd[1250]: Started dbus-:1.2-org.gnome.NautilusPreviewer@0.service.
juin 18 21:20:51 predator systemd[1250]: Started GnuPG public key management service.
juin 18 21:20:51 predator keyboxd[2981]: keyboxd (GnuPG) 2.4.7 starting in supervised mode.
juin 18 21:20:51 predator keyboxd[2981]: database version: 1
juin 18 21:20:51 predator keyboxd[2981]: database created: 2025-05-18 15:07:24
juin 18 21:20:51 predator keyboxd[2981]: database '/home/lsacienne/.gnupg/public-keys.d/pubring.db' created
juin 18 21:20:51 predator systemd[1250]: Started GnuPG cryptographic agent and passphrase cache.
juin 18 21:20:51 predator gnome-shell[1350]: Received error from D-Bus search provider org.gnome.Recipes.desktop during GetResultMetas: Gio.DBusError: GDBus.Error:org.freedesktop.DBus.Error.NoReply: Remote peer>
juin 18 21:20:51 predator gpg-agent[3005]: gpg-agent (GnuPG) 2.4.7 starting in supervised mode.
juin 18 21:20:51 predator gpg-agent[3005]: using fd 3 for ssh socket (/run/user/1000/gnupg/S.gpg-agent.ssh)
juin 18 21:20:51 predator gpg-agent[3005]: using fd 4 for extra socket (/run/user/1000/gnupg/S.gpg-agent.extra)
juin 18 21:20:51 predator gpg-agent[3005]: using fd 5 for browser socket (/run/user/1000/gnupg/S.gpg-agent.browser)
juin 18 21:20:51 predator gpg-agent[3005]: using fd 6 for std socket (/run/user/1000/gnupg/S.gpg-agent)
juin 18 21:20:51 predator gpg-agent[3005]: listening on: std=6 extra=4 browser=5 ssh=3
juin 18 21:20:51 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:20:51 predator rtkit-daemon[832]: Failed to look up client: No such file or directory
juin 18 21:20:51 predator systemd[1250]: Started Tracker file system data miner.
juin 18 21:20:51 predator systemd[1250]: Created slice Slice /app/dbus-:1.2-com.intel.dleyna\x2drenderer.
juin 18 21:20:51 predator systemd[1250]: Started dbus-:1.2-com.intel.dleyna\x2drenderer@0.service.
juin 18 21:20:51 predator dleyna-renderer-service[3035]: dLeyna core version 0.8.3
juin 18 21:20:51 predator dleyna-renderer-service[3035]: dleyna-renderer-service version 0.8.3
juin 18 21:20:51 predator kernel: warning: `dleyna-renderer' uses wireless extensions which will stop working for Wi-Fi 7 hardware; use nl80211
juin 18 21:20:51 predator systemd[1250]: Started Application launched by gnome-shell.
juin 18 21:20:52 predator dleyna-renderer-service[3035]: dLeyna: Exit
juin 18 21:21:01 predator gnome-character[2805]: JS LOG: Characters Application exiting
juin 18 21:21:01 predator systemd[1250]: dbus-:1.2-org.gnome.NautilusPreviewer@0.service: Consumed 300ms CPU time, 66.8M memory peak.
juin 18 21:21:03 predator bijiben-shell-search-provider[2810]: Unable to load location /home/lsacienne/.local/share/bijiben: Error opening directory '/home/lsacienne/.local/share/bijiben': Aucun fichier ou doss>
juin 18 21:21:11 predator gnome-keyring-daemon[1266]: asked to register item /org/freedesktop/secrets/collection/login/2, but it's already registered
juin 18 21:21:11 predator gnome-keyring-d[1266]: asked to register item /org/freedesktop/secrets/collection/login/2, but it's already registered
juin 18 21:21:15 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:21:15 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:21:15 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:21:15 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:21:15 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:21:15 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:21:16 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:21:16 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:21:21 predator systemd[1]: systemd-timedated.service: Deactivated successfully.
juin 18 21:21:35 predator iwd[546]: event: connect-info, ssid: SFR_164E, bss: b0:b3:69:9d:16:51, signal: -50, load: 0/255
juin 18 21:21:35 predator iwd[546]: event: state, old: autoconnect_full, new: connecting (auto)
juin 18 21:21:35 predator iwd[546]: event: connect-failed, status: 1
juin 18 21:21:35 predator iwd[546]: event: connect-info, ssid: SFR_164E, bss: b0:b3:69:9d:16:50, signal: -42, load: 0/255
juin 18 21:21:35 predator iwd[546]: event: connect-failed, status: 1
juin 18 21:21:35 predator iwd[546]: event: state, old: connecting (auto), new: disconnected
juin 18 21:21:35 predator iwd[546]: event: state, old: disconnected, new: autoconnect_full
juin 18 21:21:35 predator wpa_supplicant[696]: wlan0: CTRL-EVENT-SIGNAL-CHANGE above=1 signal=-53 noise=9999 txrate=866700
juin 18 21:21:41 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:21:41 predator rtkit-daemon[832]: Supervising 9 threads of 6 processes of 1 users.
juin 18 21:21:51 predator epiphany-search[2807]: broker.vala:159: Error loading plugin: libvoikko.so.1: Ne peut ouvrir le fichier d'objet partagé: Aucun fichier ou dossier de ce nom
juin 18 21:21:51 predator epiphany-search[2807]: broker.vala:159: Error loading plugin: libhspell.so.0: Ne peut ouvrir le fichier d'objet partagé: Aucun fichier ou dossier de ce nom
juin 18 21:21:51 predator epiphany-search[2807]: broker.vala:159: Error loading plugin: libnuspell.so.5: Ne peut ouvrir le fichier d'objet partagé: Aucun fichier ou dossier de ce nom
juin 18 21:21:51 predator epiphany-search[2807]: broker.vala:159: Error loading plugin: libaspell.so.15: Ne peut ouvrir le fichier d'objet partagé: Aucun fichier ou dossier de ce nom
juin 18 21:21:51 predator systemd[1250]: dbus-:1.2-org.gnome.Epiphany.SearchProvider@0.service: Consumed 432ms CPU time, 87.4M memory peak.
juin 18 21:22:16 predator kernel: acer_wmi: Unknown function number - 9 - 0
juin 18 21:22:17 predator kernel: NVRM: gpuWaitForGfwBootComplete_TU102: failed to wait for GFW_BOOT: (progress 0x9)
juin 18 21:22:17 predator kernel: NVRM: kgspWaitForGfwBootOk_TU102: failed to wait for GFW boot complete: 0x55 VBIOS version 94.06.16.00.5E
juin 18 21:22:17 predator kernel: NVRM: kgspWaitForGfwBootOk_TU102: (the GPU may be in a bad state and may need to be reset)
juin 18 21:22:23 predator kernel: NVRM: _kgspLogXid119: ********************************* GSP Timeout **********************************
juin 18 21:22:23 predator kernel: NVRM: _kgspLogXid119: Note: Please also check logs above.
juin 18 21:22:23 predator kernel: NVRM: GPU at PCI:0000:01:00: GPU-f451892d-a150-9928-a04a-9712ec218877
juin 18 21:22:23 predator kernel: NVRM: Xid (PCI:0000:01:00): 119, pid=117, name=kworker/7:1, Timeout after 6s of waiting for RPC response from GPU0 GSP! Expected function 76 (GSP_RM_CONTROL) (0x2080205b 0x4).
juin 18 21:22:23 predator kernel: NVRM: GPU0 GSP RPC buffer contains function 76 (GSP_RM_CONTROL) and data 0x000000002080205b 0x0000000000000004.
juin 18 21:22:23 predator kernel: NVRM: GPU0 RPC history (CPU -> GSP):
juin 18 21:22:23 predator kernel: NVRM:     entry function                   data0              data1              ts_start           ts_end             duration actively_polling
juin 18 21:22:23 predator kernel: NVRM:      0    76   GSP_RM_CONTROL        0x000000002080205b 0x0000000000000004 0x000637dd8c134044 0x0000000000000000          y
juin 18 21:22:23 predator kernel: NVRM:     -1    47   UNLOADING_GUEST_DRIVE 0x0000000000000000 0x0000000000000000 0x000637dd8bde8f7e 0x000637dd8be3bd43 339397us  
juin 18 21:22:23 predator kernel: NVRM:     -2    10   FREE                  0x00000000c1e0007d 0x0000000000000000 0x000637dd8bde8d13 0x000637dd8bde8f58    581us  
juin 18 21:22:23 predator kernel: NVRM:     -3    10   FREE                  0x000000000000000b 0x0000000000000000 0x000637dd8bde8a59 0x000637dd8bde8d11    696us  
juin 18 21:22:23 predator kernel: NVRM:     -4    10   FREE                  0x000000000000000c 0x0000000000000000 0x000637dd8bde8822 0x000637dd8bde89b2    400us  
juin 18 21:22:23 predator kernel: NVRM:     -5    10   FREE                  0x0000000000000006 0x0000000000000000 0x000637dd8bde858f 0x000637dd8bde881c    653us  
juin 18 21:22:23 predator kernel: NVRM:     -6    10   FREE                  0x000000000000000a 0x0000000000000000 0x000637dd8bde7f94 0x000637dd8bde858a   1526us  
juin 18 21:22:23 predator kernel: NVRM:     -7    10   FREE                  0x0000000000000002 0x0000000000000000 0x000637dd8bde7107 0x000637dd8bde7e9e   3479us  
juin 18 21:22:23 predator kernel: NVRM: GPU0 RPC event history (CPU <- GSP):
juin 18 21:22:23 predator kernel: NVRM:     entry function                   data0              data1              ts_start           ts_end             duration during_incomplete_rpc
juin 18 21:22:23 predator kernel: NVRM:      0    4108 UCODE_LIBOS_PRINT     0x0000000000000000 0x0000000000000000 0x000637dd8be0faf8 0x000637dd8be0faf9      1us  
juin 18 21:22:23 predator kernel: NVRM:     -1    4128 GSP_POST_NOCAT_RECORD 0x0000000000000002 0x0000000000000028 0x000637dd8bdee255 0x000637dd8bdee258      3us  
juin 18 21:22:23 predator kernel: NVRM:     -2    4111 PERF_BRIDGELESS_INFO_ 0x0000000000000000 0x0000000000000000 0x000637dd8bdee0c6 0x000637dd8bdee0c6           
juin 18 21:22:23 predator kernel: NVRM:     -3    4108 UCODE_LIBOS_PRINT     0x0000000000000000 0x0000000000000000 0x000637dd8aa554fc 0x000637dd8aa554fc           
juin 18 21:22:23 predator kernel: NVRM:     -4    4108 UCODE_LIBOS_PRINT     0x0000000000000000 0x0000000000000000 0x000637dd8aa5539a 0x000637dd8aa5539b      1us  
juin 18 21:22:23 predator kernel: NVRM:     -5    4128 GSP_POST_NOCAT_RECORD 0x0000000000000002 0x0000000000000027 0x000637dd8aa5412a 0x000637dd8aa5412c      2us  
juin 18 21:22:23 predator kernel: NVRM:     -6    4098 GSP_RUN_CPU_SEQUENCER 0x000000000000061c 0x0000000000003fe2 0x000637dd8aa4847c 0x000637dd8aa497c4   4936us  
juin 18 21:22:23 predator kernel: NVRM:     -7    4108 UCODE_LIBOS_PRINT     0x0000000000000000 0x0000000000000000 0x000637dd89870f20 0x000637dd89870f21      1us  
juin 18 21:22:23 predator kernel: CPU: 7 UID: 0 PID: 117 Comm: kworker/7:1 Tainted: G     U     OE       6.15.2-arch1-1 #1 PREEMPT(full)  806378c57c3c21a60e39b7d20019ada706b7af8b
juin 18 21:22:23 predator kernel: Tainted: [U]=USER, [O]=OOT_MODULE, [E]=UNSIGNED_MODULE
juin 18 21:22:23 predator kernel: Hardware name: Acer Predator PT314-51s/Clubman_TLM, BIOS V1.10 12/01/2021
juin 18 21:22:23 predator kernel: Workqueue: kacpi_notify acpi_os_execute_deferred
juin 18 21:22:23 predator kernel: Call Trace:
juin 18 21:22:23 predator kernel:  <TASK>
juin 18 21:22:23 predator kernel:  dump_stack_lvl+0x5d/0x80
juin 18 21:22:23 predator kernel:  _kgspRpcRecvPoll+0x5b5/0x800 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  _issueRpcAndWait+0xc2/0x920 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  ? osGetCurrentThread+0x26/0x60 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  ? _rmGpuLockIsOwner+0x24/0x90 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  rpcRmApiControl_GSP+0x274/0x960 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  subdeviceCtrlCmdPerfSetPowerstate_KERNEL+0xaa/0x1c0 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  resControl_IMPL+0x1a5/0x1b0 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  serverControl+0x48a/0x590 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  _rmapiRmControl+0x598/0x980 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  ? acpi_os_release_object+0xe/0x20
juin 18 21:22:23 predator kernel:  rmapiControlWithSecInfo+0x79/0x140 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  rmapiControl+0x21/0x40 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  RmPowerSourceChangeEvent+0x56/0x70 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  RmPowerManagement+0x1c3/0x1cc [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  ? rm_transition_dynamic_power+0x53/0x13d [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  RmGcxPowerManagement+0x21d/0x3a0 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  ? _rmGpuLockIsOwner+0x24/0x90 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  rm_transition_dynamic_power+0x8a/0x13d [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  ? __pci_find_next_ht_cap+0x59/0xe0
juin 18 21:22:23 predator kernel:  ? __pfx_pci_pm_runtime_resume+0x10/0x10
juin 18 21:22:23 predator kernel:  ? __pfx_pci_pm_runtime_resume+0x10/0x10
juin 18 21:22:23 predator kernel:  nv_pmops_runtime_resume+0x65/0xf0 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  __rpm_callback+0x45/0x1f0
juin 18 21:22:23 predator kernel:  ? os_alloc_mem+0x104/0x120 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  ? __pfx_pci_pm_runtime_resume+0x10/0x10
juin 18 21:22:23 predator kernel:  rpm_callback+0x6d/0x80
juin 18 21:22:23 predator kernel:  rpm_resume+0x4af/0x6d0
juin 18 21:22:23 predator kernel:  ? _portMemAllocNonPagedUntracked+0x2c/0x40 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  __pm_runtime_resume+0x52/0x90
juin 18 21:22:23 predator kernel:  pci_device_shutdown+0x1b/0x60
juin 18 21:22:23 predator kernel:  nv_indicate_not_idle+0x2b/0x40 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  os_ref_dynamic_power+0x146/0x220 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  RmUnixRmApiPrologue+0x4b/0x90 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  rm_acpi_notify+0xc4/0x280 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  ? acpi_os_release_object+0xe/0x20
juin 18 21:22:23 predator kernel:  acpi_ev_notify_dispatch+0x4b/0x70
juin 18 21:22:23 predator kernel:  acpi_os_execute_deferred+0x17/0x30
juin 18 21:22:23 predator kernel:  process_one_work+0x193/0x350
juin 18 21:22:23 predator kernel:  __rpm_callback+0x45/0x1f0
juin 18 21:22:23 predator kernel:  ? os_alloc_mem+0x104/0x120 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  ? __pfx_pci_pm_runtime_resume+0x10/0x10
juin 18 21:22:23 predator kernel:  rpm_callback+0x6d/0x80
juin 18 21:22:23 predator kernel:  rpm_resume+0x4af/0x6d0
juin 18 21:22:23 predator kernel:  ? _portMemAllocNonPagedUntracked+0x2c/0x40 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  __pm_runtime_resume+0x52/0x90
juin 18 21:22:23 predator kernel:  pci_device_shutdown+0x1b/0x60
juin 18 21:22:23 predator kernel:  nv_indicate_not_idle+0x2b/0x40 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  os_ref_dynamic_power+0x146/0x220 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  RmUnixRmApiPrologue+0x4b/0x90 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  rm_acpi_notify+0xc4/0x280 [nvidia 4fe42d8f813b61ee64750a1da698001c8a512ef9]
juin 18 21:22:23 predator kernel:  ? acpi_os_release_object+0xe/0x20
juin 18 21:22:23 predator kernel:  acpi_ev_notify_dispatch+0x4b/0x70
juin 18 21:22:23 predator kernel:  acpi_os_execute_deferred+0x17/0x30
juin 18 21:22:23 predator kernel:  process_one_work+0x193/0x350
juin 18 21:22:23 predator kernel:  worker_thread+0x2d7/0x410
juin 18 21:22:23 predator kernel:  ? __pfx_worker_thread+0x10/0x10
juin 18 21:22:23 predator kernel:  kthread+0xfc/0x240
juin 18 21:22:23 predator kernel:  ? __pfx_kthread+0x10/0x10
juin 18 21:22:23 predator kernel:  ret_from_fork+0x31/0x50
juin 18 21:22:23 predator kernel:  ? __pfx_kthread+0x10/0x10
juin 18 21:22:23 predator kernel:  ret_from_fork_asm+0x1a/0x30
juin 18 21:22:23 predator kernel:  </TASK>
juin 18 21:22:23 predator kernel: NVRM: _kgspLogXid119: ********************************************************************************
juin 18 21:22:23 predator kernel: NVRM: _issueRpcAndWait: rpcRecvPoll timedout for fn 76!
juin 18 21:22:23 predator kernel: NVRM: subdeviceCtrlCmdPerfSetPowerstate_KERNEL: NV2080_CTRL_CMD_PERF_SET_POWERSTATE RPC failed
juin 18 21:22:29 predator kernel: NVRM: Xid (PCI:0000:01:00): 119, pid=117, name=kworker/7:1, Timeout after 6s of waiting for RPC response from GPU0 GSP! Expected function 76 (GSP_RM_CONTROL) (0x20800a81 0x4).
juin 18 21:22:29 predator kernel: NVRM: _issueRpcAndWait: rpcRecvPoll timedout for fn 76!
juin 18 21:22:29 predator kernel: NVRM: _kperfSendPostPowerStateCallback: Error getting Aux Power State:0x65
juin 18 21:22:32 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:22:32 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:22:35 predator kernel: NVRM: Xid (PCI:0000:01:00): 119, pid=117, name=kworker/7:1, Timeout after 6s of waiting for RPC response from GPU0 GSP! Expected function 76 (GSP_RM_CONTROL) (0x20802092 0x4).
juin 18 21:22:35 predator kernel: NVRM: nvAssertFailedNoLog: Assertion failed: Back to back GSP RPC timeout detected! GPU marked for reset @ kernel_gsp.c:2314
juin 18 21:22:35 predator kernel: NVRM: _issueRpcAndWait: rpcRecvPoll timedout for fn 76!
juin 18 21:22:35 predator kernel: NVRM: nvAssertFailedNoLog: Assertion failed: 0 @ kern_perf_pwr.c:191
juin 18 21:22:41 predator kernel: NVRM: Rate limiting GSP RPC error prints for GPU at PCI:0000:01:00 (printing 1 of every 30).  The GPU likely needs to be reset.
juin 18 21:22:41 predator kernel: NVRM: _kperfSendPostPowerStateCallback: Error getting Aux Power State:0x65
juin 18 21:22:41 predator kernel: NVRM: RmHandleDNotifierEvent: RmHandleDNotifierEvent: Failed to handle ACPI D-Notifier event, status=0x65
juin 18 21:22:44 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:22:44 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:22:44 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:22:44 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:22:47 predator kernel: NVRM: RmCheckForGcxSupportOnCurrentState: NVRM, Failed to get GCx pre-requisite, status=0x65
juin 18 21:22:53 predator kernel: NVRM: nvCheckOkFailedNoLog: Check failed: Call timed out [NV_ERR_TIMEOUT] (0x00000065) returned from pRmApi->Control(pRmApi, pGpu->hInternalClient, pGpu->hInternalSubdevice, NV>
juin 18 21:22:53 predator kernel: NVRM: Xid (PCI:0000:01:00): 154, GPU recovery action changed from 0x0 (None) to 0x1 (GPU Reset Required)
juin 18 21:22:55 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:22:55 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:22:57 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:22:57 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:22:59 predator kernel: NVRM: RmCheckForGcxSupportOnCurrentState: NVRM, Failed to get GCx pre-requisite, status=0x65
juin 18 21:23:10 predator kernel: NVRM: RmCheckForGcxSupportOnCurrentState: NVRM, Failed to get GCx pre-requisite, status=0x65
juin 18 21:23:14 predator iwd[546]: event: connect-info, ssid: SFR_164E, bss: b0:b3:69:9d:16:51, signal: -49, load: 0/255
juin 18 21:23:14 predator iwd[546]: event: state, old: autoconnect_full, new: connecting (auto)
juin 18 21:23:14 predator iwd[546]: event: connect-failed, status: 1
juin 18 21:23:14 predator iwd[546]: event: connect-info, ssid: SFR_164E, bss: b0:b3:69:9d:16:50, signal: -42, load: 0/255
juin 18 21:23:14 predator iwd[546]: event: connect-failed, status: 1
juin 18 21:23:14 predator iwd[546]: event: state, old: connecting (auto), new: disconnected
juin 18 21:23:14 predator iwd[546]: event: state, old: disconnected, new: autoconnect_full
juin 18 21:23:17 predator wpa_supplicant[696]: wlan0: CTRL-EVENT-SIGNAL-CHANGE above=1 signal=-56 noise=9999 txrate=866700
juin 18 21:23:21 predator kernel: NVRM: RmCheckForGcxSupportOnCurrentState: NVRM, Failed to get GCx pre-requisite, status=0x65
juin 18 21:23:32 predator kernel: NVRM: RmCheckForGcxSupportOnCurrentState: NVRM, Failed to get GCx pre-requisite, status=0x65
juin 18 21:23:41 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:23:41 predator rtkit-daemon[832]: Supervising 8 threads of 5 processes of 1 users.
juin 18 21:23:43 predator kernel: NVRM: RmCheckForGcxSupportOnCurrentState: NVRM, Failed to get GCx pre-requisite, status=0x65
juin 18 21:23:54 predator kernel: NVRM: RmCheckForGcxSupportOnCurrentState: NVRM, Failed to get GCx pre-requisite, status=0x65
juin 18 21:24:05 predator kernel: NVRM: RmCheckForGcxSupportOnCurrentState: NVRM, Failed to get GCx pre-requisite, status=0x65
juin 18 21:24:16 predator kernel: NVRM: RmCheckForGcxSupportOnCurrentState: NVRM, Failed to get GCx pre-requisite, status=0x65
juin 18 21:24:27 predator kernel: NVRM: RmCheckForGcxSupportOnCurrentState: NVRM, Failed to get GCx pre-requisite, status=0x65
juin 18 21:24:38 predator kernel: NVRM: RmCheckForGcxSupportOnCurrentState: NVRM, Failed to get GCx pre-requisite, status=0x65

I would be grateful if you could take some time to help me.

Have a nice day or evening

Last edited by lsacienne (2025-06-24 20:12:47)

Offline

#2 2025-06-19 09:11:23

Lone_Wolf
Administrator
From: Netherlands, Europe
Registered: 2005-10-04
Posts: 14,951

Re: [SOLVED] Hybrid GPU on Arch Linux not working as expected

I'm facing a really strange problem where I can use vkcube like normal in some cases but in other cases, it just hangs without starting.

Please post the outputs of

$ pacman -Qs vulkan
$ vulkaninfo --summary  #comes with vulkan-tools
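It may also help to narrow down which GPU the hang follows. A small sketch, assuming the standard PRIME render offload variables (the prime-run wrapper from the nvidia-prime package sets the same ones):

```shell
# Default run: the Mesa device-select layer usually picks the iGPU
vkcube

# Explicitly offload to the NVIDIA dGPU via PRIME render offload
__NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only vkcube
```

If only the offloaded run hangs, the problem follows the NVIDIA driver rather than Vulkan itself.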

Also consider switching to nvidia-open as recommended by nvidia


Disliking systemd intensely, but not satisfied with alternatives so focusing on taming systemd.

clean chroot building not flexible enough ?
Try clean chroot manager by graysky

Offline

#3 2025-06-19 17:38:13

seth
Member
From: Won't reply 2 private help req
Registered: 2012-09-03
Posts: 75,081

Re: [SOLVED] Hybrid GPU on Arch Linux not working as expected

OP is already on nvidia-open.

juin 18 21:22:17 predator kernel: NVRM: gpuWaitForGfwBootComplete_TU102: failed to wait for GFW_BOOT: (progress 0x9)
juin 18 21:22:17 predator kernel: NVRM: kgspWaitForGfwBootOk_TU102: failed to wait for GFW boot complete: 0x55 VBIOS version 94.06.16.00.5E
juin 18 21:22:17 predator kernel: NVRM: kgspWaitForGfwBootOk_TU102: (the GPU may be in a bad state and may need to be reset)
juin 18 21:22:23 predator kernel: NVRM: _kgspLogXid119: ********************************* GSP Timeout **********************************
juin 18 21:22:23 predator kernel: NVRM: _kgspLogXid119: Note: Please also check logs above.
juin 18 21:22:23 predator kernel: NVRM: GPU at PCI:0000:01:00: GPU-f451892d-a150-9928-a04a-9712ec218877
juin 18 21:22:23 predator kernel: NVRM: Xid (PCI:0000:01:00): 119, pid=117, name=kworker/7:1, Timeout after 6s of waiting for RPC response from GPU0 GSP! Expected function 76 (GSP_RM_CONTROL) (0x2080205b 0x4).

might be the immediate problem.

Looking at the context:

juin 18 21:22:16 predator kernel: acer_wmi: Unknown function number - 9 - 0
juin 18 21:22:17 predator kernel: NVRM: gpuWaitForGfwBootComplete_TU102: failed to wait for GFW_BOOT: (progress 0x9)
juin 18 21:22:17 predator kernel: NVRM: kgspWaitForGfwBootOk_TU102: failed to wait for GFW boot complete: 0x55 VBIOS version 94.06.16.00.5E

Does this happen when you un/plug the charger?

You might try the binary nvidia driver and disable the https://wiki.archlinux.org/title/NVIDIA … P_firmware but the 575xx drivers don't have a stellar track record for now, so don't hold your breath.
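For reference, disabling the GSP firmware on the proprietary driver is a single modprobe option; a sketch of the file (the parameter name is the driver's documented module option, and nvidia-open cannot run with GSP disabled, so this only applies after switching to the nvidia package):

```shell
# /etc/modprobe.d/nvidia.conf
# Disable the GSP firmware (proprietary "nvidia" driver only)
options nvidia NVreg_EnableGpuFirmware=0
```

Rebuild the initramfs afterwards (e.g. mkinitcpio -P) if the nvidia modules are included in it, then reboot.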

Offline

#4 2025-06-19 20:57:12

lsacienne
Member
Registered: 2025-06-18
Posts: 3

Re: [SOLVED] Hybrid GPU on Arch Linux not working as expected

So, as seth said, I'm currently using the nvidia-open drivers. I admit the wiki was not crystal clear about which driver I should install.

So here is the output of pacman -Qs vulkan:

local/lib32-nvidia-utils 575.57.08-1
    NVIDIA drivers utilities (32-bit)
local/lib32-vulkan-icd-loader 1.4.313.0-1
    Vulkan Installable Client Driver (ICD) Loader (32-bit)
local/nvidia-utils 575.57.08-3
    NVIDIA drivers utilities
local/qt6-shadertools 6.9.1-1 (qt6)
    Provides functionality for the shader pipeline that allows Qt Quick to
    operate on Vulkan, Metal, and Direct3D, in addition to OpenGL
local/spirv-tools 1:1.4.313.0-1 (vulkan-devel)
    API and commands for processing SPIR-V modules
local/vulkan-headers 1:1.4.313.0-1 (vulkan-devel)
    Vulkan header files and API registry
local/vulkan-icd-loader 1.4.313.0-1 (vulkan-devel)
    Vulkan Installable Client Driver (ICD) Loader
local/vulkan-intel 1:25.1.3-3
    Open-source Vulkan driver for Intel GPUs
local/vulkan-mesa-layers 1:25.1.3-3
    Mesa's Vulkan layers
local/vulkan-tools 1.4.313.0-1 (vulkan-devel)
    Vulkan tools and utilities
local/vulkan-validation-layers 1.4.313.0-1 (vulkan-devel)
    Vulkan Validation Layers

And here is the output of vulkaninfo --summary:

==========
VULKANINFO
==========

Vulkan Instance Version: 1.4.313


Instance Extensions: count = 25
-------------------------------
VK_EXT_acquire_drm_display             : extension revision 1
VK_EXT_acquire_xlib_display            : extension revision 1
VK_EXT_debug_report                    : extension revision 10
VK_EXT_debug_utils                     : extension revision 2
VK_EXT_direct_mode_display             : extension revision 1
VK_EXT_display_surface_counter         : extension revision 1
VK_EXT_headless_surface                : extension revision 1
VK_EXT_surface_maintenance1            : extension revision 1
VK_EXT_swapchain_colorspace            : extension revision 5
VK_KHR_device_group_creation           : extension revision 1
VK_KHR_display                         : extension revision 23
VK_KHR_external_fence_capabilities     : extension revision 1
VK_KHR_external_memory_capabilities    : extension revision 1
VK_KHR_external_semaphore_capabilities : extension revision 1
VK_KHR_get_display_properties2         : extension revision 1
VK_KHR_get_physical_device_properties2 : extension revision 2
VK_KHR_get_surface_capabilities2       : extension revision 1
VK_KHR_portability_enumeration         : extension revision 1
VK_KHR_surface                         : extension revision 25
VK_KHR_surface_protected_capabilities  : extension revision 1
VK_KHR_wayland_surface                 : extension revision 6
VK_KHR_xcb_surface                     : extension revision 6
VK_KHR_xlib_surface                    : extension revision 6
VK_LUNARG_direct_driver_loading        : extension revision 1
VK_NV_display_stereo                   : extension revision 1

Instance Layers: count = 8
--------------------------
VK_LAYER_INTEL_nullhw           INTEL NULL HW                1.1.73   version 1
VK_LAYER_KHRONOS_validation     Khronos Validation Layer     1.4.313  version 1
VK_LAYER_MESA_device_select     Linux device selection layer 1.4.303  version 1
VK_LAYER_MESA_overlay           Mesa Overlay layer           1.4.303  version 1
VK_LAYER_MESA_screenshot        Mesa Screenshot layer        1.4.303  version 1
VK_LAYER_MESA_vram_report_limit Limit reported VRAM          1.4.303  version 1
VK_LAYER_NV_optimus             NVIDIA Optimus layer         1.4.303  version 1
VK_LAYER_NV_present             NVIDIA GR2608 layer          1.4.303  version 1

Devices:
========
GPU0:
	apiVersion         = 1.4.311
	driverVersion      = 25.1.3
	vendorID           = 0x8086
	deviceID           = 0x9a49
	deviceType         = PHYSICAL_DEVICE_TYPE_INTEGRATED_GPU
	deviceName         = Intel(R) Iris(R) Xe Graphics (TGL GT2)
	driverID           = DRIVER_ID_INTEL_OPEN_SOURCE_MESA
	driverName         = Intel open-source Mesa driver
	driverInfo         = Mesa 25.1.3-arch1.3
	conformanceVersion = 1.4.0.0
	deviceUUID         = 8680499a-0100-0000-0002-000000000000
	driverUUID         = ca35a7a8-54f3-ef79-a58c-c6c28fe84a1c
GPU1:
	apiVersion         = 1.4.303
	driverVersion      = 575.57.8.0
	vendorID           = 0x10de
	deviceID           = 0x2520
	deviceType         = PHYSICAL_DEVICE_TYPE_DISCRETE_GPU
	deviceName         = NVIDIA GeForce RTX 3060 Laptop GPU
	driverID           = DRIVER_ID_NVIDIA_PROPRIETARY
	driverName         = NVIDIA
	driverInfo         = 575.57.08
	conformanceVersion = 1.4.1.0
	deviceUUID         = f451892d-a150-9928-a04a-9712ec218877
	driverUUID         = 5f139ea4-52d6-527b-8f33-b64f559afd6f

My issue happens even when I stay plugged in (or unplugged); it doesn't seem tied to un/plugging the charger. I will try going back to the nvidia driver and disabling the GSP firmware. I'll come back to you when I have news. Thank you for your time!

Offline

#5 2025-06-19 22:34:45

Scimmia
Fellow
Registered: 2012-09-01
Posts: 13,724

Re: [SOLVED] Hybrid GPU on Arch Linux not working as expected

First thing, update the system again. There's been an update for the nvidia drivers that fixes a number of bugs.

Offline

#6 2025-06-19 23:21:44

tekstryder
Member
Registered: 2013-02-14
Posts: 510

Re: [SOLVED] Hybrid GPU on Arch Linux not working as expected

Scimmia wrote:

update for the nvidia drivers that fixes a number of bugs.

"Three shall be the number thou shalt count, and the number of the counting shall be three. Four shalt thou not count, neither count thou two, excepting that thou then proceed to three."


Linux x64 (AMD64/EM64T) Display Driver 575.64 wrote:

Fixed a bug that could cause blank rendering on some single-buffered GLX applications when running on Xwayland.
Fixed a bug that could cause a kernel use-after-free on pre-Turing GPUs.
Fixed a bug that could cause 32-bit x86 applications running on recent builds of glibc to crash on dlopen().

Offline

#7 2025-06-19 23:33:23

Scimmia
Fellow
Registered: 2012-09-01
Posts: 13,724

Re: [SOLVED] Hybrid GPU on Arch Linux not working as expected

tekstryder wrote:

"Three shall be the number thou shalt count, and the number of the counting shall be three. Four shalt thou not count, neither count thou two, excepting that thou then proceed to three."

Are you sure about that? https://github.com/NVIDIA/open-gpu-kern … 2981770620
And who knows what other things got left out of the changelog.

Last edited by Scimmia (2025-06-19 23:33:56)

Offline

#8 2025-06-19 23:45:44

tekstryder
Member
Registered: 2013-02-14
Posts: 510

Re: [SOLVED] Hybrid GPU on Arch Linux not working as expected

Hey, don't rain on my Python quote!

Anyhow, here's the complete changeset for the open modules:

https://github.com/NVIDIA/open-gpu-kern … 8...575.64

... and the tag link:

https://github.com/NVIDIA/open-gpu-kern … tag/575.64

Tiny release overall, just a bump on the New Feature Branch. 580 will be significant.

Offline

#9 2025-06-20 08:28:31

lsacienne
Member
Registered: 2025-06-18
Posts: 3

Re: [SOLVED] Hybrid GPU on Arch Linux not working as expected

Scimmia wrote:

First thing, update the system again. There's been an update for the nvidia drivers that fixes a number of bugs.

Yeah I know, I religiously update my system as soon as my PC starts (or else I'm doing it wrong, as I've heard). I have pulled the latest update:

local/nvidia 575.64-1
    NVIDIA kernel modules

So I uninstalled the nvidia-open drivers and installed the nvidia drivers instead. Then I disabled the GSP firmware in /etc/modprobe.d/nvidia.conf and now it works like a charm!
vkcube still uses the dGPU even when the charger is unplugged, but that is normal behavior, so everything is fine.
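For anyone landing here later: you can verify the firmware is really off after rebooting. A quick check, assuming the proprietary driver (recent nvidia-smi versions report the GSP firmware version, and the module parameter shows up in procfs):

```shell
# Reports "GSP Firmware Version : N/A" when GSP is disabled
nvidia-smi -q | grep -i gsp

# The effective module parameter
grep EnableGpuFirmware /proc/driver/nvidia/params
```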

Thank you for your help!

I think I can mark this post as resolved?

Last edited by lsacienne (2025-06-20 08:29:02)

Offline

#10 2025-06-20 08:43:19

seth
Member
From: Won't reply 2 private help req
Registered: 2012-09-03
Posts: 75,081

Re: [SOLVED] Hybrid GPU on Arch Linux not working as expected

It's entirely your call (the immediate problem is solved by sidestepping it)

seth almost wrote:

Does this happen when you un/plug the charger?

Offline
