I know this is probably a common issue, but on my gaming laptop I can't get games to run on my dedicated card. The GPU never seems to start up and always shows as off, yet its power draw constantly fluctuates between 3 and 6 watts. The command "nvidia-smi" returns the following:
Sun Nov 17 14:24:13 2024
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 565.57.01 Driver Version: 565.57.01 CUDA Version: 12.7 |
|-----------------------------------------+------------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+========================+======================|
| 0 NVIDIA GeForce GTX 1650 Ti Off | 00000000:01:00.0 Off | N/A |
| N/A 43C P8 3W / 50W | 6MiB / 4096MiB | 0% Default |
| | | N/A |
+-----------------------------------------+------------------------+----------------------+
+-----------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=========================================================================================|
| 0 N/A N/A 621 G /usr/lib/Xorg 4MiB |
+-----------------------------------------------------------------------------------------+
I have also exported the following variables. Spamming nvidia-smi afterwards still shows the GPU fluctuating, but not on and running:
export __NV_PRIME_RENDER_OFFLOAD=1
export __GLX_VENDOR_LIBRARY_NAME=nvidia
After exporting these, I observed that nothing changed and opened Minecraft: still choppy, playing like it's on bare metal. I have a gaming laptop, if that could be part of the issue. During my installation of the essentials through a GUI, my mirrors timed out halfway through; this happened twice until I found out NetworkManager was being conflicted with, and I ran the -Q command for that situation. nvidia-xconfig does not work either; I get the following output:
WARNING: Unable to locate/open X configuration file.
WARNING: Unable to parse X.Org version string.
ERROR: Unable to write to directory '/etc/X11'.
Last edited by dimdove97 (2024-11-19 07:33:51)
Offline
You don't want nvidia-xconfig to work. Install nvidia-prime and run applications that should land on the dGPU with the prime-run command. https://wiki.archlinux.org/title/PRIME# … er_offload
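That makes offload opt-in per command; roughly something like this (the game command below is only a placeholder, not a specific launcher):
prime-run glxinfo | grep "OpenGL renderer"    # should report the NVIDIA card
prime-run <command-that-starts-your-game>     # substitute whatever actually launches the game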
Which GUI for installing the essentials are you talking about, and are you actually using Arch? Most GUI installers are not well supported by Arch's packaging system.
Online
Of course, the Arch ISO straight from the website onto my computer. By essentials I'm pretty sure I meant the bare minimum: disk configuration, a user, a sudo user, and an internet connection. I've since reinstalled, going from KDE with the stable kernel to GNOME with the LTS kernel. And yes, with the corresponding (as far as I know) NVIDIA packages.
Games feel like they run on bare metal instead of actually putting the GPU to use, and I am using an external display as well. I can see them running in nvidia-smi, but the GPU always says it's off. I have nvidia-prime installed, but it says prime-run is not found. nvidia-settings also shows nothing that lets me set a power profile or change performance. nvidia-smi itself runs perfectly fine.
It feels like I'm misunderstanding and everything is probably fine, and I might just be missing a command or two. That, or I'm messing things up as I go. I have no idea if I'm being clear enough. But yeah: slow performance on the desktop alone, and games are sluggish (somewhat laggy, but more of a slow-motion effect going on).
EDIT 1 AM: Using the application Sober (a port of the Roblox APK to Linux), it straight up crashes when I join a game. It starts up just fine, but as soon as I join a game it instantly closes. It was not doing this before I installed nvidia-prime. I don't know how to run it in a terminal since it is a Flatpak, and it is a port akin to WINE. I could probably find out more if I were able to launch it from a terminal and provide the crash log.
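(For reference, a Flatpak can normally be started from a terminal with flatpak run; the application ID below is only a placeholder, not Sober's actual ID:)
flatpak list --app              # look up the application ID for Sober
flatpak run <application-id>    # launch it; the crash output should appear in the terminal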
Last edited by dimdove97 (2024-11-19 05:56:34)
Offline
Games feel like they run on bare metal instead of actually putting the GPU to use
Please post your Xorg log, https://wiki.archlinux.org/title/Xorg#General and the outputs of
glxinfo -B
prime-run glxinfo -B
glxinfo32 -B
prime-run glxinfo32 -B
"games": https://wiki.archlinux.org/title/PRIME# … _using_GPU
Also "bare metal" generally means sth. like "not in a VM", not "on the IGP"
Edit: oh, and edit your OP and wrap the nvidia-smi output in [code][/code] tags.
Last edited by seth (2024-11-19 07:21:51)
Offline
nvidia-smi shows 3 to 6 watts rather than 0 because nvidia-smi itself wakes up the GPU every time it runs. If you want to check whether the GPU is running without waking it up, try
cat /sys/bus/pci/devices/0000:01:00.0/power/control
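That file should print "auto" when runtime power management is enabled; to see the actual sleep state without waking the card, the runtime_status attribute next to it can be read too (assuming the same PCI address as above):
cat /sys/bus/pci/devices/0000:01:00.0/power/runtime_status    # "suspended" while asleep, "active" while in use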
Also, I could be wrong, but I don't think you want to be exporting __NV_PRIME_RENDER_OFFLOAD=1 globally. The prime-run script V1del mentioned uses that variable to run only the following command under PRIME; exporting it makes everything run under it, which feels like it would cause problems.
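For what it's worth, prime-run is just a small wrapper; from memory it is roughly equivalent to the sketch below (the exact set of variables may differ between versions, so treat this as an approximation, not the verbatim script):
#!/bin/sh
# rough sketch of the nvidia-prime wrapper: set the offload variables for this one command only
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia __VK_LAYER_NV_optimus=NVIDIA_only "$@"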
Offline
Games feel like they run on bare metal instead of actually putting the GPU to use
Please post your Xorg log, https://wiki.archlinux.org/title/Xorg#General and the outputs of
glxinfo -B
prime-run glxinfo -B
glxinfo32 -B
prime-run glxinfo32 -B
"games": https://wiki.archlinux.org/title/PRIME# … _using_GPU
Also "bare metal" generally means sth. like "not in a VM", not "on the IGP"Edit: oh, and edit your OP and wrap the nvidia-smi output in [code][/code] tags.
I cannot do either of these. It says the command is invalid for all of them, so I know now that I probably screwed something up.
nvidia-prime, nvidia-lts, lib32-nvidia-utils, nvidia-settings
are all installed. Linked below is a GitHub tutorial I followed. It did say something about hooks, and about editing the modules that load on startup, like
nvidia nvidia_modeset nvidia_uvm nvidia_drm
I don't doubt that has something to do with my issue.
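(As a reference sketch, assuming the guide means early KMS via the initramfs: those modules would go in the MODULES array of /etc/mkinitcpio.conf, followed by regenerating the images. The exact file contents here are an assumption.)
# /etc/mkinitcpio.conf (excerpt)
MODULES=(nvidia nvidia_modeset nvidia_uvm nvidia_drm)

# regenerate all initramfs presets after editing
sudo mkinitcpio -P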
Here is the GH tutorial I followed step by step: https://github.com/korvahannu/arch-nvid … tion-guide
Offline
It says invalid command for all 3
Probably "command not found"?
Those commands come from the mesa-utils, lib32-mesa-utils and nvidia-prime packages.
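If any of them are missing, installing the packages should be enough; something like this (lib32-mesa-utils assumes the multilib repository is enabled):
sudo pacman -S --needed mesa-utils lib32-mesa-utils nvidia-prime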
Linked below is a GitHub tutorial I followed.
Offline
Yeah I'm not really the brightest, but I do try my best to keep from flickering.
grep -e Log -e tty Xorg.0.log
returns "no such file or directory". The command to check for modeset
cat /sys/module/nvidia_drm/parameters/modeset
returns permission denied. Sorry for the late responses.
Below is the output of the glxinfo commands. I am using GNOME and an external monitor, which is display number 1. The laptop screen won't shut off even though I disabled it in the display settings.
name of display: :0
display: :0 screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
Vendor: Intel (0x8086)
Device: Mesa Intel(R) UHD Graphics (CML GT2) (0x9bc4)
Version: 24.2.7
Accelerated: yes
Video memory: 5897MB
Unified memory: yes
Preferred profile: core (0x1)
Max core profile version: 4.6
Max compat profile version: 4.6
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.2
OpenGL vendor string: Intel
OpenGL renderer string: Mesa Intel(R) UHD Graphics (CML GT2)
OpenGL core profile version string: 4.6 (Core Profile) Mesa 24.2.7-arch1.1
OpenGL core profile shading language version string: 4.60
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 4.6 (Compatibility Profile) Mesa 24.2.7-arch1.1
OpenGL shading language version string: 4.60
OpenGL context flags: (none)
OpenGL profile mask: compatibility profile
OpenGL ES profile version string: OpenGL ES 3.2 Mesa 24.2.7-arch1.1
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
name of display: :0
display: :0 screen: 0
direct rendering: Yes
Memory info (GL_NVX_gpu_memory_info):
Dedicated video memory: 4096 MB
Total available memory: 4096 MB
Currently available dedicated video memory: 3621 MB
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: NVIDIA GeForce GTX 1650 Ti/PCIe/SSE2
OpenGL core profile version string: 4.6.0 NVIDIA 565.57.01
OpenGL core profile shading language version string: 4.60 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 4.6.0 NVIDIA 565.57.01
OpenGL shading language version string: 4.60 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 565.57.01
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
name of display: :0
display: :0 screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
Vendor: Intel (0x8086)
Device: Mesa Intel(R) UHD Graphics (CML GT2) (0x9bc4)
Version: 24.2.7
Accelerated: yes
Video memory: 5897MB
Unified memory: yes
Preferred profile: core (0x1)
Max core profile version: 4.6
Max compat profile version: 4.6
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.2
OpenGL vendor string: Intel
OpenGL renderer string: Mesa Intel(R) UHD Graphics (CML GT2)
OpenGL core profile version string: 4.6 (Core Profile) Mesa 24.2.7-arch1.1
OpenGL core profile shading language version string: 4.60
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 4.6 (Compatibility Profile) Mesa 24.2.7-arch1.1
OpenGL shading language version string: 4.60
OpenGL context flags: (none)
OpenGL profile mask: compatibility profile
OpenGL ES profile version string: OpenGL ES 3.2 Mesa 24.2.7-arch1.1
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
name of display: :0
display: :0 screen: 0
direct rendering: Yes
Memory info (GL_NVX_gpu_memory_info):
Dedicated video memory: 4096 MB
Total available memory: 4096 MB
Currently available dedicated video memory: 3621 MB
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: NVIDIA GeForce GTX 1650 Ti/PCIe/SSE2
OpenGL core profile version string: 4.6.0 NVIDIA 565.57.01
OpenGL core profile shading language version string: 4.60 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 4.6.0 NVIDIA 565.57.01
OpenGL shading language version string: 4.60 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 565.57.01
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
Offline
https://wiki.archlinux.org/title/Xorg#General describes where to find the log; don't blindly copy-paste blue boxes into your terminal, not even from the wiki or this forum.
Then post the entire file, not some grep.
You're running on the intel chip, but prime invokes the nvidia GPU as expected.
See https://wiki.archlinux.org/title/PRIME# … _using_GPU again.
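How you attach prime-run depends on how the game is started; as rough examples (the Flatpak application ID is a placeholder, and forcing the variables this way is an assumption, not a guaranteed fix):
# Steam: set this as the game's launch options
prime-run %command%

# Flatpak apps can't be wrapped by prime-run directly, but the same
# environment variables can be forced for a single app:
flatpak override --user \
  --env=__NV_PRIME_RENDER_OFFLOAD=1 \
  --env=__GLX_VENDOR_LIBRARY_NAME=nvidia \
  <application-id>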
Offline
Can you calm down? The folder doesn't exist, and I can't find shit on the wiki because it all leads on and on and on... So many damn commands and options, it's like walking into a candy store. I ran this command, looked at it, tried my best to understand it, and have no idea if I got it correct, but: "# ln -s /dev/null /etc/udev/rules.d/61-gdm.rules". It's kinda like taking a kid and sending him into one of Shakespeare's plays.
Last edited by dimdove97 (2024-11-19 22:19:38)
Offline
I'm certainly not in any state from where I'd have to calm down.
Maybe take another look tomorrow.
Offline
Yeah, that's alright. But just because you're angry, don't take it out on me. Who knows, maybe I'll find it myself. Hope your day gets better.
Offline
Not sure whether this is trolling or a language barrier…
/I/ am neither angry nor upset nor anything else that would require calming down.
I cannot speak for other people in this thread, though.
The linked wiki explains all possible locations of the Xorg log, depending on how X11 is started.
Take another look tomorrow.
Offline
Yeah, I misread it; I'm going to eat my own words. Sorry for wasting your time and for assuming you're angry.
Offline
So, after re-reading https://wiki.archlinux.org/title/Xorg#General, could you locate your Xorg log?
However
You're running on the intel chip, but prime invokes the nvidia GPU as expected.
See https://wiki.archlinux.org/title/PRIME# … _using_GPU again.
If you struggle with that, please elaborate on what exactly you're trying to run on the nvidia GPU, and how.
Offline