I used to be able to run Paradox games fine on my machine (I hadn't tried in a number of months), but now CK2, HOI4, Stellaris, and EU4 all run at 1-3 fps once the map loads, though the cursor still moves normally. htop shows CPU utilization maxed out across all 16 threads of my 3700X once the map loads. When I lower the graphics settings to the minimum, it runs a little better (up to ~12 fps in HOI4).
My other non-Paradox games continue to run well on my system, and Paradox games run normally on my Windows installation on the same hardware.
I've used the -debug flag, gathered logs, and shared them with Paradox support, but they couldn't find an issue. I tried downgrading to the LTS kernel, but the problem persisted. Does anyone have any troubleshooting ideas or suggestions? And does anyone with similar hardware have any issues?
Last edited by dioramaramen (2020-09-13 15:24:04)
Sounds like 32-bit acceleration is broken. Is lib32-nvidia-utils set up? What's your output for
glxinfo -B
glxinfo32 -B
These need mesa-demos and lib32-mesa-demos respectively. Also post the logs you've apparently gathered.
Thanks for answering!
Looks like lib32-nvidia-utils is installed, pacman -Q lib32-nvidia-utils outputs:
lib32-nvidia-utils 450.66-1
glxinfo -B
name of display: :0
display: :0 screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
Vendor: VMware, Inc. (0xffffffff)
Device: llvmpipe (LLVM 10.0.1, 256 bits) (0xffffffff)
Version: 20.1.7
Accelerated: no
Video memory: 32125MB
Unified memory: no
Preferred profile: core (0x1)
Max core profile version: 3.3
Max compat profile version: 3.1
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.1
OpenGL vendor string: VMware, Inc.
OpenGL renderer string: llvmpipe (LLVM 10.0.1, 256 bits)
OpenGL core profile version string: 3.3 (Core Profile) Mesa 20.1.7
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 3.1 Mesa 20.1.7
OpenGL shading language version string: 1.40
OpenGL context flags: (none)
OpenGL ES profile version string: OpenGL ES 3.1 Mesa 20.1.7
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.10
glxinfo32 -B
name of display: :0
display: :0 screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
Vendor: VMware, Inc. (0xffffffff)
Device: llvmpipe (LLVM 10.0.1, 256 bits) (0xffffffff)
Version: 20.1.7
Accelerated: no
Video memory: 32125MB
Unified memory: no
Preferred profile: core (0x1)
Max core profile version: 3.3
Max compat profile version: 3.1
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.1
OpenGL vendor string: VMware, Inc.
OpenGL renderer string: llvmpipe (LLVM 10.0.1, 256 bits)
OpenGL core profile version string: 3.3 (Core Profile) Mesa 20.1.7
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 3.1 Mesa 20.1.7
OpenGL shading language version string: 1.40
OpenGL context flags: (none)
OpenGL ES profile version string: OpenGL ES 3.1 Mesa 20.1.7
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.10
Log files produced by opening HOI4 and starting a game:
game.log
system.log
system_debug.log
memory.log
pdxmp.log
The other logfiles are either empty or didn't seem to have anything useful.
The outputs and logs suggest something is wrong with your graphics setup and the system is falling back to llvmpipe software rendering.
Please post the full output of lspci -k, the journal, and the Xorg logs.
If you (or some tool) created .conf files in /etc/X11 or /etc/X11/xorg.conf.d, post their contents as well.
Some of these outputs will be large; consider using a pastebin client.
Disliking systemd intensely, but not satisfied with alternatives so focusing on taming systemd.
(A works at time B) && (time C > time B ) ≠ (A works at time C)
Thank you for the help!!
lspci -k
journalctl -b
Xorg.0.log
xorg.conf (generated by nvidia-settings)
Last edited by dioramaramen (2020-09-12 18:22:51)
Looks very similar to the race condition described here https://bbs.archlinux.org/viewtopic.php?id=258201
Check https://wiki.archlinux.org/index.php/NV … de_setting
I suggest you:
- remove that xorg.conf
- add nvidia-drm.modeset=1 as a kernel boot parameter in your bootloader
- reboot and start X
- post the Xorg log
If starting X works now, also post
$ glxinfo -B # comes with mesa-demos
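As a sketch of the kernel parameter step, assuming GRUB is the bootloader (the existing parameters shown are placeholders; keep whatever is already in your file and just append the new one):

```sh
# /etc/default/grub -- append nvidia-drm.modeset=1 to the existing parameters
GRUB_CMDLINE_LINUX_DEFAULT="loglevel=3 quiet nvidia-drm.modeset=1"
```

Then regenerate the GRUB config with `grub-mkconfig -o /boot/grub/grub.cfg` and reboot. Other bootloaders (systemd-boot, rEFInd, etc.) take the same parameter in their own config files.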
Yup, I found a similar thread that gave similar info/instructions: https://bbs.archlinux.org/viewtopic.php?id=258360
I ended up adding nvidia-drm.modeset=1 as a kernel boot parameter, added "nvidia nvidia_modeset nvidia_drm" to MODULES in mkinitcpio.conf, regenerated the initramfs with mkinitcpio -P (I ended up needing to create a new EFI boot partition because the Microsoft-generated one didn't have enough space), and added a pacman hook to regenerate the initramfs when the nvidia driver updates.
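For anyone following along, the MODULES line and pacman hook described above might look roughly like this (a sketch based on the Arch wiki's NVIDIA instructions; the hook filename is arbitrary, and the Target entries should match whichever nvidia driver package you actually have installed):

```ini
# /etc/mkinitcpio.conf
MODULES=(nvidia nvidia_modeset nvidia_drm)

# /etc/pacman.d/hooks/nvidia.hook
[Trigger]
Operation=Install
Operation=Upgrade
Operation=Remove
Type=Package
Target=nvidia

[Action]
Description=Regenerating initramfs after NVIDIA driver change
Depends=mkinitcpio
When=PostTransaction
Exec=/usr/bin/mkinitcpio -P
```

After saving both, run mkinitcpio -P once by hand; from then on the hook fires automatically whenever the nvidia package changes.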
The games are now working properly and glxinfo -B now shows the correct renderer is being used:
name of display: :0
display: :0 screen: 0
direct rendering: Yes
Memory info (GL_NVX_gpu_memory_info):
Dedicated video memory: 8192 MB
Total available memory: 8192 MB
Currently available dedicated video memory: 8036 MB
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 1070/PCIe/SSE2
OpenGL core profile version string: 4.6.0 NVIDIA 450.66
OpenGL core profile shading language version string: 4.60 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 4.6.0 NVIDIA 450.66
OpenGL shading language version string: 4.60 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 450.66
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
Thanks again for the help!!
EDIT: My system started crashing after this, so I've removed the kernel parameter but left the entries in MODULES. The nvidia driver still loads correctly, and there hasn't been another crash so far.
Last edited by dioramaramen (2020-09-15 15:15:02)