Hello! I have a Dell 250g8 notebook and wanted to play some games using Lutris.
In theory the IGP of the included i5-1135G7 is good enough for some older games I have installed, like Thief (2014) or Orcs Must Die 3.
But the games run very slowly: single-digit fps in OMD3 and about 20 fps in the old Thief game, both with every setting at the lowest value and at a reduced resolution of 1280x720 instead of the native 1080p.
Using intel_gpu_top, I then saw that the IGP runs at a very low clock rate most of the time. More precisely, it goes up to a decent clock rate for this chip in this notebook (about 1300 MHz) for about three seconds when a game starts, then drops to about 300 MHz. Switching from the fullscreen game to the desktop with Alt-Tab and back instantly resets the clock to the maximum (about 1300 MHz), only to drop again after about 3 to 5 seconds, every single time I do this. So it's not a temperature or power issue (it also works fine on Windows, as described below). Thief runs great within those first few seconds at full clock, at about 120 fps!
Here's part of an intel_gpu_top log I made:
Freq MHz IRQ RC6 Power W RCS/0 BCS/0 VCS/0 VCS/1 VECS/0
req act /s % gpu pkg % se wa % se wa % se wa % se wa % se wa
1295 240 498 0 1.51 11.96 99.99 0 0 0.00 0 0 0.00 0 0 0.00 0 0 0.00 0 0
1306 251 448 0 1.56 11.91 99.99 0 0 0.00 0 0 0.00 0 0 0.00 0 0 0.00 0 0
1295 233 376 0 1.50 11.95 99.99 0 0 0.00 0 0 0.00 0 0 0.00 0 0 0.00 0 0
1298 241 348 0 1.53 11.89 99.99 0 0 0.00 0 0 0.00 0 0 0.00 0 0 0.00 0 0
1301 247 453 0 1.55 12.00 99.99 0 0 0.00 0 0 0.00 0 0 0.00 0 0 0.00 0 0
1301 238 500 0 1.51 11.93 99.99 0 0 0.00 0 0 0.00 0 0 0.00 0 0 0.00 0 0
1299 245 538 0 1.51 11.91 99.99 0 0 0.00 0 0 0.00 0 0 0.00 0 0 0.00 0 0
1298 247 419 0 1.52 11.87 99.99 0 0 0.00 0 0 0.00 0 0 0.00 0 0 0.00 0 0
1294 266 487 1 1.58 11.93 99.25 0 0 0.00 0 0 0.00 0 0 0.00 0 0 0.00 0 0
1304 258 514 0 1.58 11.94 99.99 0 0 0.00 0 0 0.00 0 0 0.00 0 0 0.00 0 0
You can see the req (requested?) frequency is fine, but the actual frequency stays very low the whole time (apart from those first few seconds, which aren't visible in this log). Also visible: the IGP draws less than 2 W of the available 12 W of power.
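For reference, the same requested/actual numbers can also be read directly from the i915 sysfs interface; a minimal sketch (the card0 index is an assumption and may be card1 on some machines):
# requested vs. actual GPU frequency as exposed by i915
cat /sys/class/drm/card0/gt_cur_freq_mhz
cat /sys/class/drm/card0/gt_act_freq_mhz
# software limits the driver is allowed to use
cat /sys/class/drm/card0/gt_min_freq_mhz /sys/class/drm/card0/gt_max_freq_mhz /sys/class/drm/card0/gt_boost_freq_mhz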
I then tried what I could find to improve the performance:
1. Uninstalled xf86-video-intel (subjectively I'd say this made the general desktop GUI more responsive, but it did not fix the very low clock rates).
2. Switched from the i915 to the xe driver by adding "i915.force_probe=!9a49 xe.force_probe=9a49" to the kernel command line (see the sketch right after this list). It feels like this made the desktop a bit slower again, but the games a little faster. I can't measure fps or clocks with the xe driver because intel_gpu_top is incompatible with it; the games seem to run a little better (maybe 10%?), but still far too badly.
3. Added "dev.i915.perf_stream_paranoid=0" to /etc/sysctl.d/99-sysctl.conf, but did not notice any difference. Clocks are still at 300 MHz or lower.
4. Set governor='performance' and perf_bias=0 in /etc/default/cpupower (this probably only affects the CPU part of the chip, but I wanted to try it anyway).
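For reference, adding the force_probe parameters could look like this; a minimal sketch assuming GRUB is the bootloader (with systemd-boot the options line in /boot/loader/entries/*.conf would be edited instead):
# /etc/default/grub - append the parameters to the existing default line
GRUB_CMDLINE_LINUX_DEFAULT="... i915.force_probe=!9a49 xe.force_probe=9a49"
# regenerate the config and reboot
sudo grub-mkconfig -o /boot/grub/grub.cfg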
After all this did not help, I tried the Unigine Superposition benchmark. For some reason the clocks went up to about 750 MHz during the benchmark instead of the roughly 300 MHz seen during the games. (Maybe because Unigine SP uses OpenGL, while the games run through DXVK?) Anyway, that is of course still far too low.
I then wanted to make sure it's not a hardware or BIOS problem, so I installed Unigine Superposition on the Windows 11 installation that coexists as a dual-boot option on this notebook, kept exclusively for troubleshooting purposes. There, Unigine Superposition ran between 1100 MHz and 1200 MHz for the entire benchmark. Temperatures of the IGP and CPU are fine, by the way (both around 60 degrees C under full load).
What could be the reason for the GPU clock staying too low on Arch/DXVK?
Thank you!
Last edited by Elmario (2024-05-09 19:37:17)
I just noticed:
Already on the idle desktop, inxi -Fxxxrz shows a much higher CPU temperature than coretemp and ACPI (as shown by the sensors plugin in my xfce4-panel).
It's about 75 vs. 55 degrees C.
As the notebook on Windows also never shows temperatures above 62 degrees C, even while running a game for several minutes, I think inxi is wrong.
Maybe the Linux graphics driver relies on the same method as inxi to determine the chip temperature and thus thinks it is too hot to run at a higher clock? (That would pretty surely be a bug, as it runs great at full clock and at much lower temperatures on Windows.)
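To compare what the different tools might be reading, a small sketch (which thermal zone actually belongs to the CPU package differs per machine, so that is an assumption to verify):
# lm_sensors values (what the xfce4 panel plugin shows)
sensors
# raw kernel thermal zones in millidegrees Celsius (one possible source for other tools)
for z in /sys/class/thermal/thermal_zone*; do echo "$(cat $z/type): $(cat $z/temp)"; done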
Can someone help me, please?
Last edited by Elmario (2024-05-12 15:45:30)
Sigh.
Since the kernel update from two days ago, the xe driver has become totally useless. FPS have gone down to about 5 in Thief.
I had to switch back to the i915 driver to at least get about 15 fps in Thief.
Is there no chance for improvement left?
The kernel update also broke some other things, by the way, like the system no longer being able to boot because of mkinitcpio issues and vfat (for the EFI partition) not being found. I fixed that, but maybe something in the power/governor settings changed again as well (since the performance is now even worse than before)?
Last edited by Elmario (2024-05-20 02:44:34)
Cut out wine/proton, how does the GPU behave w/ an SDL game like https://archlinux.org/packages/extra/x86_64/warsow/ ?
Next
Switching from the fullscreen game to the desktop with Alt-Tab and back instantly resets the clock to the maximum (about 1300 MHz), only to drop again after about 3 to 5 seconds, every single time I do this.
What are your display server and compositor?
If in doubt, try the same behavior in an (uncomposited) X11/openbox session.
Hello!
The game runs fine: between 175 and 250 fps using the i915 driver and between 170 and 220 fps using the new xe driver (all in-game settings at default).
My setup is a standard XFCE-DE:
X11 / lightdm / xfwm4
I have Lutris configured to automatically disable compositing when a game is started; I checked, and it reliably disables the compositor.
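For the record, this is how the compositor state can be checked and toggled by hand on xfwm4 (nothing Lutris-specific assumed here):
# query whether xfwm4's compositor is currently enabled (true/false)
xfconf-query -c xfwm4 -p /general/use_compositing
# turn it off / on manually
xfconf-query -c xfwm4 -p /general/use_compositing -s false
xfconf-query -c xfwm4 -p /general/use_compositing -s true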
Thank you!
Last edited by Elmario (2024-05-21 12:38:41)
So more a steam/proton-related issue. Likely you end up on SW rendering.
Do you do anything like https://wiki.archlinux.org/title/Steam/ … l_hardware ?
Do you get the same w/ https://wiki.archlinux.org/title/Steam/ … _emulation ?
Do you have https://archlinux.org/packages/?name=lib32-vulkan-intel ?
What's the output of
vulkaninfo?
So more a steam/proton-related issue. Likely you end up on SW rendering.
I don't actually feel like it was that bad. But what do I know... not much.
Do you do anything like https://wiki.archlinux.org/title/Steam/ … l_hardware ?
Do you get the same w/ https://wiki.archlinux.org/title/Steam/ … _emulation ?
Not yet, as I am not using Steam. Would these variables apply if I entered them as environment variables in Lutris for the game in question? (Well, I will try it anyway right now!)
Do you have https://archlinux.org/packages/?name=lib32-vulkan-intel ?
Yes, this package is installed.
What's the output of vulkaninfo?
Vulkaninfo is here: vulkaninfo
Thank you very much, Seth!
PS: I moved the vulkaninfo output to snippet.host for readability of the thread, as the forum does not seem to support [spoiler] (or am I doing it wrong?).
PPS: In Lutris I just saw that one of these variables and some others were already set:
I will try decreasing the major and minor versions arbitrarily, as I have no clue which version numbers would actually make sense to try.
Last edited by Elmario (2024-05-21 13:10:04)
Don't override the GL version at all.
The vulkaninfo looks fine.
To add a reference point: Does the IGP step up (as expected) when running warsow?
Don't override the GL version at all.
Sorry, I misunderstood. I thought you were asking me to use the override.
I have now removed the pre-existing override to GL 4.4 COMPAT. This made no change to the game's performance.
To add a reference point: Does the IGP step up (as expected) when running warsow?
The highest I saw the GPU clock go within several minutes of playing was 700/1300 (act/req) as shown in intel_gpu_top, while it ran around 450 MHz most of the time.
That is still only about half the nominal clock (and far less than "requested"), but I think this might be because Warsow is probably fps-capped, as it always tops out at exactly 250 fps.
Thief (on Proton 9.0) runs at around 250/1300 to 300/1300 most of the time, with occasional spikes up to about 400 MHz, and this at just 1280x720 with around 15 to 25 fps.
So Warsow, while still not reaching the requested GPU clock, runs at a much higher clock than Thief.
I found that Warsow has an option to increase the fps cap to 500, but this did not increase performance, and the clocks are the same as with the cap set to 250.
https://bbs.archlinux.org/viewtopic.php?id=280492
https://bbs.archlinux.org/viewtopic.php?id=284961
https://bbs.archlinux.org/viewtopic.php?id=273501
Were all on hybrid systems w/ bad vulkan selection - try to export "PROTON_USE_WINED3D=1" - do you have other vulkan drivers installed?
ls /usr/share/vulkan/icd.d/
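If more than one ICD shows up there, pinning the Intel one for a test could look like this (a sketch; VK_ICD_FILENAMES is the Vulkan loader's standard override variable):
# restrict the loader to the 64-bit Intel ICD for this run and check which device is reported
VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/intel_icd.x86_64.json vulkaninfo | grep deviceName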
Hello!
There is no discrete GPU in my notebook; it only has the CPU's IGP.
I tried "PROTON_USE_WINED3D=1", but it didn't make much of a difference besides the Epic launcher's UI not displaying properly. The game itself runs the same as before.
I only have the Intel IGP driver installed, plus what I think is the Haswell-architecture variant:
ladmin@250g8:~$ls /usr/share/vulkan/icd.d/
intel_hasvk_icd.i686.json intel_icd.i686.json
intel_hasvk_icd.x86_64.json intel_icd.x86_64.json
The general behaviour looks to me like a power or thermal issue... but it runs fine on Windows, so that doesn't make sense.
Last edited by Elmario (2024-05-22 19:12:00)
Warsow steps up the IGP (it might be CPU-limited when trying to run at 500 fps), so it's not the hardware or a false sensor reading.
If proton on D3D emulation via opengl is slow, it's also not vulkan.
Still the cause must be related to proton - do you get the same performance issues w/ the regular wine?
Maybe we're also looking at the wrong component - do you use pipewire/wireplumber/pipewire-pulse?
https://bbs.archlinux.org/viewtopic.php … 0#p2172010
I am using pure PulseAudio; there is no PipeWire component installed, because I had multiple issues with PipeWire before.
As you mentioned a CPU limit, I did a check:
1. The game running at 640x480
2. The game running at 1920x1080
These tests were done using the bare system Wine (9.8).
So we may be getting closer: there is abnormally high CPU usage. I guess the 100% for the topmost entry in htop's process list means full utilization of one core by a single-threaded process.
I hadn't even thought of CPU bottlenecking yet, as nowadays the CPU is rarely the limiting factor, especially in combination with a puny IGP.
For the first few seconds after switching into the game's screen (while the fps are great), CPU usage for this topmost process is only about 30%.
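For reference, this is roughly how the single-thread suspicion can be checked; a sketch assuming pidstat from the sysstat package (htop's per-thread view, toggled with H, shows the same; GAME.exe is just a placeholder for the actual process name):
# per-thread CPU usage of the game's process, refreshed every second
pidstat -t -p "$(pgrep -f GAME.exe | head -n 1)" 1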
Last edited by Elmario (2024-05-23 17:03:51)
Must be something with wine. That computer should not be that slow. Try winetricks.
Is only that game affected or also other (all) wine games?
Must be something with wine. That computer should not be that slow. Try winetricks.
What would I do using winetricks?
Is only that game affected or also other (all) wine games?
Orcs Must Die 3 and Dungeon of Naheulbeuk both had the same issue. Rather simple game graphics, but I had to play them at around 15 fps.
Last edited by Elmario (2024-05-24 18:37:13)
OK: just out of desperation and while toying around, I switched DXVK_HUD to full.
That's how I saw that the reported DXVK version is always 1.10.1.
I thought maybe Lutris isn't working correctly, so I downloaded the latest DXVK (2.3.1) and manually copied the DLLs into the prefix's system32 and syswow64 directories. The DLLs that were in there before actually had a different size, even though Lutris is set to use DXVK 2.3.1 as well.
Then I used winecfg to set the four DXVK DLL overrides to native, which hadn't been done either. I found this strange, but maybe Lutris has another way of doing these things; usually, switching between Proton and DXVK versions in the Lutris UI does have a noticeable impact for me...
After these changes, I started the Epic Launcher using a command, just to be sure:
WINEPREFIX="/home/ladmin/Games/epic-games-store" /home/ladmin/.local/share/lutris/runners/wine/wine-ge-8-26-x86_64/bin/wine "/home/ladmin/Games/epic-games-store/drive_c/Program Files (x86)/Epic Games/Launcher/Portal/Binaries/Win64/EpicGamesLauncher.exe"
But DXVK still shows version 1.10.1!
I don't know whether the outdated version 1.10.1 would have such a negative impact on performance, but either way, this is not how it's supposed to be.
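One more way to test whether the native DXVK DLLs are actually picked up, a minimal sketch using Wine's standard override variable (plain system wine shown for brevity; the wine-ge binary from the command above should behave the same):
# force the D3D DLLs to "native" for this single run, bypassing winecfg/Lutris settings
WINEDLLOVERRIDES="dxgi,d3d9,d3d10core,d3d11=n" \
WINEPREFIX="$HOME/Games/epic-games-store" \
wine "C:\\Program Files (x86)\\Epic Games\\Launcher\\Portal\\Binaries\\Win64\\EpicGamesLauncher.exe"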
Last edited by Elmario (2024-05-25 22:32:22)
https://forums.lutris.net/t/solved-why- … vapi/14247 makes it look like there's a lutris setting to disable dxvk?
Does that make any difference?
In Lutris you can set just about any option and parameter for running executables in Wine/Proton/Wine-GE with any arbitrary DXVK version and the needed add-ons.
And usually this worked before. The issue of my changes having no effect is new, and in my previous try (in post #18) I did not even use Lutris anymore - and still the DXVK version would not change from 1.10.1, not even when not using DXVK at all (which is the case you were asking about).
But: after fiddling around yesterday, toggling everything from enabled to disabled and back and changing versions, the shown DXVK version changed to 2.3.1 - and has stayed there since. It did not have any noticeable or measurable impact on the game's performance.
But I made another observation thanks to DXVK_HUD now being set to 'full': the 100% CPU load issue I reported is misleading, or at least it's not the reason for the bad fps. It's just shaders being compiled, which, depending on where one is standing in the game, can take up to about 5 minutes using all 8 cores at 100%. After those 5 minutes shader compilation is mostly done and the CPU load goes down to normal levels, around 30%.
I also noticed the Epic launcher complaining that the required Windows version is at least 'Windows 10 64bit' - which is what the prefix is already set to.
Maybe the prefix is simply corrupted somehow - so I think I will now install Epic in a completely new prefix and see if this changes anything.
Last edited by Elmario (2024-05-26 20:49:09)
OK: I completely uninstalled lutris, wine, winetricks, wine-gecko and wine-mono with 'pacman -Rns', manually deleted all remaining files and directories, and reinstalled with the --overwrite option.
Now switching from DXVK to WineD3D and between different DXVK versions works reliably again.
Performance-wise it's all the same as before. With DXVK disabled, the game runs badly from the start and also shows graphical errors after a few minutes. The Epic Launcher has some issues without DXVK too, like invisible text. Both are normal in my experience without DXVK, even on my old notebook with an Nvidia GPU.
With DXVK enabled, it runs great for a few seconds and then drops to around 17 fps.
Last edited by Elmario (2024-05-26 22:34:03)
Ignore the context - does it help to disable the power-saving features described here:
https://wiki.archlinux.org/title/Intel_ … Intel_CPUs
What if you keep an (unsynced) glxgears running next to the game?
(Does that generally prevent the IGP from stepping down?)
The idea is to use glxgears to keep the IGP busy and at higher frequencies, and let the game benefit from that.
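Roughly like this; vblank_mode=0 is Mesa's environment switch for disabling vsync (a sketch, assuming the iris driver honours it):
# run glxgears without vsync so it keeps the IGP busy
vblank_mode=0 glxgears
# in another terminal, watch whether the actual frequency stays up
sudo intel_gpu_top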
Just to make sure we are not both wasting a lot of time here, I tried the actual game on Windows (as a reminder: so far I only had the Unigine Superposition benchmark for comparison, not a real game).
But yes: the game runs great on Windows, fluid and at 1920x1080. The GPU stays mostly between 1200 and 1300 MHz.
I tried the glxgears trick, but the GPU did not reach a sufficient frequency that way either.
glxgears alone, as well as combined with the game, makes the GPU run at about 500 to 600 MHz.
I will try disabling the power-saving features next.
Edit: It did not help.
Kernel output is here: dmesg
There are some warnings about i915... No idea whether these are important.
Last edited by Elmario (2024-05-27 19:01:16)
https://snippet.host/ouongb is the vulkaninfo
https://archlinux.org/packages/extra/x8 … gpu-tools/
https://manpages.ubuntu.com/manpages/ja … ncy.1.html (upstream doesn't provide a manpage, can't tell whether the ubuntu one is -still- accurate)
Sorry, I corrected the link (it's https://snippet.host/jerkkw )
I tried intel_gpu_frequency. It's there and it accepts the parameters. Using it, we can see a discrepancy again:
ladmin@250g8:~$sudo intel_gpu_frequency -m
ladmin@250g8:~$sudo intel_gpu_frequency -g
cur: 1267 MHz
min: 1300 MHz
RP1: 350 MHz
max: 1300 MHz
So intel_gpu_frequency thinks the GPU is running at max clock.
But intel_gpu_top and the low game performance say otherwise.
I issued sudo intel_gpu_frequency -m (for a consistent max clock) several times during the seconds that intel_gpu_top was logging in the other window.
To no effect. I also tried -s 1200 to set the frequency to 1200 MHz directly. No effect either.
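For completeness, the same can be attempted directly through the i915 sysfs knobs; a hedged sketch (the card0 path is an assumption, and whether the firmware honours the raised minimum is exactly what is in question here):
# raise the minimum and boost frequency to the hardware maximum, then recheck the actual clock in-game
echo 1300 | sudo tee /sys/class/drm/card0/gt_min_freq_mhz
echo 1300 | sudo tee /sys/class/drm/card0/gt_boost_freq_mhz
cat /sys/class/drm/card0/gt_act_freq_mhz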