
#1 2021-01-02 21:17:40

jlindgren
Member
Registered: 2011-02-27
Posts: 256

High idle CPU/power usage with intel+nvidia reverse PRIME

I have an HP laptop (Zbook 15 G3) which has the internal display connected to the Intel GPU and HDMI-out connected to the Nvidia GPU.
With some xorg.conf magic, I'm able to get good performance from a traditional X11 multi-head setup (laptop display on :0.0 w/ modesetting and HDMI on :0.1 w/ nvidia).
However, without xorg.conf, X starts up in a reverse PRIME setup where (if I understand it correctly) the Intel GPU draws everything and the Nvidia GPU simply copies framebuffers.
This setup (reverse PRIME) would be great except that it seems to have a significant performance impact.

With traditional multi-head (customized xorg.conf), Xorg CPU usage idles at <1%.
powertop reports 9-10 W power draw on battery, with ~25 events/s from "irq/134-nvidia".

With reverse PRIME (all defaults, no xorg.conf), Xorg CPU usage idles at 45-50% of one core.
powertop reports 14-17 W power draw on battery, with ~200 events/s from "irq/134-nvidia" and ~60 events/s from "nvidia-modeset".

The laptop display is 1920x1080 and the HDMI monitor is 2560x1440.
I normally run picom, but the power draw is about the same with or without picom running.

I'm using nvidia-beta-dkms 460.27.04-1 from AUR (hence posting in "AUR Issues") with linux 5.10.3.arch1-1, but this doesn't seem to be a new issue with 460.x.
nvidia from extra (455.45.01) also had similar performance problems with linux 5.9 (and linux 5.10 broke reverse PRIME completely with that driver).

I can post logs, configs, etc., but for starters, I'm wondering if this kind of performance hit is typical/expected with reverse PRIME?

The ~60 events/s from "nvidia-modeset" makes me suspect that the Nvidia GPU is constantly copying frames from the Intel GPU even at idle.
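A rough way to check that suspicion (generic commands, nothing specific to this machine; at a truly idle desktop the counters should stay nearly flat):

$ watch -n1 'grep nvidia /proc/interrupts'   # per-CPU interrupt counts for the nvidia IRQs
$ nvidia-smi dmon -s u                       # per-second GPU / memory / encode / decode utilization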


#2 2021-01-02 22:37:58

Lone_Wolf
Member
From: Netherlands, Europe
Registered: 2005-10-04
Posts: 11,866

Re: High idle CPU/power usage with intel+nvidia reverse PRIME

Your terms are incorrect.

On a hybrid graphics system without configuration, X starts up with the intel gpu as primary and the nvidia as secondary gpu.
In that mode only the displays connected to the primary work.
The secondary gpu can use prime render offloading to render programs whose output is then shown on the displays connected to the primary.

When configured for reverse prime, the primary gpu can also send images to the displays connected to the secondary gpu, in addition to its own.
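For illustration only (generic commands from the usual PRIME documentation; provider names vary per driver, NVIDIA-G0 is just an example):

$ __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo -B   # render offload: run a single program on the secondary gpu
$ xrandr --setprovideroutputsource NVIDIA-G0 modesetting && xrandr --auto   # reverse prime: use the secondary gpu's outputs as a sink for the primary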

Before we can comment on anything we need to understand how you have set things up. Please post, for the "traditional X11 multi-head setup":
- the config files
- the Xorg log
- the output of

$ xrandr --listproviders
$ glxinfo -B

For X without a config:
- the Xorg log
- the output of

$ xrandr --listproviders
$ glxinfo -B


#3 2021-01-02 23:22:41

jlindgren
Member
Registered: 2011-02-27
Posts: 256

Re: High idle CPU/power usage with intel+nvidia reverse PRIME

Lone_Wolf wrote:

Your terms are incorrect.

Quite likely, not sure which ones though.

Without xorg.conf:

$ xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x49 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 3 outputs: 7 associated providers: 1 name:modesetting
Provider 1: id: 0x2bf cap: 0x2, Sink Output crtcs: 4 outputs: 6 associated providers: 1 name:NVIDIA-G0

$ glxinfo -B
name of display: :0
display: :0  screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
    Vendor: Intel (0x8086)
    Device: Mesa Intel(R) HD Graphics 530 (SKL GT2) (0x191b)
    Version: 20.3.2
    Accelerated: yes
    Video memory: 3072MB
    Unified memory: yes
    Preferred profile: core (0x1)
    Max core profile version: 4.6
    Max compat profile version: 4.6
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.2
OpenGL vendor string: Intel
OpenGL renderer string: Mesa Intel(R) HD Graphics 530 (SKL GT2)
OpenGL core profile version string: 4.6 (Core Profile) Mesa 20.3.2
OpenGL core profile shading language version string: 4.60
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile

OpenGL version string: 4.6 (Compatibility Profile) Mesa 20.3.2
OpenGL shading language version string: 4.60
OpenGL context flags: (none)
OpenGL profile mask: compatibility profile

OpenGL ES profile version string: OpenGL ES 3.2 Mesa 20.3.2
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20

Log: https://gist.github.com/jlindgren90/c2f … 5da30dede6

My xorg.conf:

Section "ServerFlags"
	Option      "AutoAddGPU" "0"
	Option      "AutoBindGPU" "0"
	Option      "DefaultServerLayout" "Layout0"
EndSection

Section "ServerLayout"
	Identifier  "Layout0"
	Screen 0    "Screen0" 0 0
	Screen 1    "Screen1" RightOf "Screen0"
EndSection

Section "Monitor"
	Identifier  "IgnoredMonitor0"
	Option      "Ignore" "1"
EndSection

Section "Device"
	Identifier  "Card0"
	Driver      "modesetting"
	BusID       "PCI:0:2:0"
	Option      "Monitor-HDMI-1" "IgnoredMonitor0"
EndSection

Section "Device"
	Identifier  "Card1"
	Driver      "nvidia"
	BusID       "PCI:1:0:0"
EndSection

Section "Screen"
	Identifier  "Screen0"
	Device      "Card0"
EndSection

Section "Screen"
	Identifier  "Screen1"
	Device      "Card1"
	SubSection  "Display"
		Virtual 2560 1440
	EndSubSection
EndSection

With that xorg.conf:

$ xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x48 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 3 outputs: 6 associated providers: 0 name:modesetting

$ glxinfo -B
name of display: :0
libGL error: No matching fbConfigs or visuals found
libGL error: failed to load driver: swrast
display: :0  screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
    Vendor: Intel (0x8086)
    Device: Mesa Intel(R) HD Graphics 530 (SKL GT2) (0x191b)
    Version: 20.3.2
    Accelerated: yes
    Video memory: 3072MB
    Unified memory: yes
    Preferred profile: core (0x1)
    Max core profile version: 4.6
    Max compat profile version: 4.6
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.2
OpenGL vendor string: Intel
OpenGL renderer string: Mesa Intel(R) HD Graphics 530 (SKL GT2)
OpenGL core profile version string: 4.6 (Core Profile) Mesa 20.3.2
OpenGL core profile shading language version string: 4.60
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile

OpenGL version string: 4.6 (Compatibility Profile) Mesa 20.3.2
OpenGL shading language version string: 4.60
OpenGL context flags: (none)
OpenGL profile mask: compatibility profile

OpenGL ES profile version string: OpenGL ES 3.2 Mesa 20.3.2
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20



display: :0  screen: 1
direct rendering: Yes
Memory info (GL_NVX_gpu_memory_info):
    Dedicated video memory: 2048 MB
    Total available memory: 2048 MB
    Currently available dedicated video memory: 1877 MB
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: Quadro M1000M/PCIe/SSE2
OpenGL core profile version string: 4.6.0 NVIDIA 460.27.04
OpenGL core profile shading language version string: 4.60 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile

OpenGL version string: 4.6.0 NVIDIA 460.27.04
OpenGL shading language version string: 4.60 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)

OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 460.27.04
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20

Log: https://gist.github.com/jlindgren90/19c … 18a564dd60


#4 2021-01-03 00:33:31

Lone_Wolf
Member
From: Netherlands, Europe
Registered: 2005-10-04
Posts: 11,866

Re: High idle CPU/power usage with intel+nvidia reverse PRIME

Ok, the setup without xorg.conf is a typical hybrid graphics setup.
The igpu drives the internal screen, and if more performance is needed, prime render offload can be used.
The external screen connected to the nvidia card is not accessible in this setup.

(By adding the necessary configuration for reverse prime you can let the intel gpu access the external screen, but doing this disables prime offload.)

This setup is mainly used to conserve power: the less performant integrated card is used by default and the discrete card only when really needed.


Your second setup is one I haven't seen before. No idea what to call it, but it's definitely not a "traditional X11 multi-head setup".

Screen 0 is connected over hdmi to the intel card, and you tell xrandr to ignore it.
Screen 1 is connected to the nvidia card and seems to be a virtual screen.

xrandr only sees 1 provider, the intel card, which doesn't have a monitor connected to it as far as xrandr is concerned.

The glxinfo -B output shows the intel card can only do very basic stuff, since libGL for it isn't working.
Your nvidia card, however, has full GL / GLX support.

Frankly, I have no clue how X manages to combine those settings into a working setup.

Can both screens display in that setup?
What do you want to achieve with it?

The usual methods for a hybrid graphics setup with intel+nvidia are listed at https://wiki.archlinux.org/index.php/NV … le_methods.



#5 2021-01-03 01:56:51

jlindgren
Member
Registered: 2011-02-27
Posts: 256

Re: High idle CPU/power usage with intel+nvidia reverse PRIME

Lone_Wolf wrote:

The external screen connected to the nvidia card is not accessible in this setup.

Are you very, very sure of that? The facts seem to disagree with you :) I see stuff displayed on that external screen (in both setups).

If you're not aware, note that Arch (following Fedora) patches X.org to enable reverse PRIME automatically nowadays:
https://github.com/archlinux/svntogit-p … plug.patch

Lone_Wolf wrote:

Screen 0 is connected over hdmi to the intel card, and you tell xrandr to ignore it.
Screen 1 is connected to the nvidia card and seems to be a virtual screen.

So if you look at the first Xorg.0.log (without xorg.conf) closely, you'll see that both the intel and nvidia cards can "see" the external HDMI monitor during init. There is a single physical connector which is muxed between the two cards; intel calls it "HDMI-1" and nvidia calls it "DFP-1".

[     9.614] (II) modeset(0): EDID for output HDMI-1
[     9.614] (II) modeset(0): Manufacturer: LEN  Model: 65d2  Serial#: 16843009
...
[     9.615] (II) modeset(0): Monitor name: L24q-20
...
[    10.819] (--) NVIDIA(GPU-0): Lenovo Group Limited L24q-20 (DFP-1): connected

Once the nvidia driver initializes, it gets control of the mux (I am not sure of the exact mechanism for this), and it is video from the nvidia card that ends up being displayed. HDMI-1 becomes disconnected according to xrandr, and the monitor displays what the nvidia card is pushing on DFP-1.

In the default setup (no xorg.conf), DFP-1 (driven by the nvidia card) becomes part of screen :0.0 along with eDP-1 (driven by the intel card) via reverse PRIME, which as I noted is enabled automatically in Arch nowadays.

In the xorg.conf setup, I have disabled reverse PRIME and set up the nvidia card's outputs (including DFP-1) on screen :0.1, while the intel outputs (including eDP-1) remain on screen :0.0. I tell the intel card to ignore HDMI-1 because otherwise it appears as connected briefly during init (before the nvidia driver takes over the mux) and messes with the multi-head layout.

I appreciate the effort to help, but it's a complicated setup and I think you're making some conclusions about it that aren't accurate. Therefore I'm not going to respond to the rest of your post until some of the confusion is cleared up and we're on the same page about how things are actually working.

(edit: typo)

Last edited by jlindgren (2021-01-03 02:04:20)


#6 2021-01-03 02:14:01

jlindgren
Member
Registered: 2011-02-27
Posts: 256

Re: High idle CPU/power usage with intel+nvidia reverse PRIME

In case it makes things clearer ...

Without xorg.conf:

$ xrandr | grep connected
eDP-1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
DP-1 disconnected (normal left inverted right x axis y axis)
HDMI-1 disconnected 2560x1440+1920+0 (normal left inverted right x axis y axis) 0mm x 0mm
DP-2 disconnected (normal left inverted right x axis y axis)
HDMI-2 disconnected (normal left inverted right x axis y axis)
DP-3 disconnected (normal left inverted right x axis y axis)
HDMI-3 disconnected (normal left inverted right x axis y axis)
DP-1-0 disconnected (normal left inverted right x axis y axis)
DP-1-1 connected primary 2560x1440+1920+0 (normal left inverted right x axis y axis) 527mm x 296mm
DP-1-2 disconnected (normal left inverted right x axis y axis)
DP-1-3 disconnected (normal left inverted right x axis y axis)
DP-1-4 disconnected (normal left inverted right x axis y axis)
DP-1-5 disconnected (normal left inverted right x axis y axis)

With xorg.conf:

$ DISPLAY=:0.0 xrandr | grep connected
eDP-1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
DP-1 disconnected (normal left inverted right x axis y axis)
DP-2 disconnected (normal left inverted right x axis y axis)
HDMI-2 disconnected (normal left inverted right x axis y axis)
DP-3 disconnected (normal left inverted right x axis y axis)
HDMI-3 disconnected (normal left inverted right x axis y axis)

$ DISPLAY=:0.1 xrandr | grep connected
DP-0 disconnected (normal left inverted right x axis y axis)
DP-1 connected primary 2560x1440+0+0 (normal left inverted right x axis y axis) 527mm x 296mm
DP-2 disconnected (normal left inverted right x axis y axis)
DP-3 disconnected (normal left inverted right x axis y axis)
DP-4 disconnected (normal left inverted right x axis y axis)
DP-5 disconnected (normal left inverted right x axis y axis)


#7 2021-01-03 09:48:41

seth
Member
Registered: 2012-09-03
Posts: 49,946

Re: High idle CPU/power usage with intel+nvidia reverse PRIME

Let's call that a "legacy" setup ;-)
(You've two independent displays/screens on the same server, one per GPU. Notably you should NOT be able to control them w/ the same WM or move windows among them)

How do you set up the reverse prime? "xrandr --setprovideroutputsource NVIDIA-G0 modesetting"?

How's the CPU/power behavior when running on the nvidia chip and abusing the intel one as crtc for the eDP?
https://wiki.archlinux.org/index.php/PR … rimary_GPU
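Roughly (untested sketch; the provider name follows the wiki and may differ on your box): nvidia is made the primary device in xorg.conf, and the eDP panel is then reached through the intel crtc with

$ xrandr --setprovideroutputsource modesetting NVIDIA-0
$ xrandr --auto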


#8 2021-01-03 10:39:49

jlindgren
Member
Registered: 2011-02-27
Posts: 256

Re: High idle CPU/power usage with intel+nvidia reverse PRIME

seth wrote:

Let's call that a "legacy" setup ;-)
(You've two independent displays/screens on the same server, one per GPU. Notably you should NOT be able to control them w/ the same WM or move windows among them)

Yes, exactly. "Legacy" is more or less what I meant by "traditional" :)

seth wrote:

How do you set up the reverse prime? "xrandr --setprovideroutputsource NVIDIA-G0 modesetting"?

X.org sets up reverse PRIME automatically due to the patch I linked earlier (the "AutoBindGPU" option).

seth wrote:

How's the CPU/power behavior when running on the nvidia chip and abusing the intel one as crtc for the eDP?
https://wiki.archlinux.org/index.php/PR … rimary_GPU

Which would be "PRIME" rather than "reverse PRIME"? I haven't tried that yet, but it would be interesting, if only for another data point.
I will try it when I get some more time to poke at this.

At the moment, I am testing the "discrete only" option in the BIOS, which disables the intel card completely. So the nvidia card is driving both the internal panel and HDMI directly.
CPU usage at idle is good (<1%) and events/s is low. Power is around 11 W (so 1-2 W higher than my "legacy" configuration), which I can live with.
I'm thinking this is my best option for the present.

My takeaway from this is that hybrid graphics are still more of a headache than the (alleged) power savings are worth.
Hopefully in a few years, my next laptop will have a Ryzen/Vega chip, and PRIME will be a thing of the past.


#9 2021-01-03 11:55:10

Lone_Wolf
Member
From: Netherlands, Europe
Registered: 2005-10-04
Posts: 11,866

Re: High idle CPU/power usage with intel+nvidia reverse PRIME

jlindgren wrote:

I appreciate the effort to help, but it's a complicated setup and I think you're making some conclusions about it that aren't accurate. Therefore I'm not going to respond to the rest of your post until some of the confusion is cleared up and we're on the same page about how things are actually working.

emphasis by me

I doubt that will happen, good luck.
Exit stage left.



#10 2021-01-03 13:36:47

seth
Member
Registered: 2012-09-03
Posts: 49,946

Re: High idle CPU/power usage with intel+nvidia reverse PRIME

the "AutoBindGPU" option

Ah, yeah.
I'd check the (CPU) behavior w/ that disabled and the output source explicitly redirected w/ xrandr, to make sure the CPU overhead isn't coming from the autobinding.
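A minimal untested sketch (output and provider names taken from your xrandr output above): start X with Option "AutoBindGPU" "0", then bind and lay out the external screen by hand:

$ xrandr --setprovideroutputsource NVIDIA-G0 modesetting
$ xrandr --output DP-1-1 --auto --right-of eDP-1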

Switching to the dedicated GPU in the BIOS will typically be better than "perma-priming". In any event, you won't be able to completely power down the nvidia GPU (because its crtc is required), so increased power consumption is to be expected, but the higher CPU load looks "wrong".


#11 2021-05-04 19:40:19

JoeLau
Member
Registered: 2021-05-04
Posts: 1

Re: High idle CPU/power usage with intel+nvidia reverse PRIME

jlindgren wrote:

This setup (reverse PRIME) would be great except that it seems to have a significant performance impact.

I'm here to say I'm with you on that one, and I believe you. I seem to have a similar setup with reverse PRIME and similar symptoms, although I'm on Debian. I also filed a bug report there, which is unanswered but explains my situation: https://bugs.debian.org/cgi-bin/bugrepo … bug=976380.

This recent merge request in xorg-server might fix the syncing: https://gitlab.freedesktop.org/xorg/xse … quests/460. At least it seems to fix the 1 fps bug when disabling the primary screen. Who knows when these changes will be released, so I might try building the server from source. Any news on your end?
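In case it's useful, a rough sketch of how I'd try that on Debian (untested; the patch file name is only a placeholder):

$ apt-get source xorg-server && sudo apt-get build-dep xorg-server
$ cd xorg-server-*
$ patch -p1 < ../mr-460.patch   # the merge request downloaded as a plain patch (hypothetical file name)
$ dpkg-buildpackage -us -uc -b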

