#1 2022-05-23 15:59:23

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Graphic card gets hot

My GPU is an RTX 3080 Ti.

Previously I had two monitors (4k@60Hz + 2k@165Hz), and the graphics card stayed cool pretty much all the time; its fans didn't even spin up (except when I played 3D games).
But today I added a third monitor (1080p@60Hz), and the card started getting hot and the fans started spinning. At first I thought the third monitor was simply adding more work for the GPU, so I turned the 2k@165Hz monitor down to 60Hz, but that did not help: the fans are still spinning and the card still gets hot.

Why is that? I would think dropping the 2k monitor from 165Hz to 60Hz should save much more work than adding a 1080p@60Hz monitor costs, so things should be better than with just two monitors, shouldn't they?

Offline

#2 2022-05-23 16:45:07

V1del
Forum Moderator
Registered: 2012-10-16
Posts: 21,671

Re: Graphic card gets hot

It's generally normal for cards to clock higher by default with more monitors. What does "gets hot" actually mean in verifiable numbers? Output of nvidia-smi?
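Something along these lines should give comparable numbers over time (a rough sketch; the query fields are standard nvidia-smi options, adjust as needed):

# temperature, fan, power draw, P-state and utilization, refreshed every 2 seconds
nvidia-smi --query-gpu=temperature.gpu,fan.speed,power.draw,pstate,utilization.gpu --format=csv -l 2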

Online

#3 2022-05-23 16:59:36

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

V1del wrote:

It's generally normal for cards to clock higher by default with more monitors. What does "gets hot" actually mean in verifiable numbers? Output of nvidia-smi?

It is about 54-56C

Mon May 23 18:00:05 2022       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 515.43.04    Driver Version: 515.43.04    CUDA Version: 11.7     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0  On |                  N/A |
| 35%   54C    P0    91W / 350W |   1113MiB / 12288MiB |      7%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A       619      G   /usr/lib/Xorg                     534MiB |
|    0   N/A  N/A       765      G   /usr/bin/kwin_x11                 176MiB |
|    0   N/A  N/A       826      G   /usr/bin/plasmashell              159MiB |
|    0   N/A  N/A      1126      G   ...akonadi_archivemail_agent        4MiB |
|    0   N/A  N/A      1136      G   .../akonadi_mailfilter_agent        4MiB |
|    0   N/A  N/A      1141      G   ...n/akonadi_sendlater_agent        4MiB |
|    0   N/A  N/A      1142      G   ...nadi_unifiedmailbox_agent        4MiB |
|    0   N/A  N/A      3215      G   /usr/lib/firefox/firefox          164MiB |
+-----------------------------------------------------------------------------+

Last edited by yuanhao (2022-05-23 17:00:36)

Offline

#4 2022-05-23 18:38:43

seth
Member
Registered: 2012-09-03
Posts: 51,056

Re: Graphic card gets hot

Multihead layouts make the nvidia driver activate the full composition pipeline - do you get similar effects w/ two outputs but w/ ForceFullCompositionPipeline enabled?
https://wiki.archlinux.org/title/NVIDIA … en_tearing
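For a quick test without editing xorg.conf, something roughly like this should do (a sketch based on the wiki page above; the output names and offsets are placeholders, use the ones your dual-head layout actually has):

nvidia-settings --assign CurrentMetaMode="DP-0: nvidia-auto-select +0+0 {ForceFullCompositionPipeline=On}, DP-2: nvidia-auto-select +3840+0 {ForceFullCompositionPipeline=On}"

It only applies to the running X server, so the default behaviour comes back after a restart.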

Offline

#5 2022-05-23 22:31:51

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

seth wrote:

Multihead layouts make the nvidia driver activate the full composition pipeline - do you get similar effects w/ two outputs but w/ ForceFullCompositionPipeline enabled?
https://wiki.archlinux.org/title/NVIDIA … en_tearing

I checked in nvidia-settings: ForceFullCompositionPipeline is not enabled with either two or three outputs, so that might not be the reason?
As for nvidia-smi, this is the output with two monitors:

Mon May 23 23:16:19 2022       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 515.43.04    Driver Version: 515.43.04    CUDA Version: 11.7     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0  On |                  N/A |
| 35%   36C    P5    31W / 350W |    738MiB / 12288MiB |      8%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A       614      G   /usr/lib/Xorg                     399MiB |
|    0   N/A  N/A       762      G   /usr/bin/kwin_x11                 139MiB |
|    0   N/A  N/A       815      G   /usr/bin/plasmashell              144MiB |
|    0   N/A  N/A      1136      G   ...akonadi_archivemail_agent        4MiB |
|    0   N/A  N/A      1151      G   .../akonadi_mailfilter_agent        4MiB |
|    0   N/A  N/A      1173      G   ...n/akonadi_sendlater_agent        4MiB |
|    0   N/A  N/A      1176      G   ...nadi_unifiedmailbox_agent        4MiB |
+-----------------------------------------------------------------------------+

and this is the output with three monitors:

Mon May 23 23:28:49 2022       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 515.43.04    Driver Version: 515.43.04    CUDA Version: 11.7     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0  On |                  N/A |
| 35%   53C    P0    92W / 350W |   1208MiB / 12288MiB |     12%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A       614      G   /usr/lib/Xorg                     493MiB |
|    0   N/A  N/A       762      G   /usr/bin/kwin_x11                 168MiB |
|    0   N/A  N/A       815      G   /usr/bin/plasmashell              157MiB |
|    0   N/A  N/A      1136      G   ...akonadi_archivemail_agent        4MiB |
|    0   N/A  N/A      1151      G   .../akonadi_mailfilter_agent       56MiB |
|    0   N/A  N/A      1173      G   ...n/akonadi_sendlater_agent        4MiB |
|    0   N/A  N/A      1176      G   ...nadi_unifiedmailbox_agent        4MiB |
|    0   N/A  N/A      1646      G   /usr/lib/firefox/firefox          196MiB |
+-----------------------------------------------------------------------------+

The change is that the power draw increased from 31W to 92W, the temperature increased from 36°C to 53°C, and the Perf state went from P5 to P0.

Is that normal just for adding a 1080p@60Hz monitor?

Offline

#6 2022-05-23 23:25:47

seth
Member
Registered: 2012-09-03
Posts: 51,056

Re: Graphic card gets hot

The driver doesn't activate that setting, but it does activate the behavior that the setting would enforce - so the idea is to test whether enabling the setting (and thereby forcing the full composition pipeline) on the dual-head setup causes the same behavior you see on the triple-head setup.

Offline

#7 2022-05-24 10:03:31

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

seth wrote:

The driver doesn't activate that setting, but it does activate the behavior that the setting would enforce - so the idea is to test whether enabling the setting (and thereby forcing the full composition pipeline) on the dual-head setup causes the same behavior you see on the triple-head setup.

I enabled the full composition pipeline with two monitors, and nvidia-smi outputs this:

Tue May 24 11:00:14 2022       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 515.43.04    Driver Version: 515.43.04    CUDA Version: 11.7     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0  On |                  N/A |
| 35%   37C    P5    32W / 350W |   1079MiB / 12288MiB |     20%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A       608      G   /usr/lib/Xorg                     483MiB |
|    0   N/A  N/A       756      G   /usr/bin/kwin_x11                 158MiB |
|    0   N/A  N/A       842      G   /usr/bin/plasmashell              144MiB |
|    0   N/A  N/A      1126      G   ...akonadi_archivemail_agent        4MiB |
|    0   N/A  N/A      1134      G   .../akonadi_mailfilter_agent        4MiB |
|    0   N/A  N/A      1139      G   ...n/akonadi_sendlater_agent        4MiB |
|    0   N/A  N/A      1140      G   ...nadi_unifiedmailbox_agent        4MiB |
|    0   N/A  N/A      1573      G   nvidia-settings                     0MiB |
|    0   N/A  N/A      1615      G   /usr/lib/firefox/firefox          172MiB |
+-----------------------------------------------------------------------------+

The temperature and power only increased a little, but the Volatile GPU-Util is larger than ever.

Offline

#8 2022-05-24 12:39:55

seth
Member
Registered: 2012-09-03
Posts: 51,056

Re: Graphic card gets hot

What if you set all three outputs in semi-clone mode (i.e. have them all overlapping, starting at 0,0)?
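Roughly like this, for example (a sketch; the output names are placeholders, take the real ones from xrandr -q):

xrandr --output DP-4 --pos 0x0 --output DP-2 --pos 0x0 --output HDMI-0 --pos 0x0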

Offline

#9 2022-05-24 13:48:31

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

seth wrote:

What if you set all three outputs in semi-clone mode (i.e. have them all overlapping, starting at 0,0)?

If I stack all three at (0,0), nothing changes in nvidia-smi.

But when I switch the 1080p monitor from 60Hz to 50Hz, the power draw decreases by about 30W (from more than 90W to 64W) and the temperature drops from about 55°C to 48°C...

Offline

#10 2022-05-24 14:07:06

seth
Member
Registered: 2012-09-03
Posts: 51,056

Re: Graphic card gets hot

With the 2k at 165Hz or at 60Hz?
What's the impact of suspending the kwin compositor (SHIFT+Alt+F12)?
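If the shortcut doesn't seem to stick, the compositor can (if I remember the Plasma 5 D-Bus interface correctly) also be suspended and resumed from a terminal:

qdbus org.kde.KWin /Compositor suspend
qdbus org.kde.KWin /Compositor resume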

Offline

#11 2022-05-24 14:28:42

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

seth wrote:

With the 2k at 165Hz or at 60Hz?
What's the impact of suspending the kwin compositor (SHIFT+Alt+F12)?

With the 2k monitor at 60Hz (but the earlier 91W reading was also at 60Hz).
Pressing SHIFT+Alt+F12 makes it briefly jump to 101W and flicker, then it comes back to the same values as before (as if it had not been pressed).

Offline

#12 2022-05-25 06:29:25

seth
Member
Registered: 2012-09-03
Posts: 51,056

Re: Graphic card gets hot

You're not in persistence mode but the GPU doesn't get out of P0…
Check the PowerMizer settings in nvidia-settings and try to set the preferred mode to adaptive (it's likely auto, and the current mode is performance?)
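That can also be checked and changed from a terminal, roughly like this (a sketch; 0 should be the adaptive mode, but query the attribute first to see the values your driver reports):

nvidia-settings -q "[gpu:0]/GpuPowerMizerMode"
nvidia-settings -a "[gpu:0]/GpuPowerMizerMode=0"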

Offline

#13 2022-05-25 10:22:21

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

seth wrote:

You're not in persistence mode but the GPU doesn't get out of P0…
Check the PowerMizer settings in nvidia-settings and try to set the preferred mode to adaptive (it's likely auto, and the current mode is performance?)

I just checked: it is auto, and the current mode is adaptive.

Also, I just tried Windows, which might be helpful (I have it installed on another SSD in this desktop). With 4k@60 + 2k@60 + 1080p@60, nvidia-smi outputs this:

Wed May 25 11:07:20 2022
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 512.77       Driver Version: 512.77       CUDA Version: 11.6     |
|-------------------------------+----------------------+----------------------+
| GPU  Name            TCC/WDDM | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ... WDDM  | 00000000:01:00.0  On |                  N/A |
|  0%   34C    P8    27W / 350W |    857MiB / 12288MiB |      1%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A       892    C+G   ...ekyb3d8bbwe\YourPhone.exe    N/A      |
|    0   N/A  N/A      6644    C+G   ...artMenuExperienceHost.exe    N/A      |
|    0   N/A  N/A      6992    C+G   ...batNotificationClient.exe    N/A      |
|    0   N/A  N/A      8728    C+G   C:\Windows\explorer.exe         N/A      |
|    0   N/A  N/A     10264    C+G   ...n1h2txyewy\SearchHost.exe    N/A      |
|    0   N/A  N/A     10616    C+G   ...ram Files\LGHUB\lghub.exe    N/A      |
|    0   N/A  N/A     11728    C+G   ...y\ShellExperienceHost.exe    N/A      |
|    0   N/A  N/A     12404    C+G   ...cw5n1h2txyewy\LockApp.exe    N/A      |
|    0   N/A  N/A     14568    C+G   ...8bbwe\WindowsTerminal.exe    N/A      |
|    0   N/A  N/A     15328    C+G   ...2txyewy\TextInputHost.exe    N/A      |
|    0   N/A  N/A     15408    C+G   ...perience\NVIDIA Share.exe    N/A      |
|    0   N/A  N/A     17168    C+G   ...bin\jetbrains-toolbox.exe    N/A      |
|    0   N/A  N/A     17184    C+G   ...zilla Firefox\firefox.exe    N/A      |
|    0   N/A  N/A     20216    C+G   ...210.47\msedgewebview2.exe    N/A      |
|    0   N/A  N/A     22116    C+G   ...obeNotificationClient.exe    N/A      |
+-----------------------------------------------------------------------------+

The power, temperature, and fan usage are significantly lower than on Arch.

Here is a summary on Arch for a quick comparison with the above:
4k@60+2k@60+1080p@60 (same layout as above on Windows):

Mon May 23 23:28:49 2022       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 515.43.04    Driver Version: 515.43.04    CUDA Version: 11.7     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0  On |                  N/A |
| 35%   53C    P0    92W / 350W |   1208MiB / 12288MiB |     12%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A       614      G   /usr/lib/Xorg                     493MiB |
|    0   N/A  N/A       762      G   /usr/bin/kwin_x11                 168MiB |
|    0   N/A  N/A       815      G   /usr/bin/plasmashell              157MiB |
|    0   N/A  N/A      1136      G   ...akonadi_archivemail_agent        4MiB |
|    0   N/A  N/A      1151      G   .../akonadi_mailfilter_agent       56MiB |
|    0   N/A  N/A      1173      G   ...n/akonadi_sendlater_agent        4MiB |
|    0   N/A  N/A      1176      G   ...nadi_unifiedmailbox_agent        4MiB |
|    0   N/A  N/A      1646      G   /usr/lib/firefox/firefox          196MiB |
+-----------------------------------------------------------------------------+

4k@60+2k@60+1080p@50:

Wed May 25 17:26:33 2022       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 515.43.04    Driver Version: 515.43.04    CUDA Version: 11.7     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0  On |                  N/A |
| 35%   48C    P3    65W / 350W |   1106MiB / 12288MiB |      6%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A       616      G   /usr/lib/Xorg                     524MiB |
|    0   N/A  N/A       773      G   /usr/bin/kwin_x11                 182MiB |
|    0   N/A  N/A       861      G   /usr/bin/plasmashell              155MiB |
|    0   N/A  N/A      1191      G   ...akonadi_archivemail_agent        4MiB |
|    0   N/A  N/A      1199      G   .../akonadi_mailfilter_agent        4MiB |
|    0   N/A  N/A      1204      G   ...n/akonadi_sendlater_agent        4MiB |
|    0   N/A  N/A      1205      G   ...nadi_unifiedmailbox_agent        4MiB |
|    0   N/A  N/A      1812      G   /usr/lib/firefox/firefox          167MiB |
+-----------------------------------------------------------------------------+

Last edited by yuanhao (2022-05-25 16:27:45)

Offline

#14 2022-05-25 13:39:00

seth
Member
Registered: 2012-09-03
Posts: 51,056

Re: Graphic card gets hot

It could be a difference in the nvidia driver, or it could be plasma's bazillion GL contexts that keep the GPU in P5 and prevent P8 (feel free to try an openbox session).
Your main problem, however (apparently, given the data so far), is that the chip moves from P5 to P0 and stays there when you add a 3rd output.
=> Try to force the performance mode to adaptive (instead of auto)

Offline

#15 2022-05-25 16:53:39

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

seth wrote:

=> Try to force the performance mode to adaptive (instead of auto)

I changed it from auto to adaptive, but nothing changed QAQ

Just in case you need them, I am attaching ~/.nvidia-settings-rc and /etc/X11/xorg.conf.
~/.nvidia-settings-rc:

#
# /home/yuanhao/.nvidia-settings-rc
#
# Configuration file for nvidia-settings - the NVIDIA Settings utility
# Generated on Wed May 25 17:44:11 2022
#

# ConfigProperties:

RcFileLocale = C
DisplayStatusBar = Yes
SliderTextEntries = Yes
IncludeDisplayNameInConfigFile = No
UpdateRulesOnProfileNameChange = Yes
Timer = PowerMizer_Monitor_(GPU_0),Yes,1000
Timer = Thermal_Monitor_(GPU_0),Yes,1000
Timer = Memory_Used_(GPU_0),Yes,3000

# Attributes:

0/SyncToVBlank=1
0/LogAniso=0
0/FSAA=0
0/TextureClamping=1
0/FXAA=0
0/AllowFlipping=1
0/FSAAAppControlled=1
0/LogAnisoAppControlled=1
0/OpenGLImageSettings=1
0/FSAAAppEnhanced=0
0/ShowGraphicsVisualIndicator=0
[DPY:HDMI-0]/RedBrightness=0.000000
[DPY:HDMI-0]/GreenBrightness=0.000000
[DPY:HDMI-0]/BlueBrightness=0.000000
[DPY:HDMI-0]/RedContrast=0.000000
[DPY:HDMI-0]/GreenContrast=0.000000
[DPY:HDMI-0]/BlueContrast=0.000000
[DPY:HDMI-0]/RedGamma=1.000000
[DPY:HDMI-0]/GreenGamma=1.000000
[DPY:HDMI-0]/BlueGamma=1.000000
[DPY:HDMI-0]/Dithering=0
[DPY:HDMI-0]/DitheringMode=0
[DPY:HDMI-0]/DitheringDepth=0
[DPY:HDMI-0]/DigitalVibrance=0
[DPY:HDMI-0]/ColorSpace=0
[DPY:HDMI-0]/ColorRange=1
[DPY:HDMI-0]/SynchronousPaletteUpdates=0
[DPY:DP-0]/Dithering=0
[DPY:DP-0]/DitheringMode=0
[DPY:DP-0]/DitheringDepth=0
[DPY:DP-0]/ColorSpace=0
[DPY:DP-0]/ColorRange=0
[DPY:DP-0]/SynchronousPaletteUpdates=0
[DPY:DP-1]/Dithering=0
[DPY:DP-1]/DitheringMode=0
[DPY:DP-1]/DitheringDepth=0
[DPY:DP-1]/ColorSpace=0
[DPY:DP-1]/ColorRange=0
[DPY:DP-1]/SynchronousPaletteUpdates=0
[DPY:DP-2]/RedBrightness=0.000000
[DPY:DP-2]/GreenBrightness=0.000000
[DPY:DP-2]/BlueBrightness=0.000000
[DPY:DP-2]/RedContrast=0.000000
[DPY:DP-2]/GreenContrast=0.000000
[DPY:DP-2]/BlueContrast=0.000000
[DPY:DP-2]/RedGamma=1.000000
[DPY:DP-2]/GreenGamma=1.000000
[DPY:DP-2]/BlueGamma=1.000000
[DPY:DP-2]/Dithering=0
[DPY:DP-2]/DitheringMode=0
[DPY:DP-2]/DitheringDepth=0
[DPY:DP-2]/DigitalVibrance=0
[DPY:DP-2]/ColorSpace=0
[DPY:DP-2]/ColorRange=1
[DPY:DP-2]/SynchronousPaletteUpdates=0
[DPY:DP-3]/Dithering=0
[DPY:DP-3]/DitheringMode=0
[DPY:DP-3]/DitheringDepth=0
[DPY:DP-3]/ColorSpace=0
[DPY:DP-3]/ColorRange=0
[DPY:DP-3]/SynchronousPaletteUpdates=0
[DPY:DP-4]/RedBrightness=0.000000
[DPY:DP-4]/GreenBrightness=0.000000
[DPY:DP-4]/BlueBrightness=0.000000
[DPY:DP-4]/RedContrast=0.000000
[DPY:DP-4]/GreenContrast=0.000000
[DPY:DP-4]/BlueContrast=0.000000
[DPY:DP-4]/RedGamma=1.000000
[DPY:DP-4]/GreenGamma=1.000000
[DPY:DP-4]/BlueGamma=1.000000
[DPY:DP-4]/Dithering=0
[DPY:DP-4]/DitheringMode=0
[DPY:DP-4]/DitheringDepth=0
[DPY:DP-4]/DigitalVibrance=0
[DPY:DP-4]/ColorSpace=0
[DPY:DP-4]/ColorRange=1
[DPY:DP-4]/SynchronousPaletteUpdates=0
[DPY:DP-5]/Dithering=0
[DPY:DP-5]/DitheringMode=0
[DPY:DP-5]/DitheringDepth=0
[DPY:DP-5]/ColorSpace=0
[DPY:DP-5]/ColorRange=0
[DPY:DP-5]/SynchronousPaletteUpdates=0

/etc/X11/xorg.conf:

# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings:  version 515.43.04

# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 470.74

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
    Option         "Xinerama" "0"
EndSection

Section "Files"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Samsung U32J59x"
    HorizSync       30.0 - 135.0
    VertRefresh     30.0 - 75.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "NVIDIA GeForce RTX 3080 Ti"
EndSection

Section "Screen"

# Removed Option "metamodes" "DP-4: nvidia-auto-select +1440+0, DP-2: 2560x1440_144 +0+0 {rotation=left}"
# Removed Option "metamodes" "DP-4: nvidia-auto-select +0+400, DP-2: 2560x1440_120 +3840+0 {rotation=left}"
# Removed Option "metamodes" "DP-4: nvidia-auto-select +1440+0 {ForceCompositionPipeline=On}, DP-2: nvidia-auto-select +0+0 {rotation=left, ForceCompositionPipeline=On}, HDMI-0: nvidia-auto-select +5280+0 {rotation=left, ForceCompositionPipeline=On}"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Stereo" "0"
    Option         "nvidiaXineramaInfoOrder" "DFP-5"
    Option         "metamodes" "DP-4: nvidia-auto-select +1440+0, DP-2: nvidia-auto-select +0+0 {rotation=left}, HDMI-0: nvidia-auto-select +5280+0 {rotation=left}"
    Option         "SLI" "Off"
    Option         "MultiGPU" "Off"
    Option         "BaseMosaic" "off"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

Currently on Arch it is 4k@60+2k@60+1080p@50, with fan: 35%, Perf: P3, power: 65W, temp: 47°C.
(tbh I am OK with one of them running at 50Hz, but compared to before (and to Windows), where it was about 30W with the fan at 0%, it is a bit noisier and hotter)

Offline

#16 2022-05-25 19:53:23

seth
Member
Registered: 2012-09-03
Posts: 51,056

Re: Graphic card gets hot

You can try to limit the power demand and see whether that has notable performance impacts (you'll have to unlimit it when you want to play some game, and you'll likely also notice it in browsers)

nvidia-smi -pl 35

Next up:
1. do NOT rotate the outputs (any of them!) and check the impact
2. if that's still not doing anything, try to run an openbox session (or startx xterm) and see what the power demand is in that "clean" environment

Offline

#17 2022-05-26 09:44:31

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

seth wrote:

You can try to limit the power demand and see whether that has notable performance impacts (you'll have to unlimit it when you want to play some game, and you'll likely also notice it in browsers)

nvidia-smi -pl 35

Next up:
1. do NOT rotate the outputs (any of them!) and check the impact
2. if that's still not doing anything, try to run an openbox session (or startx xterm) and see what the power demand is in that "clean" environment

It outputs this:

[...~]$ nvidia-smi -pl 35
Provided power limit 37.00 W is not a valid power limit which should be between 100.00 W and 400.00 W for GPU 00000000:01:00.0
Terminating early due to previous errors.

Now I am confused about how these relate to each other (under 4k@60+2k@60+1080p@50):

1. the Perf state shown by nvidia-smi (currently P3)
2. the GPUCurrentPerfLevel reported by nvidia-settings -q (currently 4)
3. the performance levels shown in the nvidia-settings UI (it has 0, 1, 2, 3, 4; it is at 4 when I open the UI and then jumps to 2)

Last edited by yuanhao (2022-05-26 09:52:28)

Offline

#18 2022-05-26 12:54:53

seth
Member
Registered: 2012-09-03
Posts: 51,056

Re: Graphic card gets hot

GPUCurrentPerfLevel is what you see in the PowerMizer page of nvidia-settings, not the P-state. Lower values typically mean less power demand here.
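If you want to see what those numbered levels correspond to, something like this should list them (a sketch; both are standard query options):

nvidia-settings -q "[gpu:0]/GPUPerfModes" -t
nvidia-smi -q -d PERFORMANCE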

Did you try the behavior w/o output rotation?

Offline

#19 2022-05-26 13:44:19

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

seth wrote:

Did you try the behavior w/o output rotation?

Yes, without rotation the power dropped by 3W (from 65W to 62W) and the Volatile GPU-Util decreased from 6% to 1%; nothing else changed.
And nvidia-smi -pl 35 still outputs the same message.

Offline

#20 2022-05-26 13:54:21

seth
Member
Registered: 2012-09-03
Posts: 51,056

Re: Graphic card gets hot

nvidia-smi -pl 35 still outputs the same message

This will not change, as the driver doesn't allow you to limit the power demand that much.
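You can check the enforceable range and the default with a standard query (the grep is just to trim the output):

nvidia-smi -q -d POWER | grep -i 'power limit'

Setting it back to the "Default Power Limit" shown there undoes the change.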

Since the rotated output isn't relevant, did you try a "startx xterm" session?

Offline

#21 2022-05-26 14:35:06

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

seth wrote:

nvidia-smi -pl 35 still outputs the same message

Since the rotated output isn't relevant, did you try a "startx xterm" session?

I am not sure what a startx xterm session is, but I tried the KDE/Openbox session. Here is what I did:
I installed openbox with

sudo pacman -S openbox

then rebooted and chose the KDE/Openbox option in the login menu;
after logging in I ran nvidia-smi, and the output is almost the same as before:

Thu May 26 15:32:48 2022       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 515.43.04    Driver Version: 515.43.04    CUDA Version: 11.7     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0  On |                  N/A |
| 35%   48C    P3    66W / 350W |    894MiB / 12288MiB |      6%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A       632      G   /usr/lib/Xorg                     356MiB |
|    0   N/A  N/A       816      G   /usr/bin/plasmashell              238MiB |
|    0   N/A  N/A      1107      G   ...akonadi_archivemail_agent        4MiB |
|    0   N/A  N/A      1125      G   .../akonadi_mailfilter_agent        4MiB |
|    0   N/A  N/A      1133      G   ...n/akonadi_sendlater_agent        4MiB |
|    0   N/A  N/A      1134      G   ...nadi_unifiedmailbox_agent        4MiB |
|    0   N/A  N/A      1498      G   /usr/bin/krunner                   17MiB |
|    0   N/A  N/A      1892      G   /usr/lib/firefox/firefox          205MiB |
+-----------------------------------------------------------------------------+

Offline

#22 2022-05-26 14:37:26

seth
Member
Registered: 2012-09-03
Posts: 51,056

Re: Graphic card gets hot

Not "KDE/openbox" - no KDE.

https://wiki.archlinux.org/title/Xinit
http://wiki.archlinux.org/index.php/Sys … _boot_into - boot into multi-user.target, log in there and run "startx" (make sure you have xterm installed)

You want to know whether the higher power state is triggered by the X11 server itself or by something else™ (plasma).
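A minimal way to get there (assuming xorg-xinit and xterm aren't installed yet):

sudo pacman -S --needed xorg-xinit xterm
# after booting multi-user.target and logging in on the console:
startx /usr/bin/xterm

Then run nvidia-smi from that xterm and compare.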

Offline

#23 2022-05-26 14:50:14

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

seth wrote:

Not "KDE/openbox" - no KDE.

https://wiki.archlinux.org/title/Xinit
http://wiki.archlinux.org/index.php/Sys … _boot_into - boot into multi-user.target, log in there and run "startx" (make sure you have xterm installed)

You want to know whether the higher power state is triggered by the X11 server itself or by something else™ (plasma).

Just to make sure:
Should I run systemctl set-default multi-user.target, then reboot, log in, and run startx?
And if I want to revert, should I run systemctl set-default graphical.target and then reboot?

Offline

#24 2022-05-26 14:51:26

seth
Member
Registered: 2012-09-03
Posts: 51,056

Re: Graphic card gets hot

The wiki wrote:

Alternatively, append one of the following kernel parameters to your bootloader:
systemd.unit=multi-user.target

Offline

#25 2022-05-26 15:04:53

yuanhao
Member
From: Edinburgh, UK
Registered: 2021-11-12
Posts: 68
Website

Re: Graphic card gets hot

seth wrote:
The wiki wrote:

Alternatively, append one of the following kernel parameters to your bootloader:
systemd.unit=multi-user.target

I just did that, and after I logged in and ran startx, it looks like this: https://ibb.co/qBTm317
where each white block is a terminal like this: https://ibb.co/R4RQ8rh
and none of them is focused... How can I focus one of the blocks and type into it?


Edit:

The mouse cursor is extremely tiny; it took me a while to notice it...
Anyway, the nvidia-smi output looks like this:
Fan: 35%  Temp: 55°C  Perf: P0  Power: 91W / 350W  Memory-Usage: 124MiB / 12288MiB  Volatile GPU-Util: 7%

Last edited by yuanhao (2022-05-26 15:13:22)

Offline
