I have a problem detecting my GPU temperature. I have a Lenovo ThinkPad T470p, which has an NVIDIA® GeForce® 940MX 2GB GDDR5.
Running sensors gives this output:
iwlwifi-virtual-0
Adapter: Virtual device
temp1: +38.0°C

thinkpad-isa-0000
Adapter: ISA adapter
fan1: 2290 RPM

pch_skylake-virtual-0
Adapter: Virtual device
temp1: +39.5°C

nouveau-pci-0200
Adapter: PCI adapter
GPU core: +0.60 V (min = +0.60 V, max = +1.20 V)

coretemp-isa-0000
Adapter: ISA adapter
Package id 0: +40.0°C (high = +100.0°C, crit = +100.0°C)
Core 0: +43.0°C (high = +100.0°C, crit = +100.0°C)
Core 1: +53.0°C (high = +100.0°C, crit = +100.0°C)
Core 2: +41.0°C (high = +100.0°C, crit = +100.0°C)
Core 3: +41.0°C (high = +100.0°C, crit = +100.0°C)

acpitz-virtual-0
Adapter: Virtual device
temp1: +40.0°C (crit = +128.0°C)
The fields for my GPU are the ones under "nouveau-pci-0200", but as you can see there is no temperature there, only the voltage.
Can someone tell me why this happens, and whether there is a way to obtain the temperature of my GPU? I also tried nvidia-settings and nvidia-smi, but they don't run since I'm not using the NVIDIA drivers.
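In case it helps with debugging, this is roughly how I checked the hwmon sysfs nodes directly (a quick sketch; lm-sensors reads from these same files, and whether nouveau registers a temp*_input node at all depends on the driver and the card's firmware):

```shell
#!/bin/sh
# List every hwmon device and any temperature inputs it exposes.
# If the nouveau entry shows no temp*_input file, that matches the
# missing temperature in the sensors output.
echo "hwmon temperature inputs:"
for hw in /sys/class/hwmon/hwmon*; do
    [ -d "$hw" ] || continue
    name=$(cat "$hw/name" 2>/dev/null)
    for t in "$hw"/temp*_input; do
        [ -f "$t" ] || continue
        # hwmon temperature values are in millidegrees Celsius
        printf '%s %s: %s mC\n' "$name" "$(basename "$t")" "$(cat "$t")"
    done
done
```

On my machine the coretemp and acpitz devices show temp1_input files, but the nouveau one only has the voltage node.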
Thanks in advance.