
#1 2017-02-01 11:15:20

bhatzlehah
Member
Registered: 2017-02-01
Posts: 1

Laptop running Bumblebee overheats when Discrete GPU is active

I have an Optimus laptop (integrated Intel chip with a discrete GTX 965M).

My main use of this laptop is developing CUDA programs, so I need to run the nvidia driver. The problem is that whenever a CUDA or graphics-intensive program starts, the GPU quickly climbs to critical temperatures and Arch shuts down to save itself.

Before I go any further I'll point out that I'm aware it could be a hardware issue. It's a fairly new laptop (bought around 6 months ago) and I won't rule out a manufacturing fault. I just want to rule out a software problem before having to pack the laptop up and send it back. Moving on...

X is running using the integrated chip, meaning the discrete card is off when Arch boots.
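
As a sanity check that the card really is off, Bumblebee's bbswitch module (assuming that's what you use for power switching) exposes the power state in /proc. A quick check from a shell:

```shell
# Check whether the discrete card is currently powered.
# Assumption: Bumblebee uses the bbswitch module, which exposes the state here.
if [ -r /proc/acpi/bbswitch ]; then
    cat /proc/acpi/bbswitch    # prints e.g. "0000:01:00.0 OFF"
else
    echo "bbswitch not loaded"
fi
```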

I've followed the instructions at (Bumblebee - ArchWiki) to get Bumblebee working, and I can confirm it's running as expected.
Bumblebee is configured to load the 'nvidia' driver, and xorg.conf.nvidia looks like this:

Section "ServerLayout"
    Identifier  "Layout0"
    Option      "AutoAddDevices" "false"
    Option      "AutoAddGPU" "false"
EndSection

Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    VendorName  "NVIDIA Corporation"
    BusID "PCI:01:00:0"
    Option "ProbeAllGpus" "false"
    Option "NoLogo" "true"
    Option "UseEDID" "false"
    Option "UseDisplayDevice" "none"
    Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x3333; PowerMizerDefault=0x2; PowerMizerDefaultAC=0x2"
EndSection

If I start up nvidia-settings while the rest of the system is idle with

optirun -b none nvidia-settings -c :8

and watch the PowerMizer tab, the performance level drops to level 0 after about a minute. On the thermal settings tab, the temperature steadily climbs to about 70C and hovers around there. The current nvidia driver is version 375.26.
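
For what it's worth, the temperature can also be polled from a terminal instead of the nvidia-settings GUI. A sketch, assuming nvidia-smi (which ships with the proprietary driver) is on the PATH:

```shell
# Query the discrete GPU's temperature through optirun so the tool can
# see the card. Guarded so the snippet degrades cleanly if either tool
# isn't available on this machine.
if command -v optirun >/dev/null 2>&1 && command -v nvidia-smi >/dev/null 2>&1; then
    optirun nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader
else
    echo "optirun/nvidia-smi not available"
fi
```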

I have some CUDA programs I've written, which I invoke with

optirun --no-xorg ./executable

These are pretty compute-intensive programs; it takes less than a minute before I can hear the GPU fan screaming. If I let the process run too long, the GPU eventually reaches 100C and Arch shuts down. Killing the process and quickly running something else with optirun (like "optirun glxinfo") switches off the discrete card and prevents the overheat.
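
As a stopgap while I debug this, I've been thinking about wrapping the run in a temperature watchdog that kills the process before the thermal shutdown kicks in. A minimal sketch, assuming nvidia-smi run through optirun reports the discrete card's temperature (the `watchdog_run` name and the poll interval are mine, adjust to taste):

```shell
# Defaults; TEMP_CMD must print the GPU temperature in C as a bare integer.
TEMP_CMD="${TEMP_CMD:-optirun nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader}"
LIMIT="${LIMIT:-90}"   # kill the workload well before the ~100C shutdown point

# Run a workload and kill it if the GPU gets too hot,
# e.g.: watchdog_run optirun --no-xorg ./executable
watchdog_run() {
    "$@" &
    pid=$!
    while kill -0 "$pid" 2>/dev/null; do
        temp=$($TEMP_CMD)
        if [ "${temp:-0}" -ge "$LIMIT" ]; then
            echo "GPU at ${temp}C, stopping $pid"
            kill "$pid"
            break
        fi
        sleep 5    # poll interval; overheating here takes under a minute
    done
    wait "$pid" 2>/dev/null
    return 0
}
```

This doesn't fix the underlying problem, but it at least avoids the hard power-offs while testing.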

I've tried testing this out on a couple of games just to get some more data. If I launch the game "Civilization V" with

primusrun %executable%

it runs just fine. I've been able to play this game for hours without any hint of overheating. However, I also have "Cities: Skylines", which I invoke the exact same way. That game barely makes it past the main menu before the GPU has had it. I'm guessing Cities: Skylines is a lot more demanding on the GPU than Civ V.

Any advice on where else to look would be fantastic. I'm not really sure what else to try; I would like to exhaust all opportunities before having to send this laptop away.


#2 2017-11-12 09:31:56

telgar
Member
Registered: 2017-09-15
Posts: 4

Re: Laptop running Bumblebee overheats when Discrete GPU is active

I have a newer laptop with a GTX 1050, and I'm seeing issues similar to what you describe. Did you ever figure anything out? For me, both the GPU and CPU temperatures skyrocket, particularly the CPU. In certain games, my whole laptop force-shuts down, probably due to thermal protection. I am using Bumblebee as well, with the proprietary nvidia drivers.
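
In case it helps with gathering data on the CPU side: its temperatures can be logged with lm_sensors (this assumes the `sensors` tool is installed and `sensors-detect` has been run on your machine):

```shell
# Print all detected temperature sensors; guarded so the snippet
# degrades cleanly if lm_sensors isn't installed.
if command -v sensors >/dev/null 2>&1; then
    sensors
else
    echo "lm_sensors not installed"
fi
```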

