#1 2015-01-19 16:17:12

artem.jackson
Member
Registered: 2014-09-06
Posts: 22

[SOLVED] Questionable performance of Bumblebee

I used the well-known Unigine Heaven benchmark.
Hardware:

$ lscpu | grep name
Model name:            Intel(R) Core(TM) i7-3517U CPU @ 1.90GHz
$ lspci | grep -E "VGA|3D"
00:02.0 VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)
01:00.0 3D controller: NVIDIA Corporation GF117M [GeForce 610M/710M/820M / GT 620M/625M/630M/720M] (rev ff)
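
(Note: the "(rev ff)" on the NVIDIA line most likely just means the card was powered down by bbswitch at the moment lspci ran, which is normal between optirun sessions. Assuming bbswitch is in use, its current state can be checked with:

$ cat /proc/acpi/bbswitch
0000:01:00.0 OFF

)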

Tests:
1.

$ unigine-heaven
Renderer: INTEL Unknown 256MB
OpenGL vendor:   Intel Open Source Technology Center
OpenGL renderer: Mesa DRI Intel(R) Ivybridge Mobile
OpenGL version:  3.2 (Core Profile) Mesa 10.4.2
OpenGL flags:    Core Profile

min FPS: 7.8
max FPS: 21.7

2.

$ optirun unigine-heaven
Renderer: NVIDIA NV70 (Kepler) 256MB
OpenGL vendor:   NVIDIA Corporation
OpenGL renderer: GeForce GT 620M/PCIe/SSE2
OpenGL version:  3.2.0 NVIDIA 346.35
OpenGL flags:    Core Profile

min FPS: 8.5
max FPS: 9.2

3.

$ primusrun unigine-heaven
Renderer: NVIDIA NV70 (Kepler) 256MB
OpenGL vendor:   NVIDIA Corporation
OpenGL renderer: GeForce GT 620M/PCIe/SSE2
OpenGL version:  3.2.0 NVIDIA 346.35
OpenGL flags:    Core Profile

min FPS: 9.9
max FPS: 10.9

So, as you can see, I got higher FPS without using Bumblebee at all.
The main question is: why is that?
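
(One detail I notice: under both optirun and primusrun the min and max FPS are very close together, as if something is capping the frame rate. If primus behaves as its README describes, vsync can be disabled for benchmarking runs like this:

$ vblank_mode=0 primusrun unigine-heaven

)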

Last edited by artem.jackson (2015-01-20 01:28:59)


#2 2015-01-19 22:21:31

amonakov
Member
Registered: 2012-09-30
Posts: 32

Re: [SOLVED] Questionable performance of Bumblebee

As I recall, the Heaven benchmark enables tessellation by default, so unless you've taken measures to disable it, your GeForce card is loaded more heavily than your Intel GPU (whose Linux driver does not support tessellation); and tessellation is far from "free" in this benchmark. If you pay attention to the actual picture rendered by the benchmark, you should see the difference.

On top of that, your GeForce is at the low end of the spectrum; in terms of raw peak performance (memory bandwidth and GFLOPS) it is comparable to your Intel HD 4000, so any difference in performance comes down to architectural efficiency and driver quality (where I'd expect Nvidia's advantage to be "a lot", given how many more years of experience it has tuning drivers compared to Intel).
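
If you want to verify the tessellation point: tessellation is core functionality in OpenGL 4.0 and is advertised through the GL_ARB_tessellation_shader extension, so (assuming glxinfo from mesa-utils is installed) comparing the two drivers should show the difference:

$ glxinfo | grep -o GL_ARB_tessellation_shader
$ optirun glxinfo | grep -o GL_ARB_tessellation_shader

The first command should print nothing on Mesa 10.4 with the Ivy Bridge driver, while the second should print the extension name for the NVIDIA driver.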


#3 2015-01-19 22:29:21

artem.jackson
Member
Registered: 2014-09-06
Posts: 22

Re: [SOLVED] Questionable performance of Bumblebee

Thanks for the informative answer!
But tessellation was disabled in these tests.

Last edited by artem.jackson (2015-01-19 23:00:31)


#4 2015-01-20 01:28:35

artem.jackson
Member
Registered: 2014-09-06
Posts: 22

Re: [SOLVED] Questionable performance of Bumblebee

If anybody is interested in this, you can follow this topic:
https://github.com/Bumblebee-Project/Bu … issues/626
Marking this as solved.

Last edited by artem.jackson (2015-01-20 01:29:18)

