My laptop: Xiaomi Mi Notebook Air 13.3" 2018 (Intel Core i5 8250U/8GB/256GB SSD/NVIDIA GeForce MX150).
Here are my results for glxspheres64:
vblank_mode=0 glxspheres64
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
ATTENTION: default value of option vblank_mode overridden by environment.
Visual ID of window: 0x14e
Context is Direct
OpenGL Renderer: Mesa DRI Intel(R) UHD Graphics 620 (Kabylake GT2)
152.973603 frames/sec - 170.718541 Mpixels/sec
124.767454 frames/sec - 139.240478 Mpixels/sec
119.714664 frames/sec - 133.601565 Mpixels/sec
128.822398 frames/sec - 143.765797 Mpixels/sec
vblank_mode=0 optirun glxspheres64
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
ATTENTION: default value of option vblank_mode overridden by environment.
Visual ID of window: 0x21
Context is Direct
OpenGL Renderer: GeForce MX150/PCIe/SSE2
110.747970 frames/sec - 123.594735 Mpixels/sec
118.968643 frames/sec - 132.769006 Mpixels/sec
115.905409 frames/sec - 129.350436 Mpixels/sec
115.568935 frames/sec - 128.974931 Mpixels/sec
vblank_mode=0 primusrun glxspheres64
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
ATTENTION: default value of option vblank_mode overridden by environment.
Visual ID of window: 0x14e
Context is Direct
OpenGL Renderer: GeForce MX150/PCIe/SSE2
ATTENTION: default value of option vblank_mode overridden by environment.
106.197117 frames/sec - 118.515983 Mpixels/sec
106.801868 frames/sec - 119.190885 Mpixels/sec
108.253363 frames/sec - 120.810753 Mpixels/sec
107.919504 frames/sec - 120.438166 Mpixels/sec
In game (Dota 2) the FPS is the same (Intel is faster than Nvidia).
After removing Bumblebee and switching to NVIDIA PRIME:
vblank_mode=0 glxspheres64
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
Visual ID of window: 0xce
Context is Direct
OpenGL Renderer: GeForce MX150/PCIe/SSE2
701.405098 frames/sec - 782.768089 Mpixels/sec
541.940075 frames/sec - 604.805123 Mpixels/sec
537.164791 frames/sec - 599.475907 Mpixels/sec
535.097494 frames/sec - 597.168803 Mpixels/sec
538.172488 frames/sec - 600.600497 Mpixels/sec
In game the results are several times better than with optirun/primusrun.
How can I fix this and get better performance with Bumblebee?
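One thing worth trying (a suggestion based on the primus README, not something verified on this machine): primus reads PRIMUS_* environment variables that control how frames are copied and synchronized, and forcing unsynchronized operation alongside vblank_mode=0 rules out frame pacing as the bottleneck.

```shell
# PRIMUS_SYNC=0 is the unsynchronized default per the primus README;
# setting it explicitly, together with vblank_mode=0, makes sure no
# sync path is limiting the frame rate.
vblank_mode=0 PRIMUS_SYNC=0 primusrun glxspheres64
```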
Last edited by stoope (2018-11-14 17:22:34)
Please use code tags when pasting to the boards: https://wiki.archlinux.org/index.php/Co … s_and_code
I have a different setup, use different benchmarking, but the results seem to be the same...
My hardware: Asus PU551LD (Intel Core i5 4210U, NVidia GeForce 820M)
My software:
$ pacman -Q linux bumblebee nvidia-390xx nvidia-390xx-utils nvidia-settings nexuiz
linux 4.19.2.arch1-1
bumblebee 3.2.1-20
nvidia-390xx 390.87-12
nvidia-390xx-utils 390.87-1
nvidia-settings 415.18-1
nexuiz 2.5.2-8
This is a fairly fresh system, installed a month ago. For most of the stuff I use default settings, unless otherwise noted. Here are the results:
# Intel
$ nexuiz-glx -benchmark demos/demo3 -nosound | grep frames
525 frames 8.8262651 seconds 59.4815581 fps, one-second fps min/avg/max: 53 60 60 (15 seconds)
# NVidia, pure optirun
$ optirun sh -c "nvidia-settings -c :8 && nexuiz-glx -benchmark demos/demo3 -nosound" | grep frames
525 frames 15.8850060 seconds 33.0500348 fps, one-second fps min/avg/max: 31 33 35 (15 seconds)
# NVidia, optirun with primus
$ optirun -b primus sh -c "nvidia-settings -c :8 && nexuiz-glx -benchmark demos/demo3 -nosound" | grep frames
525 frames 10.8009880 seconds 48.6066647 fps, one-second fps min/avg/max: 48 49 49 (15 seconds)
Notes:
- nvidia-settings is run before nexuiz to change the PowerMizer settings; in my case it has no impact on performance.
- vblank_mode=0 does not impact performance.
- Setting the CPU governor to performance does not impact performance.
- Nexuiz graphics settings: normal, 1920x1080.
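For anyone reproducing the PowerMizer note above, the change can be verified by querying the attribute back on the Bumblebee display (:8, as used in the commands above). GPUPowerMizerMode is the attribute name exposed by nvidia-settings' query interface; check your driver's attribute list with `-q all` if it differs.

```shell
# Query the PowerMizer mode on the secondary X display that Bumblebee
# starts; a value of 1 corresponds to "Prefer Maximum Performance".
optirun sh -c "nvidia-settings -c :8 -q GPUPowerMizerMode"
```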
I will also test nvidia-xrun and Optimus and post the results here.
I have tested nexuiz with nvidia-xrun and now I get 70 FPS.
One thing is interesting though: when benchmarking with Half-Life 2: Lost Coast (available on Steam for free), the results were identical (44 FPS in my case) whether I launched via plain Steam, without explicit optirun, or via nvidia-xrun. This is very confusing, because it's hard to tell which card was actually used...
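A quick way to resolve that ambiguity is to grep the OpenGL renderer string for each launch method, the same field the glxspheres output above reports (glxinfo ships in the mesa-demos/mesa-utils package):

```shell
# Bare launch: should report the Intel iGPU renderer.
glxinfo | grep "OpenGL renderer"

# Through Bumblebee: should report the NVIDIA card (e.g. GeForce 820M).
optirun glxinfo | grep "OpenGL renderer"
```

Running the same check inside an nvidia-xrun session (or prefixing the game's Steam launch options) shows which GPU that environment actually hands to applications.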
Last edited by madman_xxx (2018-12-19 20:45:36)