
#1 2014-03-30 12:27:11

Menabrea
Member
Registered: 2014-03-30
Posts: 16

[SOLVED] Bad performance with GT 630M 2GB using Bumblebee on Lenovo E530

Hi everybody,
I have a Lenovo E530 laptop with the following specs:

Model: Lenovo E530
Processor: i7-3612QM (4 cores, 8 threads)
RAM: 4 GB 1600 MHz
Storage: 240 GB mSATA SSD + 750 GB WD 5400 rpm
Video: GT 630M 2GB + Intel HD 4000

I'm running 64-bit Arch Linux with the latest kernel, using systemd and GNOME with GDM.
I'm running the latest version of the NVIDIA proprietary drivers from the repos.
The problem is that when the laptop ran Windows, games like Team Fortress 2 ran smoothly even on the integrated Intel card, while under Arch Linux performance drops drastically.
The system is fully encrypted, but the system files and game files are on an SSD and the processor supports AES-NI, so this should not be an issue; besides, on a desktop PC I have an encrypted 7200 rpm hard drive and Team Fortress 2 runs without problems on high settings.
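To rule out encryption as the bottleneck, you can quickly confirm the CPU actually exposes AES-NI and measure dm-crypt throughput (the cipher name below is the cryptsetup default for LUKS and is an assumption about this setup):

```shell
# Check that the kernel sees the AES-NI CPU flag:
grep -m1 -o '\baes\b' /proc/cpuinfo

# Benchmark the in-kernel crypto path used by dm-crypt:
cryptsetup benchmark -c aes-xts-plain64
```

On an Ivy Bridge i7 with AES-NI, the benchmark should report several hundred MiB/s or more, far above what a game needs from disk.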

I've tried benchmarking with https://aur.archlinux.org/packages/unig … setlang=it and even with everything set to minimum I can't get more than 10-11 FPS.

Thanks in advance

Last edited by Menabrea (2014-04-10 19:25:17)


#2 2014-03-30 12:56:15

Menabrea
Member
Registered: 2014-03-30
Posts: 16

Re: [SOLVED] Bad performance with GT 630M 2GB using Bumblebee on Lenovo E530

[user@box ~]$ glxspheres64 
Polygons in scene: 62464
Visual ID of window: 0x20
Context is Direct
OpenGL Renderer: Mesa DRI Intel(R) Ivybridge Mobile 
60.914111 frames/sec - 62.919403 Mpixels/sec
62.146970 frames/sec - 75.284601 Mpixels/sec
59.740275 frames/sec - 61.706925 Mpixels/sec
59.770015 frames/sec - 61.737644 Mpixels/sec
[user@box ~]$ primusrun glxspheres64 
Polygons in scene: 62464
Visual ID of window: 0x20
Context is Direct
OpenGL Renderer: GeForce GT 630M/PCIe/SSE2
61.974035 frames/sec - 64.014220 Mpixels/sec
59.716084 frames/sec - 61.681938 Mpixels/sec
59.690596 frames/sec - 61.655611 Mpixels/sec
59.745466 frames/sec - 61.712287 Mpixels/sec
[user@box ~]$ optirun glxspheres64 
Polygons in scene: 62464
Visual ID of window: 0x20
Context is Direct
OpenGL Renderer: GeForce GT 630M/PCIe/SSE2
132.862396 frames/sec - 137.236226 Mpixels/sec
136.207049 frames/sec - 140.690985 Mpixels/sec
139.315920 frames/sec - 143.902200 Mpixels/sec
138.516515 frames/sec - 143.076479 Mpixels/sec

optirun with compression gives me around 30 more FPS:

[user@box ~]$ optirun -c jpeg glxspheres64
Polygons in scene: 62464
Visual ID of window: 0x20
Context is Direct
OpenGL Renderer: GeForce GT 630M/PCIe/SSE2
152.641460 frames/sec - 158.418351 Mpixels/sec
139.982528 frames/sec - 164.335229 Mpixels/sec
164.252015 frames/sec - 169.659191 Mpixels/sec
164.068091 frames/sec - 169.469213 Mpixels/sec
167.000051 frames/sec - 172.497693 Mpixels/sec
170.145718 frames/sec - 175.746915 Mpixels/sec
169.279678 frames/sec - 174.852365 Mpixels/sec
180.511054 frames/sec - 186.453478 Mpixels/sec
189.072162 frames/sec - 195.296417 Mpixels/sec
189.520292 frames/sec - 195.759300 Mpixels/sec
188.339941 frames/sec - 194.540091 Mpixels/sec
188.729256 frames/sec - 194.942223 Mpixels/sec
188.364575 frames/sec - 194.565537 Mpixels/sec
188.023006 frames/sec - 194.212723 Mpixels/sec

Also, while running the test, dmesg shows this:

[11863.198862] bbswitch: enabling discrete graphics
[11863.822897] vgaarb: device changed decodes: PCI:0000:01:00.0,olddecodes=none,decodes=none:owns=none
[11863.823036] [drm] Initialized nvidia-drm 0.0.0 20130102 for 0000:01:00.0 on minor 1
[11863.823040] NVRM: loading NVIDIA UNIX x86_64 Kernel Module  334.21  Thu Feb 27 15:55:45 PST 2014
[11863.911300] nvidia 0000:01:00.0: irq 49 for MSI/MSI-X
[11863.916007] ACPI Warning: \_SB_.PCI0.PEG0.PEGP._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
[11863.916032] ACPI Warning: \_SB_.PCI0.PEG0.PEGP._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
[11863.916042] ACPI Warning: \_SB_.PCI0.PEG0.PEGP._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
[11863.916050] ACPI Warning: \_SB_.PCI0.PEG0.PEGP._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
[11863.916058] ACPI Warning: \_SB_.PCI0.PEG0.PEGP._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
[11863.916066] ACPI Warning: \_SB_.PCI0.PEG0.PEGP._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
[11863.916124] ACPI Warning: \_SB_.PCI0.PEG0.PEGP._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
[11863.916133] ACPI Warning: \_SB_.PCI0.PEG0.PEGP._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
[11867.153279] ACPI Warning: \_SB_.PCI0.PEG0.PEGP._DSM: Argument #4 type mismatch - Found [Buffer], ACPI requires [Package] (20131115/nsarguments-95)
[11877.765758] [drm] Module unloaded
[11877.767304] bbswitch: disabling discrete graphics

Also, reading this http://askubuntu.com/questions/220029/o … erformance it seems like the driver itself is designed to be slower than on Windows. Seriously?
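Since `optirun -c jpeg` was consistently faster here, those settings can be made the default for plain `optirun` in Bumblebee's config (a minimal excerpt; `Bridge` and `VGLTransport` are real bumblebee.conf options, but whether they help depends on the setup):

```ini
# /etc/bumblebee/bumblebee.conf (excerpt)
[optirun]
# Use the VirtualGL bridge rather than primus
Bridge=virtualgl
# Equivalent to passing `-c jpeg` on every invocation
VGLTransport=jpeg
```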

Last edited by Menabrea (2014-03-30 13:30:29)


#3 2014-04-10 19:27:12

Menabrea
Member
Registered: 2014-03-30
Posts: 16

Re: [SOLVED] Bad performance with GT 630M 2GB using Bumblebee on Lenovo E530

The solution was actually to remove the proprietary drivers and Bumblebee, remove the Xorg configuration related to the NVIDIA card, and add nouveau and i915 as modules in mkinitcpio.conf. Then run the desired application with DRI_PRIME=1, following the steps in the wiki: https://wiki.archlinux.org/index.php/PRIME
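In outline, the switch looks like this (a sketch, not exact commands for every setup; the `linux` preset name and package list are assumptions, and the module order may differ per machine):

```shell
# Remove the proprietary stack (package names as of this thread's era):
pacman -Rns nvidia bumblebee bbswitch
rm /etc/X11/xorg.conf.d/20-nvidia.conf   # or whatever NVIDIA Xorg config exists

# In /etc/mkinitcpio.conf, load both drivers early:
#   MODULES="i915 nouveau"
# then rebuild the initramfs:
mkinitcpio -p linux

# After a reboot, verify both GPUs are listed as providers:
xrandr --listproviders

# Run an application on the discrete card:
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
DRI_PRIME=1 steam
```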

Further details are available in this topic: https://bbs.archlinux.org/viewtopic.php?id=179101

