
#1 2008-11-01 02:36:33

gen2
Member
Registered: 2006-09-07
Posts: 32

xinerama + nvidia + user login = bad performance

I recently installed two more monitors in my system to make 4 screens. 2x 19" Samsung 930BF, 1x 22" BenQ E2200HD, 1x 17" BenQ FP737s.

I have setup xorg with xinerama with the following settings:

http://htmlup.pawerty.com/index.php?pag … JPQ8J.conf

Hence the physical layout is:

19" + 22" + 19" + 17".

Using two Nvidia 7800GTX graphics cards.
System specs are:
-Dual core dual Opteron system
-6GB ram
-Arch64
-All packages up to date.
-Using Gnome+Openbox
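Since the linked xorg.conf is no longer reachable, here is a rough sketch of what a four-screen Xinerama layout for this setup might look like. All identifiers are invented, and in a real config each Screen would reference a Device section tied to one output of one of the two cards (via BusID and the "Screen" option):

```
# Hypothetical sketch only -- identifiers and ordering are examples
Section "ServerLayout"
    Identifier "Multihead"
    Screen 0 "Screen0" 0 0
    Screen 1 "Screen1" RightOf "Screen0"
    Screen 2 "Screen2" RightOf "Screen1"
    Screen 3 "Screen3" RightOf "Screen2"
    Option   "Xinerama" "on"
EndSection
```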

The problem I'm having is overall bad performance. The Xorg process takes a huge amount of CPU to do anything. Even just using Firefox or Evolution is slow; you can actually see it struggling to render the screen contents. In top, the Xorg process hits 100% CPU just scrolling down an HTML page.

I have looked into whether hardware acceleration was disabled, but glxinfo says it's enabled:

# glxinfo
name of display: :0.0
display: :0  screen: 0
direct rendering: Yes

Also from Xorg.0.log

(II) Loading extension NV-GLX
(II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
(II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
(==) NVIDIA(0): Backing store disabled
(==) NVIDIA(0): Silken mouse enabled

The other issue I seem to have is that when mplayer, VLC, or pretty much any video program starts up, the mouse stops working. I suspect it's due to the program initializing some sort of direct rendering or OpenGL extension that causes the mouse to stop being responsive. I must stress that everything else works fine; just the mouse stops.

It also seems that actually playing the movies doesn't drive Xorg's CPU usage up.


I think I should also note that when running mplayer or VLC as root, the mouse does not have any problems, which seems odd, as it suggests running as root allows access to some API that no longer stalls the mouse.

I hope this is enough information for someone to give me either a solution or an inkling as to where my problem lies.

Thx


#2 2008-11-01 05:13:34

gen2
Member
Registered: 2006-09-07
Posts: 32

Re: xinerama + nvidia + user login = bad performance

I seem to have solved the mplayer/VLC problem. No idea what the cause was; it just started working after a reboot.

In any case, Xinerama is still killing the GUI performance. Firefox, Evolution, pretty much everything is laggy.


#3 2008-12-05 05:38:01

dmz
Member
From: Sweden
Registered: 2008-08-27
Posts: 881
Website

Re: xinerama + nvidia + user login = bad performance

My setup is 2 x 22" TFTs, I'm running awesome, and when I turn on xcompmgr I get pretty much the same bad performance you're talking about...

Edit: Sorry, didn't notice the thread was that old.

Last edited by dmz (2008-12-05 05:38:25)


#4 2008-12-21 11:32:53

Super Jamie
Member
From: Brisbane, AU
Registered: 2008-12-15
Posts: 79
Website

Re: xinerama + nvidia + user login = bad performance

Xinerama does all the "joining" of screens in software, so performance will be pretty bad, and the xorg team isn't focusing on improving it at the moment.

If you think it's slow now, try it with xorg from about 12 months ago; it's actually miles better than it used to be!

Back up your xorg.conf and use the nvidia-settings tool to make a desktop with Xinerama off and TwinView on. HOWEVER, I'm pretty sure you HAVE to run Xinerama when spanning screens over multiple nVidia cards (this was the case with my 1x dual-head 5200 and 1x single-head 5200), and when doing that, the "maximize" command will take one "xorg screen" (two monitors) as one "maximize area".
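For reference, nvidia-settings typically writes something along these lines into the Device/Screen config when TwinView is enabled. The BusID and MetaModes values here are placeholders for illustration, not taken from the poster's setup:

```
# Example only -- BusID and resolutions are invented placeholders
Section "Device"
    Identifier "Card0"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
    Option     "TwinView" "on"
    Option     "MetaModes" "1280x1024,1280x1024"
EndSection
```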

This happens because the nVidia driver doesn't report Xinerama screen areas properly, and last time I tried, the TwinViewXineramaInfoOverride option in the driver didn't work properly. I bitched to nVidia about this on the nvnews forums, and they blew me off claiming it's not their fault.
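The override option mentioned above goes in the Screen section of xorg.conf and lists one WxH+X+Y geometry per logical screen. The geometries below are examples for a pair of 1280x1024 panels side by side, not the poster's actual layout:

```
# Example geometries only
Option "TwinViewXineramaInfoOverride" "1280x1024+0+0, 1280x1024+1280+0"
```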

Some dudes on a RHEL forum have written and maintained a "Fake Xinerama" patch to libXinerama, where you specify your screen areas in a config file, and xorg parses that as it loads, so maximize works properly. Google around for it. However, an update to libXinerama about a year ago broke this for me, so I gave up using three screens altogether.
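From what I remember of that patch (details may vary between versions, so treat this as an assumption), the patched libXinerama reads a per-user file, often ~/.fakexinerama, whose first line is the number of screens followed by one "x y width height" line per screen:

```
# Assumed ~/.fakexinerama format: screen count, then x y width height per screen
2
0 0 1280 1024
1280 0 1280 1024
```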

The moral of this story is: closed drivers suck.

And also, unless you're a Windows gamer spending $500 every 6 months on hardware, nVidia doesn't give a shit about you. My next card will be an ATI. </rant>

Last edited by Super Jamie (2008-12-21 11:34:26)

