I have a system with a Core 2 Duo CPU and an nVidia GeForce 7300 graphics card. I am trying to run a 3D robotics simulation program (Webots 7.0.3) and its performance is abysmal. I have never had graphics performance issues before---but then again, I have never tried to run such a particularly intensive piece of simulation software. Other simulation software compiled from source (e.g. Player/Stage) runs smoothly.
Where can I start to look to find out where the problem is?
Here is what I've been able to find out so far:
1. Hardware acceleration seems to be enabled:
[stefano@polus]$ glxinfo | grep -n5 "direct rendering"
1-name of display: :0
2-display: :0 screen: 0
3:direct rendering: Yes
4-server glx vendor string: NVIDIA Corporation
5-server glx version string: 1.4
6-server glx extensions:
7- GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig,
8- GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control,
2. glxgears reports very poor performance: about 60 FPS. The software manufacturer says their product needs around 2000 FPS to run smoothly:
[stefano@polus]$ glxgears
Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.
301 frames in 5.0 seconds = 60.157 FPS
299 frames in 5.0 seconds = 59.781 FPS
3. The software (i.e. Webots) is proprietary, and their Linux version is built against libjpeg version 6.2. Arch Linux currently ships libjpeg 8, so I had to manually install the older version.
4. The software runs satisfactorily on my Kubuntu laptop, an old Lenovo T61 with similar (or similarly aged) hardware. In other words, the CPU seems to be adequate, and I believe the problem lies with the graphics setup on my Arch Linux system.
5. I read the nvidia pages on the Arch website, the nvidia-linux pages and the Xorg pages on how to set up xorg.conf. It did not help.
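Regarding point 1 above: "direct rendering: Yes" alone does not prove the NVIDIA driver is the one actually drawing (it can also be true with Mesa software rendering). The renderer string is another useful data point. A sketch, with a captured sample standing in for live output — on the real system, pipe `glxinfo` directly:

```shell
# Extract the OpenGL renderer string from glxinfo-style output.
# If it says "llvmpipe" or "Software Rasterizer", hardware acceleration
# is NOT actually in use despite "direct rendering: Yes".
sample='direct rendering: Yes
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce 7300 GT/PCIe/SSE2'

printf '%s\n' "$sample" | awk -F': ' '/^OpenGL renderer string:/ {print $2}'
```

On the live system this is just `glxinfo | grep "OpenGL renderer"`.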
Some info on my system is appended below.
Any hint greatly appreciated.
Stefano
Hardware
CPU Version: Intel(R) Core(TM)2 Duo CPU E6550 @ 2.33GHz
Graphics card: nVidia GeForce 7300 GT/PCIe/SSE2
RAM: 4 GB
Double monitor setup with Xinerama enabled
SW:
Arch Linux, 64-bit
Kernel: Linux 3.6.11-1-ARCH
Graphics driver: nvidia-304xx
2. glxgears reports very poor performance: about 60 FPS. The software manufacturer says their product needs around 2000 FPS to run smoothly
*facepalm*
Double monitor setup with Xinerama enabled
Xinerama has a serious impact on performance. Use TwinView instead.
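For reference, TwinView is configured in the Device section of xorg.conf rather than via a Xinerama ServerLayout. A minimal sketch for the legacy nvidia driver — the resolutions and orientation here are placeholders, not taken from the original poster's setup:

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    Option     "TwinView" "1"
    Option     "MetaModes" "1280x1024, 1280x1024"
    Option     "TwinViewOrientation" "RightOf"
EndSection
```

`nvidia-settings` can also generate this for you (X Server Display Configuration -> Save to X Configuration File).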
2. glxgears reports very poor performance: about 60 FPS. The software manufacturer says their product needs around 2000 FPS to run smoothly
Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.
301 frames in 5.0 seconds = 60.157 FPS
Why do you get 60 FPS? You have your answer right there.
"open source is about choice"
No.
Open source is about opening the source code in compliance with these conditions, period. The ability to choose among several packages is just a nice side effect.
chris_l is right. Turn off Sync to VBlank in nvidia-settings -> OpenGL Settings to "fix" it.
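In case anyone wants to script this: the proprietary NVIDIA driver also honours an environment variable, so you can disable sync-to-vblank for a single process without changing the global setting in nvidia-settings (a sketch; `glxgears` itself of course needs a running X session):

```shell
# __GL_SYNC_TO_VBLANK is read by the proprietary NVIDIA driver at process
# startup; 0 disables sync-to-vblank for that process only.
export __GL_SYNC_TO_VBLANK=0
echo "__GL_SYNC_TO_VBLANK=$__GL_SYNC_TO_VBLANK"
# then: glxgears
```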
stefano wrote:
2. glxgears reports very poor performance: about 60 FPS. The software manufacturer says their product needs around 2000 FPS to run smoothly
Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.
301 frames in 5.0 seconds = 60.157 FPS

Why do you get 60 FPS? You have your answer right there.
I noticed that and was wondering about it. I am not familiar with glxgears at all, and I am not sure about the exact meaning of the statement about the framerate. But one of the developers of the software I am trying to run sent me this glxgears output, obtained on a recent Ubuntu system running under VMware (on a Windows host, I presume):
$ glxgears
11030 frames in 5.0 seconds = 2205.791 FPS
11492 frames in 5.0 seconds = 2298.289 FPS
What gives? Are they running a different version of glxgears, or is there some hidden switch in glxgears that unties it from the monitor refresh rate and makes it work as a benchmark tool? (man glxgears did not mention any.)
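For what it's worth, the number glxgears prints is nothing more than frames divided by elapsed wall-clock time, so with vsync on a 60 Hz monitor it can never exceed roughly 60:

```shell
# glxgears' FPS figure is frames / elapsed seconds; 301 frames over ~5 s
# on a 60 Hz monitor gives the vsync-capped figure seen above.
awk 'BEGIN { printf "%.1f\n", 301 / 5.0 }'
```

(glxgears divides by the *actual* elapsed time, which is slightly over 5 s, hence the reported 60.157 rather than exactly 60.2.)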
chris_l is right. Turn off Sync to VBlank in nvidia-settings -> OpenGL Settings to "fix" it.
AH! I didn't know that. Now I have this:
[stefano@polus]$ glxgears
25700 frames in 5.0 seconds = 5139.955 FPS
26559 frames in 5.0 seconds = 5311.791 FPS
Way better. Unfortunately, the problem still exists. I am going to try turning off Xinerama next.
S.
Well, switching to TwinView did not improve things. Performance is still abysmal.
I am starting to think it may not be my problem but theirs. Can anyone suggest a 3D OpenGL-intensive package to try out, so I can rule out that suspicion?
Stefano
BTW, when you turn on "Sync to VBlank", your framerate is "capped" to the vertical refresh. But you probably won't notice a difference from disabling it (well, maybe now you have tearing ^_^ )
Googling your laptop model, it seems to include either an nvidia Quadro or an Intel card. Those seem to be very different from your PC's GeForce 7300. Quadro cards tend to have better performance for this kind of job.
Have you tried using nouveau? Normally the proprietary nvidia drivers work better, but still, if all else fails, give it a try.
BTW, when you turn on "Sync to VBlank", your framerate is "capped" to the vertical refresh. But you probably won't notice a difference from disabling it (well, maybe now you have tearing ^_^ )
Indeed, there is no difference :-(
Googling your laptop model, it seems to include either an nvidia Quadro or an Intel card. Those seem to be very different from your PC's GeForce 7300. Quadro cards tend to have better performance for this kind of job.
I have the model with the integrated Intel video controller, which should be worse (according to the software manufacturer, that is).
Have you tried using nouveau? Normally the proprietary nvidia drivers work better, but still, if all else fails, give it a try.
Will try next.
Update:
nouveau is not satisfactory. I have conclusively determined, however, that this is Arch's problem, not the hardware's.
I tried:
1. Booting my machine with a Kubuntu 12.10 64-bit live CD: the software runs fine.
2. Running the software in a VirtualBox VM with Kubuntu 12.04 32-bit (which I happened to have available): it runs fine there too.
So it's something weird that Arch does. Or, more likely, something weird happens because---at the low graphics level---the software expects Arch to do things the Ubuntu way, and Arch's way is not Ubuntu's...
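One concrete thing worth comparing between the Arch install and the working Ubuntu systems is which shared libraries the binary actually resolves at load time (e.g. Mesa's libGL vs NVIDIA's, and which libjpeg). A sketch — the captured sample below stands in for live `ldd` output, and the binary name is whatever your Webots install uses:

```shell
# Which libGL/libjpeg does a binary resolve? On the real system:
#   ldd "$(command -v webots)" | grep -E 'libGL|libjpeg'
# A captured sample stands in for live ldd output here:
sample='	libGL.so.1 => /usr/lib/libGL.so.1 (0x00007f2a00000000)
	libjpeg.so.62 => /usr/lib/libjpeg.so.62 (0x00007f2a00200000)'

printf '%s\n' "$sample" | awk '/libGL|libjpeg/ {print $1, "->", $3}'
```

If the two systems resolve different libGL providers, that would point straight at the driver/GL stack rather than at Webots itself.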
Nonetheless, I am out of options.
If anyone has suggestions, please go ahead.
Cheers,
Stefano
Which nvidia driver version are you using? Maybe try an older one like nvidia-304xx from the repos or one from the AUR.
Which nvidia driver version are you using? Maybe try an older one like nvidia-304xx from the repos or one from the AUR.
He is already using that one. But maybe an even older one might help.