Hello.
I've got an arch64 system with an arch32 chroot in it and an ATI Radeon X1270 (RS690) on board. I've compiled both the 32-bit and 64-bit gallium r300g drivers from Mesa 7.9 with these parameters: --prefix=/usr --enable-gallium-radeon --with-dri-driverdir=/usr/lib/xorg/modules/dri --with-dri-drivers="radeon,r300" --with-state-trackers="glx,dri".
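For clarity, here is roughly how the two builds would look with those options; the details in the comments (running configure from the source tree, doing the 32-bit build inside the arch32 chroot) are assumptions, not something stated in the post:

# 64-bit build on the host (sketch; options exactly as quoted above)
./configure --prefix=/usr \
            --enable-gallium-radeon \
            --with-dri-driverdir=/usr/lib/xorg/modules/dri \
            --with-dri-drivers="radeon,r300" \
            --with-state-trackers="glx,dri"
make && make install

# 32-bit build: assumed to be the same configure line run inside the arch32
# chroot; the post does not say whether any extra 32-bit-specific flags were used.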
After that I tested them with glxgears and found that the 64-bit version gives me only 200 fps, while the 32-bit version gives 860 fps. The results in Tremulous were similar. I tried disabling the assembler optimizations in the 32-bit version (--disable-asm), but that only brought glxgears down to 800 fps.
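For reference, a quick way to double-check which driver each glxgears binary actually loads and whether vsync is capping the numbers (these are standard Mesa/GLX tools; the expected output on RS690 is an assumption on my part):

glxinfo | grep -i "opengl renderer"        # should mention Gallium on RS690 if r300g is in use
LIBGL_DEBUG=verbose glxgears 2>&1 | head   # shows which *_dri.so Mesa actually loads
vblank_mode=0 glxgears                     # rules out vsync limiting the frame rate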
So my question: is this performance "normal" for the 64-bit r300g (i.e. is it just very badly optimized compared to the 32-bit build), or is something wrong on my system?
I tried to search for information on the subject but found nothing. (Maybe I was looking in the wrong place, so sorry in advance.)
P.S. The classic 64-bit r300 driver gave me 700 fps in glxgears.
Here it's fine.. I'm also using the r300g driver. These are my configure options:
./autogen.sh --prefix=/usr \
--with-dri-driverdir=/usr/lib/xorg/modules/dri \
--with-dri-drivers=no \
--disable-egl \
--enable-gallium --enable-gallium-radeon --enable-gallium-r600 \
--enable-glx-tls \
--with-driver=dri \
--enable-xcb \
--with-state-trackers=dri,glx \
--disable-glut
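After configure, the build and install steps are the usual autotools ones (a sketch; nothing beyond the standard Mesa 7.9 procedure is assumed here):

make
sudo make install   # installs the gallium r300 driver into the --with-dri-driverdir path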
These configuration options didn't help me, unfortunately.