Hi everybody,
I have a question; not sure this is the right place to ask, sorry if I'm mistaken...
Currently, as far as I understood, the R300 driver in extra is the gallium version (R300g), while the R600 driver is still the classic one (R600c).
Does somebody know when mesa in extra will be compiled with R600g instead of R600c by default?
Of course I am not asking for a precise date, just something like "not for a long time", "expected in a few months" or "soon". :-)
Thank you.
EDIT: oops, glxinfo | grep -i opengl gives me this on my R700 card:
OpenGL vendor string: X.Org
OpenGL renderer string: Gallium 0.4 on AMD RV730
OpenGL version string: 2.1 Mesa 7.10.2
OpenGL shading language version string: 1.20
So we are actually already on Gallium for R600+...
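For anyone else who wants to check which stack they are running, the renderer string is the tell. A minimal sketch of the check, with a sample string hardcoded in place of live glxinfo output (on a real system, capture it with glxinfo | grep "renderer string"):

```shell
#!/bin/sh
# Classic Mesa drivers report e.g. "Mesa DRI R600 ...", while Gallium ones
# report "Gallium 0.4 on ...". The sample below stands in for real output.
renderer="OpenGL renderer string: Gallium 0.4 on AMD RV730"

case "$renderer" in
  *Gallium*) echo "Gallium driver in use" ;;
  *)         echo "classic Mesa driver in use" ;;
esac
```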
Last edited by Laserpithium (2011-04-26 18:38:17)
Offline
I've been running Arch since January this year, and I'm still a Linux n00b. I've had 4 kernel panics in the last week (I never had one before this), and I believe they've all been pretty much the same: after leaving my laptop running for a while (so Xscreensaver was on), I come back to the panic screen. The panic screen mentions the ati driver, so I thought I'd start here. I'm running the latest kernel, 2.6.38, on an HP 8740w with an i7-840QM, an ATI Mobility Radeon HD 5800 Series, 10 GB RAM, and the open source driver, also running Gnome and Xscreensaver.
I took pictures of the panic screen, but they are not readable at the resolution specified in the Forum Etiquette thread. I don't know what the preferred protocol is in this case.
I thought of disabling Xscreensaver; a colleague suggested trying the proprietary ATI driver instead of the open source one. Does anybody have ideas for other things I can look at or try?
thx,
-Brad
Offline
@BAJ716: I'm running a similar machine to yours, an Asus G73JH-A2, albeit with a lesser processor and 2 GB less RAM, and I'd guess that Xscreensaver is your problem. I've never had a kernel panic with the open source drivers, and I'm running 3 monitors. You could also check the relevant logs in /var/log, like "messages.log" or "Xorg.0.log".
Offline
Thanks Dizzy1. I can try disabling xscreensaver for a while and see what happens.
More info, it happened again today while I was at lunch. I went to lunch about noon and I locked my screen before I left. My Xscreensaver time settings are Random Mode, Blank after 40min, Cycle after 10min, Lock screen after 20min.
Here are the relevant entries from my /var/log/messages.log:
http://pastebin.com/12FhmkLj
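For anyone else wading through such logs: a grep along these lines pulls out just the GPU-related kernel messages. This is a sketch; the patterns are illustrative, and a made-up three-line sample stands in for /var/log/messages.log so it runs anywhere:

```shell
#!/bin/sh
# Filter GPU-related kernel messages out of a syslog-style file. On a real
# Arch box you would grep /var/log/messages.log instead of the sample below.
log_sample='Apr 26 12:01:01 host kernel: [drm] GPU lockup detected
Apr 26 12:01:01 host crond[123]: job started
Apr 26 12:01:02 host kernel: radeon 0000:01:00.0: GPU softreset'

matches=$(printf '%s\n' "$log_sample" | grep -icE 'radeon|drm|lockup|panic')
printf '%s\n' "$log_sample" | grep -iE 'radeon|drm|lockup|panic'
echo "matched $matches lines"
```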
Offline
Just a small question:
Does it still (or now) make sense to use this repo for an HD 3470?
Speed, power management, OpenGL extensions...?!
Thanks for any answer...
Offline
Hi Perry,
do you plan to add the "--enable-texture-float" flag to the mesa PKGBUILD?
Offline
I tried to compile mesa-full-gallium and got:
configure: error: LLVM is required to build Gallium R300 on x86 and x86_64
==> ERROR: A failure occurred in build().
Aborting...
So LLVM is now required to compile; can you add it to the PKGBUILD depends?
Offline
I set both variables, and glxgears/glxinfo respect this and render at ~1900fps.
But ioquake3 doesn't: I can see that after launching "demo four" the framerate jumps to ~140fps for some seconds, drops to 60fps, jumps to ~140fps, then drops to 60fps and stays there.
I got 64fps as the final result from "demo four" - not 60fps, not 140fps (my regular result):
[peter@cupcake ~]$ echo $vblank_mode
0
[peter@cupcake ~]$ echo $CLUTTER_VBLANK
none
[peter@cupcake ~]$ glxinfo | grep -i opengl
ATTENTION: default value of option vblank_mode overridden by environment.
ATTENTION: default value of option vblank_mode overridden by environment.
OpenGL vendor string: X.Org
OpenGL renderer string: Gallium 0.4 on AMD REDWOOD
OpenGL version string: 2.1 Mesa 7.10.1
OpenGL shading language version string: 1.20
OpenGL extensions:
ATTENTION: default value of option vblank_mode overridden by environment.
ATTENTION: default value of option vblank_mode overridden by environment.
[peter@cupcake ~]$ glxgears
ATTENTION: default value of option vblank_mode overridden by environment.
ATTENTION: default value of option vblank_mode overridden by environment.
8474 frames in 5.0 seconds = 1694.693 FPS
8367 frames in 5.0 seconds = 1671.786 FPS
Looks right. But the framerate in IOQuake3 jumps from ~140 to ~60 and stays there.
I have now tested Gnome3 in fallback mode, with Metacity & compositing, so VSYNC is completely off: IOQuake3 is stable above 60 FPS, average 160 FPS!
With Gnome3 and Clutter/Mutter the VSYNC setting is not fully respected. VSYNC seems to be partially disabled for seconds at a time, giving an average of 69 FPS, but most of the time it is fixed at 60fps.
Can anyone confirm? I will open a bug upstream at GNOME Bugzilla, because it seems not to be a driver bug.
Last edited by hoschi (2011-05-07 11:54:03)
Offline
Hi,
I am planning to buy a new graphics card and am thinking about the HD 6450 (I don't need extra performance). Does anyone know the current state of xf86-video-ati support for RV910 chips (HD 6450)?
Thanks
Offline
Hi Perry,
do you plan to add the "--enable-texture-float" flag to the mesa PKGBUILD?
I thought about it, but I don't know if it is legal to enable it for my repo.
Even on Phoronix I didn't get a clear answer.
Offline
Is this one of these "software patent" issues? Isn't the host server located in Germany?
Well, under European and German law, software patents are forbidden.
To be honest:
Of course the EPO itself violates this from time to time, dividing patents into software patents with a technical invention and plain software patents (never granted). The first kind are sometimes granted, sometimes not, and sometimes declared invalid in court; it changes with political opportunity and the desk of the responsible patent examiner.
Offline
https://bugzilla.gnome.org/show_bug.cgi?id=650096
VSYNC-OFF not working
Offline
Does the game Amnesia work with yesterday's update of the radeon repo? I have an ATI HD 4850 and it uses Gallium 0.4. Is there any website about Gallium and video game compatibility?
Offline
Huh, heh. I was just about to ask why the latest packages at http://spiralinear.org/perry3d/x86_64/ were last built on April 25... Today there's a 05/12 build for most of them...
Offline
Good news, Amnesia seems to work ^^ http://wiki.x.org/wiki/RadeonProgram
Offline
Hey,
sorry if it's been discussed already, but I cannot find it.
Has anyone been able to use XvMC with the Gallium driver yet?
Thanks!
Offline
Has anyone managed to launch Amnesia? Here http://wiki.x.org/wiki/RadeonProgram it says that it works perfectly (Platinum) with mesa 7.11-dev and libtxc_dxtn. I have mesa-full from this repo and installed libtxc_dxtn, but it doesn't work at all! Please help me.
Edit: libtxc_dxtn is installed and enabled in driconf but doesn't seem to work. Did the directories change with the update? How do I make libtxc_dxtn work?
Last edited by splashy (2011-05-15 16:51:05)
Offline
Hi!
I'm on an HD 5450 using xf86-video-ati(-git), where Gallium is currently the default. Although I don't use any 3D whatsoever, since I don't play games or use compositing etc., I was still curious to know the glxgears fps value of my card with the FOSS drivers...
So I ran 'vblank_mode=0 glxgears' (or exported the env first); glxgears then states that the vblank env is being overridden, but I still get just 60fps, as with vsync enabled!
I have a 23-inch screen running 1920x1080, and if I use my manual tiling WM to split the screen into 4 equally big parts and run 'vblank_mode=0 glxgears' in one of those smaller frames, I get 300fps.
I then thought that my card couldn't manage more than 60fps fullscreen at 1920x1080, but then I read the following quote elsewhere:
"It does work as an env so vblank_mode=0 ./gears works - but I still get wait
for vline sync which means that fullscreen games or gears maximised gets capped
to refresh."
So could you guys please tell me, if you know, what this "wait for vline sync" is in addition to standard vsync, and how to disable it momentarily, just for testing?
I have also tried different ~/.drirc's which disable vblank, to no avail...
Thanks in advance.
Edit: [SOLVED]
Solution from Alex Deucher:
https://bugs.freedesktop.org/show_bug.cgi?id=...
--- Comment #31 from Alex Deucher <agd...@yahoo.com> 2011-02-21 13:51:51 PST ---
vline waits for non-pageflipped swap buffers can be disabled in both radeon and
intel with:
Option "SwapbuffersWait" "False"
in the device section of your xorg.conf
Source: http://web.archiveorange.com/archive/v/ … tNYFsp6qMR
Btw, for some background info on the above setting, here's an extract from the commit message:
From 28453a25c7dd02b1591051c26c1c063c9f1928f4 Mon Sep 17 00:00:00 2001
From: Mario Kleiner <mario.kleiner@tuebingen.mpg.de>
Date: Mon, 22 Nov 2010 04:11:07 +0100
Subject: [PATCH 3/4] ddx/ati: Add option "SwapbuffersWait" to control vsync of DRI2 swaps.
A new optional kms driver option "SwapbuffersWait" is defined
for xorg.conf, which defaults to "on". If "on", DRI2 bufferswaps
will be synchronized to vsync, otherwise not.
This currently only affects copy-swaps, not pageflipped swaps.
It also requires a swap_interval setting of zero by the OpenGL
client.
Ideally, we'd provide a way for dri2 to pass the current swap
interval to the ddx so we could change this dynamically.
Signed-off-by: Mario Kleiner <mario.kleiner@tuebingen.mpg.de>
---
man/radeon.man | 11 +++++++++++
src/radeon.h | 6 +++++-
src/radeon_dri2.c | 4 +++-
src/radeon_kms.c | 6 ++++++
4 files changed, 25 insertions(+), 2 deletions(-)
diff --git a/man/radeon.man b/man/radeon.man
index f442532..485c393 100644
--- a/man/radeon.man
+++ b/man/radeon.man
@@ -310,6 +310,17 @@ Color tiling will be automatically disabled in interlaced or doublescan screen m
.br
The default value is
.B on.
+.TP
+.BI "Option \*qSwapbuffersWait\*q \*q" boolean \*q
+This option controls the behavior of glXSwapBuffers and glXCopySubBufferMESA
+calls by GL applications. If enabled, the calls will avoid tearing by making
+sure the display scanline is outside of the area to be copied before the copy
+occurs. If disabled, no scanline synchronization is performed, meaning tearing
+will likely occur. Note that when enabled, this option can adversely affect
+the framerate of applications that render frames at less than refresh rate.
+.IP
+The default value is
+.B on.
Source: http://people.freedesktop.org/~agd5f/pf … ync-.patch
Last edited by mhertz (2011-05-16 22:33:05)
Offline
Has anyone managed to launch Amnesia? Here http://wiki.x.org/wiki/RadeonProgram it says that it works perfectly (Platinum) with mesa 7.11-dev and libtxc_dxtn. I have mesa-full from this repo and installed libtxc_dxtn, but it doesn't work at all! Please help me.
Did you follow the instructions for the Gallium driver in the first post? Amnesia works pretty well for me using r300g. Error messages/more details would be helpful.
You can check if libtxc_dxtn installed correctly with:
glxinfo | grep s3tc
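Spelled out a little more, the check looks like this. A sketch only: the hardcoded extensions line stands in for real glxinfo output, so on your machine pipe glxinfo in instead:

```shell
#!/bin/sh
# With libtxc_dxtn working, glxinfo lists s3tc extensions such as
# GL_EXT_texture_compression_s3tc. A sample line stands in for real output.
extensions="GL_ARB_texture_compression GL_EXT_texture_compression_s3tc GL_S3_s3tc"

if printf '%s\n' "$extensions" | grep -q s3tc; then
  echo "s3tc available"
else
  echo "s3tc missing - check that libtxc_dxtn is installed"
fi
```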
Offline
Edit: [SOLVED]
Solution from Alex Deucher:
https://bugs.freedesktop.org/show_bug.cgi?id=...
--- Comment #31 from Alex Deucher <agd...@yahoo.com> 2011-02-21 13:51:51 PST ---
vline waits for non-pageflipped swap buffers can be disabled in both radeon and
intel with:
Option "SwapbuffersWait" "False"
in the device section of your xorg.conf
Source: http://web.archiveorange.com/archive/v/ … tNYFsp6qMR
Thank you so much!
So much!
/etc/X11/xorg.conf.d/10-vsync.conf
Section "Device"
Identifier "Radeon 5650" # change to your card
Option "SwapbuffersWait" "False"
EndSection
Together with:
/etc/profile.d/radeon.sh
# export LIBGL_DRIVERS_PATH=/usr/lib/xorg/modules/dri_g/
# export R600_ENABLE_S3TC=1
export vblank_mode=0
export CLUTTER_VBLANK=none
Gives me a stable 140 fps in IOQuake3!
The big question beyond all:
Why is this NOT necessary with Metacity (Compositor ON) in Gnome2 and Gnome3?
Offline
Now s3tc works after adding export R600_ENABLE_S3TC=1 to the radeon.sh file, but Amnesia now hangs on the loading screen... s3tc works and I can hear the menu sounds, but the loading screen stays frozen no matter which video settings I use.
Offline
Ha, after installing kernel26-drm-radeon-testing, I added it to GRUB and fixed an issue in Amnesia that made my saves impossible to load (I created the file /etc/openal/alsoft.conf and put in drivers=pulse,alsa). The problem was that I don't use PulseAudio (and still don't use this èçà__#$$), but Amnesia's OpenAL kept searching for it. Now Amnesia works very well; with settings maxed out it works like the proprietary driver.
There are some glitches on the lights (white circles sometimes appear on them). I know it's a detail, but for Amnesia, a horror game, it's very important for the immersion. Did you guys encounter it, and how do you fix it? Is it still there with kernel 2.6.39 (which is supposed to be better for s3tc)?
Offline
Thank you so much!
So much!
You're most welcome, mate!
The big question beyond all:
Why is this NOT necessary with Metacity (Compositor ON) in Gnome2 and Gnome3?
Here's a quote from xorg ati driver developer agd5f which I believe explains it (the answer is in the 4th paragraph; I quoted his whole post):
Another thing to keep in mind is that the GLX "vsync" extensions have nothing to do with tearing. They merely provide synchronization events which apps can synchronize with to render at a fixed frame rate. To avoid tearing (updating the visible part of the framebuffer while scanout is running) you need to wait for the vertical blanking period to copy new content to the visible area, or wait until the current vline (currently refreshing vertical line) is past the area of the screen that you want to update.
The GLX "vsync" extensions are supported via mesa and the drm. They are implemented via vertical blank interrupts and frame counters. These provide the events apps can synchronize with.
To avoid tearing, we wait for the current vline to get past the rendering area before writing new data to the front buffer. This is done for both GL buffer swaps and Xv (when rendering to the visible framebuffer).
If you are using a compositing manager, Xv windows are redirected to an offscreen buffer so they do not wait for the vline before rendering. The compositing manager copies the content to the visible buffer when it composites the front buffer. As such, it might still tear unless that copy waits for vline as well. When the compositing manager uses GL, you get this automatically (as per the above paragraph). When it does not use GL, you can still get tearing. In the radeon driver, you can force all operations that touch the visible buffer to wait on the vline using the EXAVSync option, however, this has a noticeable impact on performance as you spend a lot of time waiting.
To make this all work, you need support in all 3 pieces of the stack (ddx, mesa, drm).
Source: http://phoronix.com/forums/showthread.p … post141645
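For reference, the EXAVSync option agd5f mentions goes in the same Device section as SwapbuffersWait. A minimal sketch, assuming the radeon driver; the Identifier must match your own card's Device section, and the performance cost he describes applies:

```
Section "Device"
    Identifier "Radeon"           # use your card's existing identifier
    Option     "EXAVSync" "on"    # wait for vline on EXA operations; slower
EndSection
```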
Offline
I got a ~200 fps speed increase from the last upgrade, on both the classic & Gallium stacks (slightly less on Gallium).
That's a first for months.
R600 series.
I used to be surprised that I was still surprised by my own stupidity, finding it strangely refreshing.
Well, now I don't find it refreshing.
I'm over it!
Offline
Has anybody tried .39, or the 3.0 kernel? Has power saving improved? The main reason I use catalyst is that I can't get no satisfaction from the OSS drivers regarding power saving (I've tried every method!). It's a Mobility HD 3650; if anyone has positive experiences, please share.
With catalyst my GPU clock is at 110MHz 90% of the time and the memory clock is at 400MHz, but with radeon they are both at the highest values all the time. I know it's a known issue, but I must ask.
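For what it's worth, the radeon KMS driver of that era exposes profile-based power management through sysfs. A sketch of the interface, using a scratch directory in place of the real /sys/class/drm/card0/device path so it runs anywhere; the real writes need root and a recent enough kernel:

```shell
#!/bin/sh
# Demonstrate the radeon profile power-management knobs. $dev stands in for
# /sys/class/drm/card0/device; on real hardware write to that path as root.
dev=$(mktemp -d)

echo profile > "$dev/power_method"   # static profiles instead of "dynpm"
echo low > "$dev/power_profile"      # lowest clocks; others: mid, high, default, auto

echo "method=$(cat "$dev/power_method") profile=$(cat "$dev/power_profile")"
```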
Offline