Hi,
I'm currently having some problems with the mplayer package. To put it
precisely: it completely freezes my system and I have no option but to
shut the computer down. So I compiled the same version from source,
without any optimizations, and it runs just fine.
$ mplayer --version (this is the Arch-version)
MPlayer 1.0pre7-3.4.3 (C) 2000-2005 MPlayer Team
CPU: Intel Celeron 2/Pentium III Tualatin (Family: 6, Stepping: 1)
Detected cache-line size is 32 bytes
CPUflags: MMX: 1 MMX2: 1 3DNow: 0 3DNow2: 0 SSE: 1 SSE2: 0
Compiled with runtime CPU detection - WARNING - this is not optimal!
To get best performance, recompile MPlayer with --disable-runtime-cpudetection.
Why all these optimizations? Why not listen to the Mplayer developer's
explicit instructions?
Cheers
What are you talking about? It's not optimized, in fact, it's using runtime CPU detection to detect your CPU's capabilities.
Here's mine, using the same stock Arch mplayer:
[karma@ailen ~]$ mplayer
MPlayer 1.0pre7-3.4.3 (C) 2000-2005 MPlayer Team
CPU: Advanced Micro Devices (Family: 8, Stepping: 0)
Detected cache-line size is 64 bytes
CPUflags: MMX: 1 MMX2: 1 3DNow: 1 3DNow2: 1 SSE: 1 SSE2: 1
Compiled with runtime CPU detection - WARNING - this is not optimal!
To get best performance, recompile MPlayer with --disable-runtime-cpudetection.
Ailen:
Kernel: Linux 2.6.14-rc4-ck1 #1 PREEMPT
Built on: Mon Oct 17 14:51:37 CEST 2005
Hardware: Mobile AMD Sempron(tm) Processor 2800+ AuthenticAMD
WM: E17 snapshot 20051016
AFAIU the runtime CPU detection has been enabled explicitly for the Arch package, as my own source compilation doesn't show this WARNING:
$ mymplayer --version (compiled from source)
MPlayer 1.0pre7-3.4.3 (C) 2000-2005 MPlayer Team
CPU: Intel Celeron 2/Pentium III Tualatin (Family: 6, Stepping: 1)
Detected cache-line size is 32 bytes
CPUflags: MMX: 1 MMX2: 1 3DNow: 0 3DNow2: 0 SSE: 1 SSE2: 0
Compiled for x86 CPU with extensions: MMX MMX2 SSE
I only did
./configure --enable-gui --with-codecsdir=/usr/lib/win32/ --prefix=/home/joe/mybin/mplayer/
Maybe it has something to do with my specific CPU?
$ cat /proc/cpuinfo
processor : 0
vendor_id : GenuineIntel
cpu family : 6
model : 11
model name : Intel(R) Celeron(TM) CPU 1133MHz
stepping : 1
cpu MHz : 1130.036
cache size : 256 KB
physical id : 0
siblings : 1
fdiv_bug : no
hlt_bug : no
f00f_bug : no
coma_bug : no
fpu : yes
fpu_exception : yes
cpuid level : 2
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 sep mtrr pge mca cmov pat pse36 mmx fxsr sse
bogomips : 2236.41
However, my question is: why is it necessary to enable specific
configure flags for a generic package, i.e. a package that will be
used by thousands of users? If a user has special needs, he'll
probably make his own package. I'd suggest sticking with the configure
script of the original developer.
The mplayer package is compiled with these options:
./configure --prefix=/usr --enable-gui --enable-linux-devfs --disable-arts
--enable-runtime-cpudetection --disable-smb --enable-sdl --enable-x11
--enable-theora --with-win32libdir=/usr/lib/win32 --confdir=/etc/mplayer
--enable-external-faad --enable-tv-v4l --enable-tv-v4l2
The --enable-runtime-cpudetection flag is used to check what kind of i686 CPU you have in order to load the correct CPU flags; for example, the 3DNow flags are for AMD CPUs only. It is needed because users have different i686 CPUs.
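To make that concrete, here's a small shell sketch of the capability bits runtime detection looks at. It reuses the flags line from the Celeron's /proc/cpuinfo quoted in this thread; on a live system you'd read `grep '^flags' /proc/cpuinfo` instead. Everything beyond that flags line is illustrative.

```shell
# Which SIMD extensions does this CPU advertise? Runtime CPU detection
# queries the same capability bits (via CPUID) when mplayer starts.
# Flags line copied from the Celeron's /proc/cpuinfo shown in this thread:
flags="fpu vme de pse tsc msr pae mce cx8 sep mtrr pge mca cmov pat pse36 mmx fxsr sse"
for ext in mmx sse sse2 3dnow; do
    case " $flags " in
        *" $ext "*) echo "$ext: supported" ;;
        *)          echo "$ext: not supported" ;;
    esac
done
```

On this Celeron it reports mmx and sse as supported and sse2/3dnow as not, which matches the CPUflags line mplayer itself printed above.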
However, my question is: why is it necessary to enable specific
configure flags for a generic package, i.e. a package that will be
used by thousands of users? If a user has special needs, he'll
probably make his own package. I'd suggest sticking with the configure
script of the original developer.
The Mplayer configuration script detects your CPU and enables CPU-specific optimizations by default. If you compile Mplayer without --enable-runtime-cpudetection, the resulting executable will run only on machines that have exactly the same CPU as you do. Of course, this is not a problem if you intend to run the binary only on your machine, but creating a package that will run on different architectures (Intel vs. AMD in the Arch case) requires the use of the --enable-runtime-cpudetection flag. The price to pay is a small loss of performance.
The --enable-runtime-cpudetection flag is used to check what kind of i686 CPU you have in order to load the correct CPU flags; for example, the 3DNow flags are for AMD CPUs only. It is needed because users have different i686 CPUs.
Why should it be necessary to detect the CPU at "runtime"? Why not at compile time?
If you compile Mplayer without --enable-runtime-cpudetection, the resulting executable will run only on machines that have exactly the same CPU as you do.
I hope, for your own sake, that you know that you're talking *bull* )
Have a nice day
Because it's either runtime CPU detection or generic optimizations. Unlike most apps, media apps may gain a real performance-per-clock-cycle boost from optimization, so detecting at runtime is the best way unless you plan to package a different version for each processor type.
In reference to your problem: many people have used this package without trouble. If you managed to fix the issue by building the package yourself then I don't see the problem.
edit: fixed a typo :oops:
I hope, for your own sake, that you know that you're talking *bull* )
Have a nice day
He's right. If I were the packager of MPlayer and I compiled the package on my AMD laptop, it would use every optimization as I copy-pasted earlier. Your Celeron couldn't run it because of the missing instructions.
iBertus is right as well. If it works after rebuilding, case closed.
Because it's either do runtime CPU detection of generic optimizations.
Sorry, could not parse that.
Unlike most apps, media apps may gain a real performance-per-clock-cycle boost from optimization, so detecting at runtime is the best way unless you plan to package a different version for each processor type.
Not trying to sound like a multimedia-expert (which I'm definitely
not), but: what makes you believe that runtime-cpu-detection improves
performance?
If you managed to fix the issue by building the package yourself then I don't see the problem.
You're kidding me, aren't you?
I install a standard Arch-app. I run it and my computer
freezes/crashes. I compile the source version of the same program and
it runs flawlessly.
And you don't see the flaw?
Additionally: When running
$ mplayer --version
the predefined warning/error-message (as defined by the original
developer of the program) explicitly states that the program has been
compiled with sub-optimal settings (--enable-runtime-cpudetection). It
says:
"Compiled with runtime CPU detection - WARNING - this is not optimal!"
How clear can it get?
BTW: I'm not accusing anybody. I'm just reporting a behaviour that
seems buggy to me.
He's right. If I were the packager of MPlayer and I compiled the package on my AMD laptop, it would use every optimization as I copy-pasted earlier.
Well, what you really said before was:
What are you talking about? It's not optimized
:-) Get yourself together ;-)
Ok, different CPUs have different capabilities and extra instruction sets they can understand. The most common ones being: MMX, MMX2, 3DNow, 3DNow2, SSE, and SSE2, although there are more.
When a program uses these, in particular multimedia programs, it will cause a definite improvement in performance.
Arch is designed to run on anything from a Pentium 2 up. A Pentium 2 does not support all these extra options.
If we compiled Mplayer with all the extra cpu options enabled, then it would not run on that Pentium 2.
If we compiled it with no extra options, it would run on anything, but not as well as it could.
So the mplayer developers created an option, so that Mplayer could detect automagically when it's run, which options it needs for the CPU on the computer it runs on.
They say it's not optimal because it delays mplayer's startup time, and in very rare circumstances, such as yours -- which is the first time I have ever seen it -- mplayer makes a mistake in its detection and does not run.
All major distributions compile mplayer like this. Mandrake etc.
This is the optimal configuration for a distributed mplayer package.
If you dislike it, feel free to use ABS, as this seems to work for you
iphitus
If we compiled it with no extra options, it would run on anything, but not as well as it could.
Understood. I tend to agree. It's just that the apparent reason for
the bug (runtime-cpudetection) doesn't seem to make too much sense
anyway. Again: I'm not saying that this *is* the bug or that this
setting is superfluous per se. Maybe the maintainer will clear this up.
I've contacted him.
So the mplayer developers created an option, so that Mplayer could detect automagically when it's run, which options it needs for the CPU on the computer it runs on.
As I said before: I don't see the benefits for a generic i686 package
...
All major distributions compile mplayer like this. Mandrake etc.
Uh, well, then I'm going to have to tell you that this was the *first*
time I had to install it from source. And I'm distro-hopping just as
much as everybody else.
This is the optimal configuration for a distributed mplayer package.
Hm, that's what I wanted to figure out. (Mind you: I'm not trying to
dis the Mplayer-maintainer) As far as I understand it's up to the
maintainer of the package how it's going to be built. IOW: (worst
case scenario) some l33t kidd0 decides upon the functionality of my
system. Kinda scares me off.
As I wrote earlier, mplayer is compiled with a lot of options/features (see official PKGBUILD):
./configure --prefix=/usr --enable-gui --enable-linux-devfs --disable-arts
--enable-runtime-cpudetection --disable-smb --enable-sdl --enable-x11
--enable-theora --with-win32libdir=/usr/lib/win32 --confdir=/etc/mplayer
--enable-external-faad --enable-tv-v4l --enable-tv-v4l2
It's possible that your problem comes from something other than runtime-cpudetection.
It's possible that your problem comes from something other than runtime-cpudetection.
Thank you, Cpt. Obvious ;-)
As I stated before:
I'm not saying that this *is* the bug or that this setting is superfluous per se. Maybe the maintainer will clear this up. I've contacted him.
iphitus wrote: So the mplayer developers created an option, so that Mplayer could detect automagically when it's run, which options it needs for the CPU on the computer it runs on.
As I said before: I don't see the benefits for a generic i686 package
...
I detailed the benefits: a video playing on a Pentium 4 CPU, with SSE and SSE2 in Mplayer, will play much better than Mplayer without SSE.
The two choices are, either,
to compile a package, with no benefits, that will run on all CPUs that Arch supports.
Or to compile a package that automatically detects and uses the benefits for your CPU and runs on all CPUs Arch supports. Therefore someone's Pentium 4 will be able to use SSE, and someone's AMD will be able to use 3DNow, and my Pentium 2 will run just fine and not use any.
Arch chose the latter, it's the fairest for all.
iphitus
ps: Please calm down; calling people Cpt Obvious, and the maintainer a l33t k1dd0, will not get you anywhere.
I detailed the benefits: a video playing on a Pentium 4 CPU, with SSE and SSE2 in Mplayer, will play much better than Mplayer without SSE.
I never questioned this. But: Arch defines itself as an i686-optimized
distro. Not a P3-something-plus-tax distro. Therefore all optimizations
should work on *all* i686-based machines, wouldn't you agree?
The two choices are, either,
Obviously these are not the only two choices ;-)
to compile a package, with no benefits, that will run on all CPUs that Arch supports. Or to compile a package that automatically detects and uses the benefits for your CPU
Detection? Why? Sounds GREAT *if* the maintainer is able to test and
backtrack with *all* (with reservations) possible i686 CPUs.
Seems to me that you just got fond of this "detection" issue simply
due to the fact that I can reproduce a bug that is (seemingly) based
on this detection feature. I've never read anything about "detection"
in the docs (I'm an Arch-Newbie, though).
However, in short: AFAIU, Arch is supposed to provide packages that
run on *all* i686 (based) systems.
and runs on all CPUs Arch supports. Therefore someone's Pentium 4 will be able to use SSE, and someone's AMD will be able to use 3DNow, and my Pentium 2 will run just fine and not use any.
Arch chose the latter, it's the fairest for all.
"Arch chose the latter"? What do you mean by that? That Arch settles
on the least common denominator? Obviously not. My Mobile Celeron is
(though it's aged) fairly high on the i686-family ladder. Still, the
Mplayer package managed to break it.
ps: Please calm down; calling people Cpt Obvious, and the maintainer a l33t k1dd0, will not get you anywhere.
Ooops, I did it again.
My sincere apologies, if I disrespected someone. Honest.
My harsh sense of humor seems to turn people off.
I didn't mean it that way. Sorry, folks.
Arch is a bleeding-edge, user-tested distro without the huge number of devs and maintainers employed by the likes of Mandrake and Red Hat. We cannot test every possible chipset/CPU combo. Media players are hard on the hardware, and this is especially true for older hardware. I'd say that if anything would crash a system, it'd be this.
Arch packages are built to run on any i686 chip -- this is not the issue here. Also, nobody is saying that runtime CPU detection is without problems -- it's not! It works for most people and seems to break the package for a small minority. Arch solves problems like this by having the easiest package management system in the world when it comes to custom package builds. Just run abs, change the PKGBUILD, and run makepkg.
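A minimal sketch of that ABS workflow, assuming the 2005-era tree layout (something like /var/abs/extra/multimedia/mplayer -- check your own tree): copy the build directory, strip the flag from the PKGBUILD's configure line, then makepkg. The demo below applies the edit to a stand-in configure line so the sed effect is visible; the paths and the fragment are illustrative, not the real PKGBUILD.

```shell
# Demonstrate the PKGBUILD edit on a stand-in configure line.
# (In practice: cp -r /var/abs/extra/multimedia/mplayer ~/build, cd there,
# make this edit, then run makepkg and install with pacman -U.)
mkdir -p /tmp/mplayer-custom && cd /tmp/mplayer-custom
cat > PKGBUILD <<'EOF'
./configure --prefix=/usr --enable-gui --enable-runtime-cpudetection --enable-sdl
EOF
# Drop the runtime-detection flag so configure optimizes for the build host:
sed -i 's/ --enable-runtime-cpudetection//' PKGBUILD
cat PKGBUILD
# Prints: ./configure --prefix=/usr --enable-gui --enable-sdl
```

Note that a binary built this way is tuned to the machine it was built on, which is exactly the trade-off discussed in this thread.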
It is not the intent/purpose/policy of Arch to try to be Mandrake. I'm glad you feel like making a contribution to the testing/bug reporting of Arch packages; however, you may have to accept the fact that making changes in this instance isn't in the best interests of most Arch users.
I don't know whether you realise this, but this is called a bug.
It's an upstream issue in mplayer, not the Arch devels' fault.
The Arch packager had no knowledge of this bug. And you're the first to report it. And we still don't even know if it's the CPU detection causing the bug.
Those optimisations aren't available for the older processors because they are incapable of handling SSE or 3DNow instructions. Mplayer will only use them if the CPU supports them.
I'd prefer the current situation with the detection, rather than crippling mplayer for newer computers by not enabling any.
iphitus
It's an upstream issue in mplayer, not the Arch devels' fault.
Sorry, I'm not a native anglophone: what's an "upstream issue"?
I'd prefer the current situation with the detection, rather than crippling mplayer for newer computers by not enabling any.
Pardon? I must have misunderstood you. "current approach"? You're
still opting for the buggy approach?
BTW: "by not enabling any"? By not enabling what? What is there that
can only be enabled at runtime?
*puzzled*
There is no *bug* in the Arch package but rather an issue with Mplayer itself! The Arch package maintainer cannot fix this issue and you should report your findings to the Mplayer people. The point is that MOST people have no problem with the package and that MOST people benefit from having runtime detection as opposed to a non-optimized binary.
As has been said before, either you enable runtime detection or you optimize for each CPU or you don't optimize at all. The latter two are out of the question.
iphitus wrote: It's an upstream issue in mplayer, not the Arch devels' fault.
Sorry, I'm not a native anglophone: what's an "upstream issue"?
An issue in mplayer itself -- in the original ("upstream") project -- rather than something caused by the Arch developers.
iphitus wrote: I'd prefer the current situation with the detection, rather than crippling mplayer for newer computers by not enabling any.
Pardon? I must have misunderstood you. "current approach"? You're
still opting for the buggy approach?
BTW: "by not enabling any"? By not enabling what? What is there that
can only be enabled at runtime?
*puzzled*
What you call the buggy approach is a reliable approach that has never caused any trouble for any user in the years it's been in use. Really, stop calling it buggy; you know not what you speak of. I have explained its purpose multiple times and you still fail to understand me and what it does.
I would prefer the current situation with the detection: having mplayer automatically detect, at runtime, which instructions a CPU can handle, rather than disabling it and crippling the build so that no extra instructions are available for more powerful processors.
As has been said, you are the only one with this problem and have yet to provide credible evidence that it is even the detection causing it. So it's unlikely that the Arch package will change.
iphitus
What you call the buggy approach is a reliable approach that has never caused any trouble for any user in the years it's been in use.
Whattt?!? Two postings above you've admitted that it's a bug. Are you
shittin me?
Really, stop calling it buggy,
I will not.
you know not what you speak of.
True that may be.
Telling you I will be that:
An i686 distro I'm looking for.
Binary repositories I'm insisting on.
X.org compiling I'm not willing to.
Dependencies to be resolved I'd like.
A nice day am wishing I
iphitus wrote: What you call the buggy approach is a reliable approach that has never caused any trouble for any user in the years it's been in use.
Whattt?!? Two postings above you've admitted that it's a bug. Are you
shittin me?
No, I called whatever is causing your mplayer to lock up a bug, not the whole implementation.
iphitus wrote: Really, stop calling it buggy,
I will not.
Stubborn
Mods? Lock? Anyone?
iphitus