Excuse me if this is a stupid question, but what are the consequences of a new toolchain? I mean, I understand on a very basic level that the package source has to be built with GCC and glibc into binary packages, and that compiler flags can help optimize a package for specific hardware needs, but how can a package become better/more stable just because GCC and/or glibc has evolved?
If new features are introduced in the toolchain, doesn't a package need to be prepared to use these new functions? And if bugs exist in the current toolchain, shouldn't we all suffer from (inexplicable) instability?
Again, I probably just don't know enough about programming, but I'm interested in learning the 'how' of packages and Linux in general.
Zl.
I'm curious myself: what are the latest big updates to glibc/gcc? I checked their websites but couldn't find anything that important. Maybe somebody can point them out to me.
Last edited by voodoo.child (2007-11-16 17:32:15)
--
Alexandru
Glibc changes are not always visible to the user, but there are usually big changes under the hood. Most packages depend heavily on glibc, so it's always a good idea to rebuild the base system with each glibc update. Distributions like SUSE, Fedora, Debian and Ubuntu all do that. Rebuilding against new GCC versions usually yields better performance and stability.
Another thing is that this sort of operation helps the developers spot packages that no longer compile and fix them along with their PKGBUILDs.
Last edited by hussam (2007-11-16 18:11:54)
To use the new hash style (or other new goodies) with AUR and ABS, do I have to define some LDFLAGS in my makepkg.conf or is it used automatically?
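For what it's worth, here's a minimal sketch of what setting that in /etc/makepkg.conf could look like, assuming a binutils recent enough to support the GNU hash style; the exact value shown is illustrative, not an official default:

```shell
# /etc/makepkg.conf (excerpt) -- illustrative sketch only.
# Pass the GNU hash style to the linker for everything built with makepkg.
# "both" emits the classic .hash section as well as the newer .gnu.hash,
# so binaries stay compatible with older dynamic linkers.
LDFLAGS="-Wl,--hash-style=both"
```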