Age-old question: does compiling from source result in a faster system? Well, Linux Magazine has put Gentoo to the test. It makes an interesting read. I'll let you draw your own conclusions before I give mine.
Enjoy
Offline
My conclusion? Compiling everything from source may result in a faster system, but in my opinion the difference in speed is negligible.
Besides, compile time optimizations and code optimizations give almost no improvement compared to algorithm optimizations. Also, I believe the reason Arch Linux feels so much faster to me than Ubuntu did is because it doesn't start programs that I don't need.
Offline
Arch Linux is the fastest; there is nothing there that says it isn't!
Offline
There was a time when compiling for your specific hardware could make a difference. But not today - unless you're talking about underpowered systems.
Got Leenucks? :: Arch: Power in simplicity :: Get Counted! Registered Linux User #392717 :: Blog thingy
Offline
There was a time when compiling for your specific hardware could make a difference. But not today - unless you're talking about underpowered systems.
This is still true for some apps on 32-bit systems. Having SSE2/3 or not makes a big difference for some applications. Modern 64-bit chips support roughly all the same features though, so it's usually negligible on 64-bit.
Offline
I think if you are doing some serious number crunching, or have server farms making movies, or you're someone like NASA running seriously heavy calculations in virtual environments, then maybe you would see some significant gains. But for most of us, the most we might do is convert one video format to another, and a gain of maybe a minute or so is really not going to outweigh the disadvantage of time wasted compiling.
In my opinion, the real advantage of Gentoo's all-compiled system is its USE flags, which build only what is needed and keep dependencies down, not the optimisation of the code itself. Despite this, I still prefer ready-made binaries (Arch). If I really think an odd package has unnecessary dependencies, I can use ABS.
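For anyone who hasn't used Gentoo: USE flags are set globally in /etc/portage/make.conf. A minimal sketch (the specific flag names here are just illustrative; which flags matter depends on the packages you install):

```sh
# /etc/portage/make.conf (illustrative fragment)
# Enable X and ALSA support globally; disable Bluetooth and KDE
# integration so those optional dependencies are never pulled in.
USE="X alsa -bluetooth -kde"
```

Portage then passes the matching configure options to each package at build time, which is what keeps the dependency tree small.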
I also agree with drcouzelis: Arch Linux's strong point is picking which processes/daemons we have running and not wasting CPU cycles on things we consider unnecessary (e.g. a Bluetooth service we will never, ever use).
Offline
For most of us, we lose more time optimizing than we would just getting the work done with whatever flags we're already using.
Offline
That study only compares -Os, -O2, and -O3. Most general-purpose distros will use -O2. It would be more interesting to know whether compiling specifically for your processor gives a speed boost.
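On Arch, one way to test that yourself is to rebuild a package through ABS after pointing the compiler at your own CPU. A sketch of the relevant makepkg.conf lines, assuming GCC (which accepts -march=native):

```sh
# /etc/makepkg.conf (illustrative fragment)
# Replace the generic architecture baseline with the host CPU, so GCC
# may emit SSE/AVX instructions that the stock packages avoid.
CFLAGS="-march=native -O2 -pipe"
CXXFLAGS="${CFLAGS}"
```

Note that -march=native tells GCC to detect the host CPU, so binaries built this way may not run on older machines.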
Offline
As tavianator said above, the -O flags are not the only factors, especially on 32-bit. Back in 2006, when I was still using Gentoo (actually, two months before switching to Arch), I sneaked into the Ubuntu forums, into a thread about an "i686-optimized Ubuntu", and posted a makeshift benchmark of 'dcraw' which showed with adequate clarity how different flags affect an executable's speed. Of course, 'dcraw' was selected for its potential to benefit from special instructions, as optimizing, say, 'ls' to death wouldn't do any good.
For those interested the post is still there, just search for 'dcraw' in http://ubuntuforums.org/archive/index.php/t-26706.html
Last edited by nous (2010-03-27 19:27:35)
Offline
Age-old question: does compiling from source result in a faster system?
No.
But, if it's fun for you, then go for it.
Offline
Compiling is faster but it's hell even with a package manager.
Personally, I'd rather be back in Hobbiton.
Offline
I used Gentoo for a very long time, and just changed over to Arch on my laptop.
The speed difference is barely noticeable and not worth it.
There may be a difference when doing CPU-intensive tasks, but for generic use it doesn't matter.
The USE flags in Gentoo are useful, though.
Offline
Agreed, and almost the exact same situation as chrisb. I ran Gentoo with every optimization I could think of. Got GNOME installed and running well on a 300MHz computer and could use it fine. Is it faster? Yeah, but the difference is small (at least on that computer). Would you save more time using Arch, rather than compiling and researching optimizations? More than likely. It was a heck of a lot of fun, though.
Setting Up a Scripting Environment | Proud donor to wikipedia - link
Offline
Also, does any performance gain get clobbered by the amount of CPU time spent compiling? I would imagine definitely yes, unless all of the compiling is done while you're asleep.
But yes, Gentoo is cool and fun to try at least once (as is LFS), even though for productive use I'd much rather use Arch. I remember, many years ago when I was using Gentoo on my old computer, I set KDE compiling in the morning, left the house, and came back at night expecting to be able to use the thing... it was still compiling. Bleh. It would probably be better now with faster CPUs, and since g++ got a lot faster, but still... binary packages FTW, if you ask me.
Offline
Compile time has improved a lot over the years, even with increasingly complex packages. Compiling KDE 4 on a recent CPU is quicker than compiling KDE 3 was on a CPU from when it was released.
Offline
Even if you saved 2 hours in speed (from not waiting), you would lose about 48 hours in compiling time
Last edited by cesura (2010-03-30 01:53:50)
Offline
Even if you saved 2 hours in speed (from not waiting), you would lose about 48 hours in compiling time
That presumes that all compiling is done with the user staring at the screen, not doing anything.
My custom kernels compile in the background while I work ('work' being a loose term which includes playing FreeCell), hence the only 'penalty' in terms of time is the time I spend looking through the config options to make sure I haven't inadvertently turned off SATA support.
The big time commitment is initially setting things up (especially with a kernel config); from then on it's not too much, and it's more than justified by the reduction in "oh crap, I have to reboot" moments, not so much the 'less waiting'.
Allan-Volunteer on the (topic being discussed) mailn lists. You never get the people who matters attention on the forums.
jasonwryan-Installing Arch is a measure of your literacy. Maintaining Arch is a measure of your diligence. Contributing to Arch is a measure of your competence.
Griemak-Bleeding edge, not bleeding flat. Edge denotes falls will occur from time to time. Bring your own parachute.
Offline
IMO, after using Gentoo for some time, I don't think that compiling from source gives a significant speed boost in terms of performance in the user environment -- that is, regular user interaction with the computer is not significantly snappier (though some users might claim it is due to the placebo effect). However, I would say that by correctly configuring packages during compile time, you can get fairly significant performance improvements depending on the program. This is done through Gentoo's USE flags. It's not always effective, but I dimly remember a few cases where keeping things minimalistic with the USE flags really improved system performance.
But at the end of the day, Gentoo was just a toy for me, and none of the minor performance gains outweighed the long compile times (the computer I ran it on had an Athlon XP 3000, and compilation usually did waste my time, since it was a single-core processor). Furthermore, there was the frustration of things frequently breaking, which I won't get into because it's not part of this topic.
This article, like many other articles of its kind, suffers from a problem. When you just run a performance test, make a pretty graph, and write "well this was 4% faster on X than on Y," you aren't giving the full story. If you do 1000 tests and show that this is perfectly repeatable and apply statistical methods to tell you not only how much the performance increased, but also how significant it was, then you have given some really useful data and analysis. Most of those results seemed far too close to make any definitive statements, but that's impossible to know until you run tests a few thousand times and look at the distribution of the results.
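jwc's point about significance can be sketched concretely. Below is a minimal bootstrap test on made-up timing numbers (both the data and the "generic vs tuned" labels are purely hypothetical, not taken from the article): only when the confidence interval for the mean difference excludes zero can one claim the speedup is real rather than noise.

```python
import random
import statistics

def bootstrap_mean_diff_ci(a, b, n_boot=5000, seed=42):
    """95% bootstrap confidence interval for mean(a) - mean(b)."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        # Resample each group with replacement and record the mean gap.
        resampled_a = [rng.choice(a) for _ in a]
        resampled_b = [rng.choice(b) for _ in b]
        diffs.append(statistics.mean(resampled_a) - statistics.mean(resampled_b))
    diffs.sort()
    return diffs[int(0.025 * n_boot)], diffs[int(0.975 * n_boot)]

# Hypothetical benchmark timings in seconds (NOT from the article):
generic = [10.4, 10.2, 10.6, 10.3, 10.5, 10.1, 10.4, 10.3]  # stock -O2 build
native = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7]      # CPU-tuned build

low, high = bootstrap_mean_diff_ci(native, generic)
print(f"95% CI for mean difference: [{low:.3f}, {high:.3f}]")
```

If the whole interval sits below zero, the tuned build is faster by more than sampling noise can explain; an interval straddling zero means the pretty graph proves nothing, which is exactly the complaint about results that are "far too close".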
-- jwc
http://jwcxz.com/ | blog
dotman - manage your dotfiles across multiple environments
icsy - an alarm for powernappers
Offline
Also, does any performance gain get clobbered by the amount of CPU time spent compiling? I would imagine definitely yes, unless all of the compiling is done while you're asleep.
I've heard the 'it compiles while I sleep' argument before, but it seems an odd position to me. Binary is so much more convenient; upgrades happen in moments and installations in seconds. No waiting until tomorrow; it's not as if I want to simmer a heavenly chicken stew and eat it the next day, when the flavors have awakened.
Offline
itsbrad212 wrote:Even if you saved 2 hours in speed (from not waiting), you would lose about 48 hours in compiling time
That presumes that all compiling is done with the user staring at the screen, not doing anything.
My custom kernels compile in the background while I work ('work' being a loose term which includes playing FreeCell), hence the only 'penalty' in terms of time is the time I spend looking through the config options to make sure I haven't inadvertently turned off SATA support.
The big time commitment is initially setting things up (especially with a kernel config); from then on it's not too much, and it's more than justified by the reduction in "oh crap, I have to reboot" moments, not so much the 'less waiting'.
I usually browse the web while I am compiling things (e.g. Linux From Scratch). However, everything is slow, because the compile takes up about 97% of my CPU power, and it is always maxed out. (I'm actually compiling GCC as we speak.) So, theoretically speaking, you would lose time in how long it takes to process those activities, and you'd probably stab yourself before that anyway, once you realize that Firefox takes 20 seconds to load a page.
Offline
Bralkein wrote: Also, does any performance gain get clobbered by the amount of CPU time spent compiling? I would imagine definitely yes, unless all of the compiling is done while you're asleep.
I've heard the 'it compiles while I sleep' argument before, but it seems an odd position to me. Binary is so much more convenient; upgrades happen in moments and installations in seconds. No waiting until tomorrow; it's not as if I want to simmer a heavenly chicken stew and eat it the next day, when the flavors have awakened.
Hahaha! I suggest we henceforth refer to this as "The Chicken Stew Fallacy in OS Packaging Theory"
Offline
Even if you saved 2 hours in speed (from not waiting), you would lose about 48 hours in compiling time
Correct (plus you get bonus points for being able to use English correctly - "lose" instead of "loose").
I compile from source, not because the resulting executables are faster, but because I'm a CONTROL FREAK, and understand that distro packagers are kiddie idiots doing it solely to try to impress chicks.
Offline