- Movie: "Ghost Rider", MPEG-4, 720 x 480, two-channel audio, 1.2 GiB in size
- Song: "Son of Man", sung by Phil Collins from the Disney movie "Tarzan", MP3 encoded at a 320 kbit/s bit rate
- Europa, a four-year-old 32-bit single-core Athlon XP 3200+ system with an ATI Radeon 9700 Pro and 1 GiB of DRAM running openSUSE 10.2, and
- Rhea, a three-year-old 32-bit single-core Athlon XP 2500+ system with an nVidia GeForce 7600 video card and 512 MiB of DRAM running Ubuntu 7.04.
Again, I stress that this is no "real" test. No hard numbers (specifically throughput) were captured, nor was the experiment repeated. I simply ran the GNOME System Monitor at some point well after the movie started and watched processor and network activity while everything was running. You'll note that somewhere between 40% and 50% of europa's processor was in use. Playing the MP3 had little or no noticeable effect, at least as far as System Monitor could show. Nor did the MP3 have any effect on the movie: no dropped frames, no stutter, no audio distortion. Nearly all of the load appeared to come from video playback; with only the MP3 playing, System Monitor showed usage so low it was indistinguishable from an idle system.
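For anyone who does want hard numbers instead of eyeballing System Monitor, the cumulative counters in /proc/stat on Linux are enough to compute processor utilization between two samples. Here's a minimal sketch; the "cpu" lines below are made-up values purely for illustration, not measurements from either machine.

```python
def cpu_busy_percent(before: str, after: str) -> float:
    """Compute the busy-CPU percentage between two 'cpu' summary lines
    taken from /proc/stat at different times.

    Each line looks like: 'cpu  user nice system idle iowait irq softirq ...'
    All fields are cumulative jiffies; idle is the fourth value (index 3).
    """
    b = [int(x) for x in before.split()[1:]]
    a = [int(x) for x in after.split()[1:]]
    total = sum(a) - sum(b)          # total jiffies elapsed
    idle = a[3] - b[3]               # jiffies spent idle
    return 100.0 * (total - idle) / total

# Two fabricated samples, nominally one second apart:
t0 = "cpu  100 0 50 850 0 0 0 0"
t1 = "cpu  140 0 70 890 0 0 0 0"
print(round(cpu_busy_percent(t0, t1), 1))  # -> 60.0
```

In a real run you would read /proc/stat twice with a sleep in between and feed the first line of each read to the function; logging that in a loop gives a utilization trace you can average, which is exactly the hard data this informal test skipped.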
These are no longer state-of-the-art systems by any stretch, and there is no way either could run Vista. But with contemporary Linux distributions, they are more than capable of soldiering on. Here is a screen capture of System Monitor from rhea.
You can see at the bottom how rhea was pumping the movie out across its network connection until I stopped playback. But note the processor utilization at the top: it held nearly flat at 6% the entire time.
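The network side can be put on the same footing. Linux exposes cumulative per-interface byte counters in /proc/net/dev, so sampling them twice gives actual throughput rather than a graph to squint at. A minimal sketch follows; the interface name "eth0" and the counter values in the sample text are assumptions for illustration, not data from rhea.

```python
def rx_tx_bytes(netdev_text: str, iface: str):
    """Extract the cumulative receive/transmit byte counters for one
    interface from the text of /proc/net/dev.

    After the two header lines, each row is 'iface: <8 receive fields>
    <8 transmit fields>'; bytes are field 0 (receive) and field 8 (transmit).
    """
    for line in netdev_text.splitlines():
        if ":" in line:
            name, rest = line.split(":", 1)
            if name.strip() == iface:
                fields = rest.split()
                return int(fields[0]), int(fields[8])
    raise ValueError(f"interface {iface!r} not found")

# Fabricated sample in the /proc/net/dev layout:
sample = """Inter-|   Receive                           |  Transmit
 face |bytes packets errs drop fifo frame compressed multicast|bytes packets errs drop fifo colls carrier compressed
  eth0: 1000000 800 0 0 0 0 0 0 2000000 900 0 0 0 0 0 0
"""
rx, tx = rx_tx_bytes(sample, "eth0")
print(rx, tx)  # -> 1000000 2000000
```

Reading /proc/net/dev twice a second apart and differencing the transmit counter would have given the movie's streaming rate in bytes per second, the throughput number this write-up admits it never captured.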
For development or consumer use, of what use is Windows any longer?