Saturday, January 28, 2006

Gmail as an inspiration for comedy


This is akin to what I felt when I found the new 'Delete' button on the Gmail header earlier this week, along with the thought, "What the hell took them so long to put up something so obvious?" In the past, to delete you had to mark one or more messages, hit the "More Actions..." drop-down, and select Delete from the list. I can only assume that the original design was meant to enforce the idea that with Google you never had to delete any messages. But there are times when you need to clean out the cruft, and creating filters (as opposed to folders) to separate the good from the bad is a pain in the rear. It's just easier to go in and clean out older messages that no longer have any relevance. Like cleaning out the garage.


So that's what I've been doing all this time


I don't think I fit in the category of "pretty good". But I'd like to think I rise a bit above the "noise and bile" level. At least on some of my posts. The one about Apple probably fits in the "noise and bile" category. But reliving Usenet? Only if I start writing material appropriate for alt.flame.

Friday, January 27, 2006

Dodging Road Apples

First there was the January launch of the Apple/Intel systems (lovingly called MacIntelitosh by TheInquirer as well as many others). We got to hear Jobs hisself announce how the new Intel Macs were 2x to 4x faster than the equivalent PowerPC machines. This, after years of hearing how much faster the PowerPC machines were compared to equivalent Intel-based machines.

Next came MacWorld's analysis of the iMac. They were a little more critical of Apple's claims, showing a 10% to 25% increase, but not the 2x claimed by Apple. That led an enthusiast site, MacSpeedZone, to claim that the new iMac was 2x faster. What's peculiar about the MacSpeedZone piece is its example of why MacWorld's analysis was flawed. They compare a QuickTime encode on a quad G5 with the iMac Core Duo. Let me quote:
When running a QuickTime encode the Power Mac Quad G5/2.5GHz took 84.85 seconds. The Intel iMac Core Duo 2.0GHz took 97.02 seconds
Advantage: Power Mac by 14% .... Nothing to write home about ... Not even keeping up with the clock-speed difference between the two machines

Not convinced .... I wasn't either ... Ok lets try something different. Lets run two encodes at the same time .... just for fun. It is easy to do, just duplicate the file and run the processes concurrently.

What scores did we get?

When running the QuickTime encodes the Power Mac Quad G5/2.5GHz took 86.25 seconds. The Intel iMac Core Duo 2.0GHz took 176.60 seconds
Advantage: Power Mac by 105%

Ok let's get this straight when doing twice the work it only takes the Power Mac with its four processor cores, about 2 extra seconds, but takes the Intel iMac an extra whopping 79+ seconds - almost twice as long as in the single test?

What's wrong with this picture? What's wrong is processor capacity vs processor usage.
No, there's nothing wrong with the picture. Later in the article the author notes that a single Intel core was 87% utilized while the PowerPC core was 42% utilized. When the single cores on both machines were given the two encode tasks, the Intel core hit 100% utilization while the PowerPC only hit 87%. This illustrates to me the inherent superiority of the G5 core over the Intel one. The rest of the MacSpeedZone article is so much poorly written crap. Bottom line: the Core Duo is no better, and is probably worse, than the PowerPC chips. But what the hey. If Jobs says the new MacIntelitosh is better, then it must be.

Finally, there's this interesting article from ZDNet Australia about latent flaws in Mac OS X. Suresec is finding bugs in OS X of the kind that were fixed in other operating systems (read: Windows and Linux) 10 or more years ago. The kind of bugs that, when exploited, allow non-admin users to gain admin rights to the machine. Read the article. Maybe you'll laugh as hard as the Apple developers did when they were told about one such trivial bug. Or maybe not.

I thought at one time that Apple could provide a better alternative to Windows and Linux. Better quality, better security, better hardware. A better overall experience. I had even convinced myself I should switch. But as I've watched this move by Apple from PowerPC to Intel, it's become quite evident that Apple is only interested in style over substance. With what I've read and seen I'd just as soon stick with standard Intel systems and operating systems I already know, such as Linux and Windows. Let those with more money than sense buy Apple. I'll stick with Intel's "boring" partners running other operating systems I know are more secure. I've got too much work to do to play with Apple.

I Remember Challenger

There has been a spate of articles about the Challenger accident of 20 years ago. I'm not going to link to them; you can Google for them on your own. I will, however, mention "7 myths about the Challenger shuttle disaster" by James Oberg. Myth #1 concerns how many people watched the disaster. I was working for Martin Marietta (now Lockheed Martin) at the Lake Underhill facility in east Orlando, Florida. I was outside that January morning with a whole group of engineers and technicians waiting to watch the launch. We were standing on the patio outside the main cafeteria. It was a good clear morning. There was quiet chatter among the waiting group until the launch.

Challenger lifted off as expected, and we watched as its launch trail climbed above the pine trees into the sky. Then about a minute into the flight came the puff of smoke at the top of the trail. There was no more liftoff. We stood outside for about a minute more. There were one or two "What happened?" questions, but no one said anything else. Then we turned around and headed back inside. I started looking for a TV that I could tune to a local channel to check the news. About 15 minutes later all the TVs in the plant started to carry the news (I don't remember which channel). I spent the rest of the day numb.

There followed the investigation and the news stories and the books and the accusations of cover-up. I don't care. All I know is that on that day we lost 7 good astronauts and a very expensive spaceship.

Thursday, January 26, 2006

Java SE 6 (Mustang) delayed until August of this year

Ray Gans, a senior program manager at Sun Microsystems, has written a blog entry about Java 6 titled "Where We Are With the JDK." He lays out a road map leading up to the final release of Java 6, which includes two betas (the first in February, the second in the summer) before the final release in August. The slip from a mid-2006 release to August 2006 is the change that seems to have everyone's attention. I personally don't care. Most folks on the commercial side are still moving over to Java 5, with a very large percentage still on Java 4.2 (or 1.4.2, using the older numbering scheme).

The biggest reason for the delay seems to be the rewrite of the classloader. I had wondered, at the time it was announced, how much testing would take place for such a significant change. And now I know: lots and lots. They must have run into problems already, given the comment in the blog that Sun wants to "address some issues in sensitive areas of the codebase (e.g., the classloader) and want to be certain these changes won't break anyone's code." Rewriting the classloader is not something to be undertaken lightly. When you make such a profound change you'd better test to exhaustion. Some serious bugs must have been filed against this change. Oh well.
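To see why the classloader is such a sensitive area, consider how much third-party code extends it. Below is a minimal sketch, entirely my own and not anything from the JDK or from Sun's rewrite, of the kind of custom ClassLoader that application servers, plugin frameworks, and build tools rely on; the class name and directory layout are made up for illustration. Any change to the JDK's classloading machinery has to keep piles of code like this working, which is exactly why Sun is being cautious.

    import java.io.ByteArrayOutputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;

    // Hypothetical example: loads classes from a directory of .class files,
    // delegating to the parent loader first (the standard delegation model).
    public class DiskClassLoader extends ClassLoader {
        private final File dir;

        public DiskClassLoader(File dir, ClassLoader parent) {
            super(parent);  // keep the parent-delegation chain intact
            this.dir = dir;
        }

        // Called by loadClass() only after the parent loaders fail to find the class.
        protected Class<?> findClass(String name) throws ClassNotFoundException {
            File file = new File(dir, name.replace('.', '/') + ".class");
            try {
                FileInputStream in = new FileInputStream(file);
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                byte[] buf = new byte[4096];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
                in.close();
                byte[] bytes = out.toByteArray();
                return defineClass(name, bytes, 0, bytes.length);
            } catch (IOException e) {
                throw new ClassNotFoundException(name, e);
            }
        }
    }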

Another change that I appreciate and already use is the improved text rendering, especially on flat-panel monitors. I use Java 6 on a Gateway notebook under both Windows XP and Linux (SuSE 10). I use it with NetBeans 5 to get sub-pixel anti-aliased rendering for text in the editor. It's what makes NetBeans 5 tolerable as an editor, and makes it competitive with Eclipse on those platforms.
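For the curious, here's what that looks like at the API level: a small sketch of my own (not NetBeans code, and the class name is made up) of a Swing component explicitly asking Mustang for LCD sub-pixel text anti-aliasing via the new rendering hint. As I understand it, launching with -Dawt.useSystemAAFontSettings=lcd asks for much the same thing globally.

    import java.awt.Graphics;
    import java.awt.Graphics2D;
    import java.awt.RenderingHints;
    import javax.swing.JComponent;

    // Hypothetical component that requests Mustang's LCD (sub-pixel)
    // text anti-aliasing before drawing its text.
    public class LcdTextLabel extends JComponent {
        private final String text;

        public LcdTextLabel(String text) {
            this.text = text;
        }

        protected void paintComponent(Graphics g) {
            Graphics2D g2 = (Graphics2D) g;
            // New in Java 6: sub-pixel anti-aliasing hints for LCD panels.
            g2.setRenderingHint(RenderingHints.KEY_TEXT_ANTIALIASING,
                                RenderingHints.VALUE_TEXT_ANTIALIAS_LCD_HRGB);
            g2.drawString(text, 10, 20);
        }
    }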

I'm satisfied with the delay. It will lead to an even better Java 6 at launch. My only complaint is that the first beta is based on build 59 of the weekly snapshots, while I'm currently using build 68 (released January 19th). I've been using Mustang for about six months, going to the binary snapshot site and picking up the drops as they happen. The Java 6 snapshots have been very robust for some time now, as well as extremely fast. I suppose that if I have to ship something preliminary running against Java 6 I'll use the official beta, but for day-to-day work and development there's no reason for me to stop using the snapshots. I'm excited about Java 6 and look forward to its final release.
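Incidentally, when you're juggling weekly drops, the quickest sanity check that the right build is on your path is to print the runtime version; the build number generally shows up in java.runtime.version. A trivial sketch (class name made up):

    // Prints which Java release and build this JVM actually is -- handy
    // when several Mustang snapshot drops are installed side by side.
    public class WhichJava {
        public static void main(String[] args) {
            System.out.println("java.version         = " + System.getProperty("java.version"));
            System.out.println("java.runtime.version = " + System.getProperty("java.runtime.version"));
            System.out.println("java.home            = " + System.getProperty("java.home"));
        }
    }

Nothing fancy, but it takes the guesswork out of which drop you're actually running.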

Saturday, January 14, 2006

The Problems of Being Dell

First, let's start with this from slashdot, that pillar of incisive technological journalism:
Nine years after Michael Dell said he'd shut down Apple and give the money to the shareholders, Apple has passed Dell in market value, at $72,132,428,843 compared to Dell's $71,970,702,760. Analysts expect Apple to continue to outperform competitors, citing 2006 as 'poised to be the year of both iPod growth and, more importantly, Mac market share gains,' with earnings growing more than 35%.
And then there are some quite strong Dell opinions from AMDZone. In the process of boosting his favorite brand of microprocessors, Chris Tom also takes quite a few swings at all things Dull (Dell) and Intel (including, now, Apple). For example:
Forbes reports on Mac sales dropping. It seems AMD is now perhaps not helping them in that regard. Also I have been quiet about this largely due to how obvious it is, but Steve Jobs is the biggest liar in the computer world on the face of the planet. Now Intel is faster than the IBM G5. Ok. Anyone that believes a thing that comes out of his mouth, including business partners, needs some serious counseling. He has so far surpassed the complete bull we hear out of Dull brass that it is almost unfathomable.

Advanced Micro Devices ... may also prove to be a tough competitor, Stahlman added. "As impressive as the computer-intensive benchmarks offered by Apple might be, there is no way to avoid the fact that Intel's Core Duo processor (aka 'Yonah') is a 32-bit engine that is fundamentally obsolete in a 64-bit x86 world. As AMD ramps to ship its dual-core 64-bit notebook CPU's ... we believe Apple may lose some of its premium luster," the research analyst said.

It should be noted that Chris Tom is a former Dell employee. The details of his former employment and the situation that led to his leaving Dell are best found on his site. But Chris is no neutral bystander to all of this. Far from it. It just incentivizes him to go after the facts. Like a pit bull after a lamb.

So is Dell about to become Apple's roadkill? Are we on the cusp of seeing Apple ascendant over all? I don't think so.

First, I have to agree in principle with Chris' comment that Steve is less than truthful. I've dealt with Apple since the Apple ][ days on through the early Macintosh days. I stopped buying Apple in the mid-80's precisely because of Steve Jobs' antics, and until very recently (because of the hassles of Windows security and Linux installs) I hadn't considered buying Apple again.

When I switched to PCs I first purchased IBM, then started to buy white-box systems, then slipped over to DIY. I went the white-box and DIY route because that was all I could afford at the time. When I thought about buying 'whole' PCs, especially in the late 80's and early 90's, I thought of Compaq or HP, never Dell.

The reason was that Dell's quality was spotty through the 80's. Some models were excellent, while others were just junk. And Dell was always working on the trailing edge of technology. Yes, Dell was cheaper, but you got what you paid for. Then in the 90's Dell started to work on its image and made a very big push into corporate accounts. I started to run into Dell systems everywhere in 1995. They were always big, white, boxy things with never quite enough memory or disk space to scale into real work. But they sold like hotcakes to the bean counters in the company because they were "value" systems, and at the time Dell went out of its way to provide admittedly good customer service and support.

But as the years rolled along, Dell, to increase profitability, off-shored customer service to the point where it's now mediocre at best (especially if you're not a corporate customer), and the systems they move are crippled simply by being exclusively Intel inside. Yes, Dell will decline in front of Apple's onslaught, not because Apple is truly superior but because Dell has become that bad. It is Dell's game to lose, not Apple's to win.

I mentioned earlier that I looked at Apple, primarily because of OS X. After having looked at the MacIntelitosh MacBook Pro, I have to say that it's underwhelming in the face of what you can get from other manufacturers, such as Gateway, for the same amount of money. The Gateway M685 comes with a faster processor, a better graphics system, a better DVD burner... you get the idea.

And what about those lovely Apple iPods? Again, let me speak from personal experience. Neither of my daughters wants an iPod, because they don't want Apple. They grew tired of Apple in school, and they've watched how their friends are limited by their iPods. My oldest doesn't care, having decided a long time ago that she could mix and match tracks and burn her own CD collections from the CDs she purchased at the store. My youngest and I use a Memorex MMP8570. Its key features for us were 512MB of built-in flash (expandable via SD cards), the ability to appear as a USB drive so you can drag and drop tracks to it under Windows (or Linux), and, for me, a built-in FM radio for listening to the news and weather when the hurricanes come roaring ashore. We rip tracks from CDs we own, drop them on the Memorex, and we're good to go. In short, we don't have a need to prove we're "super L337" consumers by buying Apple (or anything else, for that matter).

Update
How Apple Could Mess Up, Again
These days it's hard to find a pundit willing to question Apple Computer's (AAPL) long-term prospects or the calls of its famous CEO, Steve Jobs. After all, Apple's fortunes have been on the rise for nearly a half-decade now, and they seem to be only gaining steam.

That has caused even some of the most devoted skeptics of years past to stop fretting over Apple's future. For years, many felt that Apple's past mistakes were bound to come back to haunt the Cupertino (Calif.) company -- the refusal to license the Mac OS in the 1980s; the stale products, bloated expenses, and management turmoil that hobbled it in the mid-1990s; the software availability and falling market share that plagued it right into the 21st century. These days, with Apple's stock price the talk of Wall St. and its products once again defining techno-chic, all that's a distant memory.

That is, unless you're Clayton M. Christensen, the Harvard professor and author of the seminal 1997 book The Innovator's Dilemma. Christensen, who more recently wrote Seeing What's Next: Using Theories of Innovation to Predict Industry Change, isn't willing to jump on the Apple bandwagon just yet. As well as Jobs & Co. is performing now, Christensen fears that success is built on a strategy that won't stand the test of time.

When you finish the article, make sure to read the pithy comments to the author from the Apple fanboys and -girls at the bottom. And I thought Linux fanboys were virulent.

The NXT Generation - Lego Upgrades Mindstorms

I've played with Legos for a long time, and I've played with the Lego Mindstorms Robotics Invention System (RIS) V1 for nearly three years. It figures that right about the time I find the parts to build my own Legway, Lego announces its next-generation robot kit, the NXT.

First shown at this year's CES in Las Vegas, the Lego NXT is interesting in that it appears Lego is taking the lessons it learned with the original Mindstorms and creating a new robot kit that offers greater end-user satisfaction. There was a TechUser article about Lego Mindstorms (Lego Mindstorms: What Went Wrong?) that attempted to analyze why the first RIS "failed". The article is interesting because there is a link to a very good response from a reader (Andy Toone) who makes a number of alternative points as to why the original RIS has not been successful. I quote parts of the message below.
I can think of another explanation for the problems with Mindstorms sets - the average user experience is not actually as compelling as you'd think... However, the average user has just spent two hundred bucks on a kit and may be disappointed to discover that (a) programming is not as easy as they thought and (b) with only a limited number of motors and sensors, getting the RIS to do something interesting requires an above average level of ingenuity...

Combine software and hardware and the problem is huge. Mindstorm owners must be able to set up their PC, understand programming to some extent, understand the concept of downloading and independent program execution, build models that use a number of mechanical principles to do something effective, and manage a limited number of components to create those models. Before all this can come into play, they must have the imagination, skill and judgment to come up with a project that is suitable to the tools they have to hand. Mindstorms certainly rewards ingenuity, but sadly it demands ingenuity as well.

So, my suggestion is that the average user finds they have spent as much as they would on a games console - but rather than limitless possibilities and a wide variety of pre-packaged experiences, they discover that they are very reliant on Lego to give them further instruction and that even with investment in additional kits, their experience doesn't become significantly deeper. The Mindstorm kit gets put away and having bought in at the pinnacle of the technology, they are unwilling or unlikely to buy other Lego items.

Lego themselves must also be aware that with limited sensors and actuators, the variety of robots they can offer to the user is not as great as they'd like. If you examine the hobby robotics scene on the internet, you'll swiftly discover that if you discount robots that use very sophisticated hardware and/or software, you are limited to relatively simplistic behavior. Without additional specialized components (which FischerTechnik benefits from), Mindstorms kits rarely go beyond the 'programmable buggy' model.
A lot of what Andy writes resonates with my experiences, especially the part about specialized components. It took HiTechnic's EOPD sensor to build the interesting Legway, as well as the BrickOS firmware and the GNU toolchain to program the Legway to use the sensors and provide the necessary behaviors. That goes far beyond the basic capabilities of the vanilla RIS.

It appears that the NXT attempts to address these issues, and perhaps a little more. For example, the NXT Intelligent Brick has a 32-bit processor, more memory, and flash. This immediately brings to mind an ARM-based system along the lines of the Gumstix. Gumstix sells a $99 board no bigger than a stick of chewing gum with a 200MHz XScale ARM processor, 64MB of RAM, and 4MB of flash. It runs Linux. If the NXT Intelligent Brick does indeed have an ARM with sufficient RAM and flash (especially at today's low component costs), and if Lego supports the enthusiast community the way it did with the RIS, then the possibilities for writing good, sophisticated software are excellent.

And there are the new sensors being added to the NXT: rotational sensors on the motors (shaft position encoding), a simple sound sensor, a sonar positional sensor, enhanced touch sensors, and enhanced light sensors that sound a lot like what the HiTechnic sensors provide. Throw in USB (for programming I'm sure) and Bluetooth (for wireless programming or control from a PDA or cellphone) and you have the potential for a very interesting robotics kit that provides considerably more features and capabilities to build with. I could go on repeating what's already been written here by Lego, but I'll leave it to you to follow the link and read on.

Don't get me wrong. I was, and continue to be, very happy with the RIS. I've cobbled together quite a few 'inventions' from the original kit. But it took the GNU toolchain to really get under the hood and take the next interesting steps, and if HiTechnic hadn't stopped making its Lego sensors for a few years, I'd have taken it even further. 2006 is shaping up to be a very exciting year for the robotics hobbyist, especially for me. New kits with sophisticated actuators, sensors, and components from companies such as Lego, Vex, and Robotis are being released. Even HiTechnic is back in business, getting ready to release new versions of its Lego sensors for the NXT. I believe that this year, more than any other, will be the tipping point for inexpensive, accessible robotics experimentation, in much the same way that 1975 was a tipping point for personal computers. It's already started with me using Gumstix and Vex, and it will continue with the Lego NXT. With all that I see coming, it's going to be an exciting next few years, if not decades.

Other Sites of Interest
There's another blog, nxtbot.com, where the author traveled to CES and covered the Lego NXT, among other interesting devices. The two articles on his site cover the software programming environment, which appears far more flexible and sophisticated than the RIS' original system.

Thursday, January 12, 2006

I wanna Robotis

I've been a hobbyist roboticist since I was a kid (along with building Heathkits and other electronic gadgets from Radio Shack parts packs). In those early days I always had to satisfy that itch with little or no money. As a consequence most of my 'robots' tended to look like little more than animated junk. But that was the past, and the present allows me to spend a little more cash on very complete robot kits such as the Robotis Robot Kits.

I just read about Robotis on The Inquirer. I generally don't link to The Inq on principle, but this time it was too hard to resist. According to The Inq, the kits come in two versions, with price tags of $350 and $900. I think I would actually spring for the $900 kit myself. After all, I'm the same guy who plays with his Lego Mindstorms kit as well as the Vex Robotics Design System. According to The Inq the kits are not yet for sale in the U.S. I can hardly wait for them to go on sale here.

Wednesday, January 11, 2006

The Coming of NetBeans 5 RC1

There's been chatter on the NetBeans blogs about NB 5 RC1 coming out "real soon now". It should be out this week if I can believe what I read in a blog's comments here. In any event the daily releases stopped January 5th, indicating that something is up. What's more, if you go to the NetBeans download page and look under development builds, you'll find a new entry (for me, anyway) in the release versions dropdown for NetBeans 5.1, with a whole list of daily releases right up to the date of this posting (January 11th).

I've been living pretty solidly with NB5 since beta 2. I think I made the switch from Eclipse to NB without realizing it. I can't pin it on any one reason, but what helped tip me over to NB5 was its superior Emacs key bindings in the editor. Another is that Matisse finally works for me. It hit a high level of stability (especially under the latest builds on SuSE Linux), and I suspect I also learned to really use it well by unlearning some habits acquired using other, equivalent tools. Matisse and I grew towards one another.

I wish Eclipse and its supporters the best of luck. For me it's grown too big and complicated. I now realize the last good, fast version was probably 2.1.3. I think that's the sweet spot NB5 has finally hit: fast, and with useful features. There are definitely a number of areas where NB5 can be polished up (up-to-date documentation on internals, for starters), but hey, it's open source and it's free. I just hope that success doesn't ruin NetBeans in the future the way it seems to have ruined Eclipse right now.

Tuesday, January 10, 2006

Me and my Legway

Two years ago Steve Hassenplug created a two-wheeled self-balancing robot using Lego Mindstorms and a pair of special sensors manufactured by HiTechnic. It was the same Lego Mindstorms kit I had purchased the previous Christmas. Unfortunately, by the time I finally found out about Steve's work nearly a year later, a key component of his Legway, the HiTechnic EOPD sensor, was no longer being made. Jump forward to November 2005, and I find out that HiTechnic is again building and selling the sensors, so I immediately ordered two for myself.

While waiting for the sensors to arrive, I started to look at the other bits needed to build the Legway. Two years is an eternity in computing. Key software and support systems had evolved since the original Legway was produced. Elements of the Legway that had changed and needed to be updated were:
  • I used the Linux environment on my Gateway to build and use the BrickOS software. When I attempted to put the software environment together I noticed that the original toolchain used gcc 2.95, and when I went looking for more up-to-date tools I only found them for Linux. I followed Matthias Ehmann's directions for installing and building the BrickOS build environment under Linux. Let me emphasize: use exactly what is called for in the directions. I tried gcc 3.4.5 with matching binutils, but using gcc 4.0.2 to build the gcc 3.4.5-based cross-compiler generated an ICE (internal compiler error) in 4.0.2. gcc 3.4.3, on the other hand, builds successfully as a cross-compiler under gcc 4.0.2.
  • The Lego Mindstorms IR tower. The tower originally connected to a personal computer via a nine-pin RS-232 serial port. The system I used to program my RCX 1.0 brick was a Gateway notebook without any serial ports. Instead, the Gateway has four USB 2.0 ports (eight when connected to the dock), so I used a Dynex USB-to-RS-232 adapter to communicate with the tower. This worked out very well under Linux, as the device showed up as /dev/ttyUSB0.
  • The original legway.c program from Steve's site was written for an earlier version of BrickOS. One change in particular has to be made for the code to compile and work properly with the current BrickOS: replace reads of the old sys_time global with long sys_time = get_system_up_time();
Everything was built and operated under Linux. Once I downloaded the firmware and then the compiled legway.c program, it 'stood up' and ran like a champ.


Legway lying on its side exposing the HiTechnic sensors. You can see them above each wheel.


Legway up and dancing around the desktop. The sensors are the gray blocks at the bottom with the wires coming out of their tops.


The RS232 tower connected to the Dynex USB adapter. I picked mine up at a local Best Buy.

Wednesday, January 04, 2006

Smoke Testing Kernel 2.6.15

Kernel 2.6.15 became available January 3rd. I downloaded it and did the build and install dance to test it out on my Gateway under SuSE 10. What follows are my initial impressions.
  • Overall quality appears to be very good. It was built using gcc 4.0.2. I use Grub because it's very simple to add a new kernel to /boot, then edit /boot/grub/menu.lst and add a new stanza for the new kernel. What is interesting is that the 'pci=noacpi' boot parameter required by kernel 2.6.13-15.7 (the SuSE kernel) is no longer needed. I know that the parameter is needed for the stock 2.6.13 and 2.6.14 kernels on the Gateway.
  • In spite of what I've read, you have to build and install both Broadcom drivers (Tigon3 and NetXtremeII). If you only install the NetXtremeII driver then Broadcom networking will not come up (Device Drivers > Network device support > Ethernet (1000 Mbit)).
  • The Intel PRO/Wireless 2200BG support is in the kernel and appears to at least start up. The problem is that the driver has been compiled with version 19 of the wireless extensions, while the wireless tools that ship and install with SuSE 10 only support up to version 18. This means that if I want to turn on wireless at boot via the Gnome applet, or to successfully manipulate the wireless interface with iwconfig, then I'll need to find and upgrade to the current set of tools. Also, a note to the first-time kernel configurer: you have to enable Networking > Generic IEEE 802.11 Networking Stack before you will see an entry for the Intel chip under Device Drivers > Network device support > Wireless LAN.
  • I use a Logitech 4000 Pro webcam. The driver that comes with the kernel does not work, so I always have to go and download the Philips USB Webcam driver from Saillard.org. I've done it for kernels 2.6.13 and 2.6.14, and the same holds for 2.6.15.
  • ATI 3D hardware acceleration is broken. An attempt to use my current ATI driver with this kernel results in an 'Unknown symbol verify_area' message when I try to load the module. This means downloading and installing the latest ATI driver, which I believe corrects the problem.
So far everything else works fine. The problems I've touched on are more interesting than annoying, and are to be expected when you go and tinker with the kernel. More importantly, they give me a heads-up for when I will most likely upgrade from SuSE 10 to SuSE 10.1.

Monday, January 02, 2006

Running SuSE 10 on a Gateway M680XL Notebook

It's been three months since SuSE 10 was released. There have been a number of reviews of this release, generally all positive. The problem with release reviews is that the author has no time to work with the release for an extended period. It's during this "baking" process that good and bad features settle out and you learn what really works and what doesn't.

My adventures with Linux began a long time ago and have continued (on and off) for the last 13 years. My most current Linux installation is on a Gateway M680XL notebook with the following features:
  • Intel Pentium M 780 (2.13GHz)
  • 1GB DRAM
  • 100GB 5400 RPM HD
  • ATI Mobility X700 128MB video driving a 17" 1680 x 1050 LCD (WXGA TFT)
  • Broadcom eXtreme Gigabit Ethernet and Intel PRO/Wireless 2200BG
  • USB 2.0, IEEE 1394
  • Windows XP SP2
I received the notebook on July 5th 2005 with Windows XP SP2 installed. As delivered, 'Everything Just Worked'. New hardware was properly detected and drivers (if needed) properly installed. I did as much software development and engineering as possible under Windows XP until I needed to move over to Linux. I started the long process of installing Linux on the Gateway in mid-August 2005.

When I got ready to install Linux on the notebook I violated one of my own cardinal rules with regard to Linux: boot the target machine with a live file system (DVD or CD-ROM) of the distribution you intend to install. I selected Fedora Core 4 because I had it running on a desktop machine at home and felt I knew enough about the distribution to successfully install it on the notebook. I thought I could get away with not booting a live version of FC4 to check for any peculiarities. Boy, was that the wrong assumption.

Before installing Linux on the Gateway, I purchased Partition Magic 8 and used it to re-partition the drive. I broke the drive's 100GB of space into 64GB for Windows and 30GB for Linux. In the illustration below, partition hda2 is the Gateway recovery partition (FAT32), hda1 is Windows XP (NTFS), hda3 is the Linux root (/) partition, hda5 is swap, and hda6 is mounted on /home. All the Linux partitions are ext3.



Note the physical location of hda2 with respect to hda1. When I installed FC4, it installed Grub pointing to the first physical partition, not the first logical partition. And I didn't catch it. Because Grub booted into hda2 after the initial FC4 installation, and because I didn't pay attention, I believed I had corrupted the Windows partition and that the Gateway was booting into recovery mode because of it. It wasn't until after I elected to recover and reboot that I found Grub was 'stuck' pointing at hda2. After booting into FC4 and fixing Grub's boot menu, I was able to recover from the Windows mess I had made. But I never actually corrupted the Windows partition.

There were other problems with FC4 besides its initial configuration of Grub. Its kernel failed to fully enumerate and operate all the USB devices on the notebook. I have two Logitech USB wireless mice that work just fine under Windows but were not recognized under FC4. Even a wired USB mouse failed detection. The Gateway's built-in mouse pad was barely working. Neither the wired nor the wireless network device worked. The screen resolution was only 800 x 600. After working for two days trying to resolve the issues via Google on another machine, I gave up and replaced FC4 with SuSE 9.3 Professional.

FC4 was not the only Linux distribution I had running. I have a second system that dual boots between SuSE and Windows XP. I'd installed SuSE on the WinXP machine because I wanted to become as familiar with it as I was with FC, and because SuSE was a 'pure' KDE distribution as compared to FC's 'pure' Gnome UI.

With the deadline clock ticking I wasted no time and installed SuSE 9.3 over FC4. This time I made sure Grub was properly configured by the installer; it was. After the initial installation SuSE found the Broadcom wired Ethernet connection but still didn't set up the Intel 2200BG wireless hardware. The screen resolution went up to 1024 x 768. Frustratingly, SuSE 9.3 still had problems with the USB subsystem. I kept trying to resolve the SuSE 9.3 problems for the rest of the week. Because SuSE did a better job on the notebook than FC4, I took a chance on downloading and installing SuSE 10 Beta 3 over 9.3.

SuSE 10 Beta 3 correctly detected and enabled all hardware with the exception of the display. Maximum resolution was still 1024 x 768. To fix that problem I downloaded and installed the ATI RPM. I then ran fglrxconfig, rebooted the system, and came up with the proper 2D resolution (1680 x 1050) and 3D hardware acceleration. I used Beta 3 until the final release of SuSE 10, at which time I downloaded the DVD ISO and installed SuSE 10 Eval. I did this while waiting for Novell to release the boxed set of SuSE 10. I've purchased SuSE boxed sets since 7.3 (I also purchased Red Hat boxed sets, then Red Hat yearly subscriptions, until they dropped the retail desktop version in 2003).

One nasty problem showed up with the final release. For whatever reason, the initial boot of SuSE 10 Eval turns off the DVD drive and causes the rest of the installation to fail because it can't read from the DVD. I fixed that problem by adding 'pci=noacpi' to the kernel boot parameters. Booting was not a problem with Beta 3. Once the boot problem was fixed the long trek to install SuSE 10 (or any decent Linux distribution for that matter) on the Gateway came to a reasonable close.

Sample Screen Shots
Over the past few months I've tuned both Gnome and KDE desktops for my use. I use Gnome for Java 6 and NetBeans 5, because the latest development version of Java 6 will only provide good sub-pixel anti-aliased text under Gnome. Even Firefox looks better (its tabs). What follows are a Gnome screenshot and a KDE screenshot.





Issues with Linux on the Gateway
  • Video Support. In spite of using the latest Xorg (or perhaps because of it), support for current ATI notebook video chipsets is poor. I have tried to boot other Linux live CDs, most notably Ubuntu (5.10 and 6.x) and Knoppix (4.0.2). They either fail to boot into X (Ubuntu) or come up at a degraded screen resolution (Knoppix). I've filed several Bugzilla reports (one with Ubuntu) and the support has been barely useful. The best advice I can give is to find a distribution that boots X in degraded mode and then install the ATI driver for full hardware support.
  • ATI Driver Support. The SuSE 10 kernel has been updated a number of times since I installed it, and on at least two occasions the ATI driver 'broke' because an internal kernel API changed. Thankfully I was able to download a newer ATI driver and upgrade, but it raises the question: why? Because of these experiences, whenever I read an article like this where Linus tells Jeff Merkey 'cry me a river', I have to resist the almost overpowering urge to wipe Linux off the system and stick with Windows. Jeff certainly can't wrap himself in glory over his many lame escapades, but he does raise an important point about changing the kernel's internal API. Don't. Bug fixes inside a given kernel release are A Good Thing. Architectural changes that break third-party drivers on a regular basis Are Not.
  • Wireless Hardware Support. It's poorly implemented. Under SuSE 10 it is provided by SuSE with Intel software bits. When SuSE boots I get to read (via dmesg) kernel tainted messages when the wireless module is loaded. It's true that the latest kernel (2.6.15-rc7) has it built-in, but that version of the driver does not work with the current wireless tools as provided in SuSE 10. Yes, I built and booted and tested 2.6.15-rc7.
  • Wireless Support in General. Under Windows you can keep profiles for multiple wireless access points. That allows you to travel around a building, a city, or the country and simply pick up and use a new access point without losing track of the others you've visited in the past. You don't have that with SuSE 10, and I have yet to see it handled as easily by any other current Linux distribution.
  • Poor USB Mouse Support. Yes, all my USB devices are found if they're plugged in at least once. But I have a problem with USB mice. For example, if I am using my system under SuSE in my office and then undock my notebook to take it into a conference room without turning it off, I lose the ability to plug in the portable Logitech wireless mouse. I don't use the built-in mouse pad; in fact I have it disabled under both Windows and Linux. My hands are big, and my fat thumbs cause tremendous problems when I'm typing because I'm touching the mouse pad. But if I disconnect a USB mouse in any fashion and plug it back in, it no longer works. I have to log out and back in again to get it back (and sometimes I have to restart X).
  • Poor Gnome Flexibility and Quality. I often wind up under Gnome when I want a better user experience with Java 6 and NetBeans 5, or Eclipse, or even Firefox. I use Gnome because I have to, not because I want to. I find Gnome difficult to configure, and the default themes leave a lot to be desired (KDE has an equally poor default theme, but at least I can quickly tweak it into something reasonable). What amazes me is that I can't change the colors or sizes of the window borders. I wasted a fair amount of time downloading themes from GNOME-Look to find one that looked and worked halfway decently. Gnome also has startup problems: the occasional spate of panel applet failures when first logging in. And then there are a number of broken Gnome applications, such as Totem, which can't play anything on my notebook. I mentioned earlier that Firefox looks better under Gnome; unfortunately it does not operate better. With Firefox 1.0 and 1.5 I have extensive link menus beneath the toolbar, and under Gnome those menus fail to operate properly after about half a day of work.
End Comments
I have yet to give any Linux distribution my unconditional recommendation for desktop use. I don't hand one to anyone unless I rattle off a host of caveats and give them an armful of live CD/DVD disks to try out on their hardware. Sometimes they get lucky and everything Just Works. Sometimes, like me, they get bit.

My SuSE 10 installation is currently stable enough that I can get real work done on a daily basis. However, my Linux experience is not unique. It's Linux experiences like mine with contemporary hardware that have kept Linux off the desktop for many would-be users. The science and art of operating systems has reached a point where it just has to work out of the box. There can be no more excuses, no more apologists for the shortcomings. Being free (as in speech) and open isn't good enough any more, especially if Linux desktop vendors want to sell to more high-end users such as myself.

Sunday, January 01, 2006

The New Year Arrives

New Year's came quietly in my small part of the world. Yes, Universal and Disney had their 10 minutes of fireworks at midnight and I watched some of it, but for the most part I read or worked on an endless list of errands around the house. Even my two teenage daughters decided to stay home and watch the partying on TV before hitting the sack. I slept in this morning until around 9, then got up and did more putzing around the house. The Labs were lazing about more than usual (that's Max on the right, airing out his package).

The gaps between postings are growing greater and greater. It's hard to write well, and it's even harder to write something, anything, that is unique and worth somebody's time. The problem is my lack of professional talent and experience, as well as the continuing onslaught of literally everybody and their relatives blogging away.

I'm going to limit my posting to technical issues. I've been living for five months now with SuSE Linux 10 on my notebook. I've been keeping up (more or less) with NetBeans, and it continues to improve. There's been an interesting update to the visual development tool for Eclipse that I need to look at, and see how it compares with Matisse.

Oh well. If I write, will anybody even notice?