Thursday, May 31, 2007

Trolltech's Qt 4.3.0 really begins to dazzle

It's not hard to impress me with new visual software. I love eye candy, the flashier the better. So it should come as no surprise that I'm awfully impressed with the latest version of the Qt framework, 4.3.0. I installed it on three systems for a quick and dirty evaluation: algol (my XP Core Duo notebook), europa (my OpenSuse 10.2 Athlon XP system), and my daughter's Toshiba Satellite A135 notebook running Vista Home Premium. I've got screen shots from XP and Suse, but I didn't bother to do a Vista capture. I'll explain why later.

Installing Qt Windows Open Source Edition is dead simple. Download and click on the installation binary. It will install the complete Qt set of tools and examples, and if you don't have it installed already, it will also install a copy of MinGW to compile Qt applications with. Note that the Windows version of the Qt framework does not have to be compiled; everything is pre-built and ready to use. The screen shot below shows the QtDemo application with three of the demo applications launched and arranged around it.
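If you've never touched Qt before, here's a minimal sketch of my own (not one of the bundled demos) of the kind of program MinGW will happily build once the installer finishes:

    // hello.cpp -- a minimal Qt 4 program (my own sketch, not one of the bundled demos)
    #include <QApplication>
    #include <QPushButton>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);          // set up the GUI event loop
        QPushButton button("Hello, Qt 4.3");   // a single top-level widget
        button.resize(200, 60);
        button.show();
        return app.exec();                     // run until the window is closed
    }

From the Qt command prompt the installer sets up, running qmake -project, then qmake, then mingw32-make should be all it takes to turn that into a running Windows executable.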

One really interesting new feature is the integration of JavaScript into the framework, via the new QtScript module. You can see the JavaScript demo running in the very small window at the bottom of the screen. It's displaying an analog clock. That clock is set up in JavaScript and rendered using calls to Qt's underlying graphics primitives. Very slick.
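For the curious, the mechanism is roughly this: you hand a QObject to a QScriptEngine and the script drives it. Here's a rough sketch of my own (not the demo's actual source) showing the idea:

    // scripted.cpp -- my own rough sketch of the QtScript mechanism, not the demo's source
    // (the project file needs QT += script)
    #include <QApplication>
    #include <QPushButton>
    #include <QtScript/QScriptEngine>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);
        QPushButton button("Scripted button");

        QScriptEngine engine;
        // Expose the C++ widget to scripts under the name "button"
        engine.globalObject().setProperty("button", engine.newQObject(&button));
        // Script code can now read and write the widget's properties and call its slots
        engine.evaluate("button.text = 'Set from a script'; button.show();");

        return app.exec();
    }

The demo's clock obviously goes a lot further, repainting itself with Qt's graphics calls every second, but as far as I can tell the bridge between the two worlds works the same way.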

Installing Qt X11 Open Source Edition and getting it to work takes a little more effort. Qt is the foundation for KDE, so it's usually already installed (minus the examples and demos). To get the absolute latest you download the sources, then run configure, make, and make install. No big deal, but a full build of this framework takes several hours on europa. Once built and installed it's available for demonstrations and further development.

When I built Qt 4.3.0 I decided to build it with the latest version of gcc, 4.2.0. I figured I'd test the latest gcc's C++ compilation capabilities with a reasonably non-trivial set of source code. I did that, the build succeeded, and then QtDemo failed to execute, dying with a signal 11. You can see where it segfaulted (via gdb) in the window below.

Well, I knew I was pushing my luck by compiling the latest Qt with the latest gcc. So I simply changed my environment to use the older gcc (the default 4.1.2 for Suse 10.2) and moved the binaries and installation built with gcc 4.2.0 out of the way, as it were. Then I uncompressed Qt again and rebuilt everything with gcc 4.1.2. Sure enough, QtDemo ran just fine (as you can see below). I don't know whether I should file a bug report with Trolltech or with the gcc developers, but right now it looks like Qt 4.3.0 has problems when compiled with gcc 4.2.0. I'd have to do a fair amount of debugging to tell whether there really is a compiler problem, or whether it's a linking issue between binaries built with gcc 4.2.0 and the rest of the Suse distribution's libraries, which were built with gcc 4.1.2.

Once I got Qt rebuilt with 4.1.2 and installed, I fired up QtDemo and started four of its demo applications. Three of them match the demo apps I started on Windows. The fourth was a little BitTorrent client in the upper left corner of the screen. I decided to download the latest Fedora release with it, the Fedora 7 Live CD. And it worked. Yes, it's very primitive compared to Azureus and KTorrent, but the fact that it was included demonstrates the depth and breadth of the Qt demonstration applications as well as the power of the framework.

I should also note that all the full-screen screen shots were made with the Screenshot demo app (Desktop | Screenshot).
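As far as I can tell, the heart of that demo is Qt's ability to grab the root window into a QPixmap. A stripped-down sketch of the same idea (my code, not the demo's) looks something like this:

    // grab.cpp -- a stripped-down sketch of the screen-grab idea, not the demo's code
    #include <QApplication>
    #include <QDesktopWidget>
    #include <QPixmap>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);
        // Grab the window covering the entire desktop...
        QPixmap shot = QPixmap::grabWindow(QApplication::desktop()->winId());
        // ...and write it out as a PNG file
        shot.save("screenshot.png", "PNG");
        return 0;
    }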

Observations

The best 'experience' was under Windows XP. Every application was pre-built and every demo ran, especially the OpenGL demos. What's more, the bouncing green balls only appeared on the QtDemo main page when it ran under Windows XP. That effect was missing on both the Suse and Vista installations.

Under OpenSuse 10.2, everything built except the OpenGL demos. I'm still trying to find out why; the OpenGL demos have built under every other Qt release, such as 4.2.3. Other than OpenGL, everything built and runs just fine. What was nice is that it picked up my OpenSuse KDE theme and used it where appropriate. Everything ran as smoothly under OpenSuse as it did under Windows XP. That's interesting to me, as OpenSuse runs on a single-core Athlon XP (32-bit), while my Windows XP system is powered by a Core Duo. It will take more sophisticated applications to begin to see differences between the two platforms.

Of the three, the Vista installation was the least satisfying. None of the OpenGL demos worked. I'm none too impressed with the Vista theme. It looks too much like a bad Beryl theme. That's why I didn't bother to grab any screen shots from Vista. It would have been pointless.

I expect this version of Qt to really shine on Linux, Windows and Mac OS X. If you haven't stepped up from Qt 3 to Qt 4, then this is the version to seriously consider switching to. And if you've got an older version of Qt 4, then this is most definitely the version to update to. I'll have more to write about this release of Qt in the near future.

An Update to "I Am Not A Lawyer"

I wrote an opinion piece in which I flamed Thomas R. Nicely's assertion that Microsoft, through nefarious means, was forcing everyone to use nothing but Microsoft development tools under Vista. Nicely was using an obsolete version of gcc and djgpp to build applications, and he had found a bug in that combination that kept him from allocating more than 32MiB under Vista. His sample code did indeed fail under Vista (and Windows Server 2003, as I discovered) when built with his tool chain. Lewis Metler (lamlaw) picked it up and used it to bolster his case against Microsoft, wondering if this was yet another attempt by Microsoft to further their monopolistic lock. Metler has done a lot of good writing about Microsoft's abuses, showcasing excellent examples, but Nicely's page is not one of them.

At the time of my original post I stated that Nicely's problems were operator error and foot-dragging on his part. In his original post, Nicely stated he was using gcc 3.0.2 and djgpp 2.03. I pointed out that one of those tools (if not both) was long in the tooth and contributed to his problems.

And so it stayed until I got a comment on the entry a few days ago, where the commenter noted that Nicely stated he was using gcc 4.12 (instead of 4.1.2; call me anal, but version numbers should be properly stated). I went back, and sure enough, his original page had been updated and more information added. He also added points to the end of his page lamenting how people felt he was at fault. It was at this point I decided to get to the bottom of his problem, at least to my satisfaction. And so I grabbed and installed Trolltech's Qt Windows Open Source 4.2.3 on Windows Server 2003 and Vista Home Premium.

I installed Windows Server 2003 myself, the 32-bit version, on an Athlon 64-based Boxx at work. Vista Home Premium came installed on my daughter's new Toshiba notebook, a Satellite A135. I installed Qt 4.2.3 on both, and in the process installed MinGW. MinGW comes with gcc 3.4.2 and a minimal set of tools and libraries for building C and C++ applications on Windows. I use it in conjunction with Qt to build open GUI applications that are portable between Linux and Windows. I also used it to successfully build Nicely's sample C code. The results of the compile and run follow.

I ran the test first on Windows Server 2003. I unzipped the sources from Nicely's website and ran his pre-compiled binary. Sure enough, it failed as he indicated. I then recompiled his example code and re-ran the test with my binary (vista2.exe). It succeeded, as you will note below.


I then borrowed my daughter's notebook long enough to install the same set of tools and run the same tests again. As before, Nicely's binary failed, and the binary I built ran successfully.


I should note that, even though I did not run a directory listing, the size of the binary I built was the same on both Windows Server 2003 and Vista (as it should be).
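If you'd like to reproduce the check without downloading Nicely's archive, here is a hedged sketch of the same kind of test (my own code, not his): ask malloc() for progressively larger blocks and report where it gives up. Built with MinGW's gcc, it should sail well past the 32MiB mark on both Windows Server 2003 and Vista, just as my build of Nicely's code did.

    // allocprobe.cpp -- my own sketch of the kind of test Nicely's page describes,
    // not his actual source: request progressively larger blocks from malloc()
    // and report the first request that fails.
    #include <cstdio>
    #include <cstdlib>

    int main()
    {
        for (unsigned long mib = 16; mib <= 1024; mib *= 2) {
            void *block = std::malloc(mib * 1024UL * 1024UL);
            std::printf("%4lu MiB: %s\n", mib, block ? "allocated" : "FAILED");
            std::free(block);   // freeing a null pointer is a harmless no-op
            if (!block)
                return 1;
        }
        return 0;
    }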

This just goes to reinforce my opinion that (1) Nicely is wrong in his assumption and (2) he needs to use a better set of tools than what he has. I have used both Cygwin and MinGW on Windows, and for the cross-compilation and utility writing I do, they've never given me trouble. I consider Nicely's problem solved (to my satisfaction). I think Nicely needs to upgrade his open source development environment and go find something else to complain about.

Tuesday, May 29, 2007

NetBeans Notes from the Field

I got my very own copy of Rich Client Programming at a local Barnes and Noble bookstore today. I purchased it for two reasons: to learn more about NetBeans' internals, especially with regard to reusing it as a foundation (or platform) for my own applications, and to help fund the effort, hopefully making it successful enough to produce more books like this in the future. Many will look at dead-tree documentation as a complete waste. But the effort to gather and organize information for publication produces the most focused collection of accurate information you're going to read anywhere.

Over the years I've discovered that a bad book is far better than a 'good' collection of on-line documentation, especially if it's a computer book. There are thousands of years of refinement in how books are made, and the books we take for granted trace directly back to the fifteenth century and Gutenberg's invention of movable type. I can sit and read a book far more easily than any computer or 'e-book'. A dead-tree book doesn't need any power (other than me turning pages), it can be read in just about any light and at just about any angle (except from the back), and just about any little slip of paper can be used as a bookmark. Even writing in the margins to make corrections or jot down thoughts is dead easy.

NetBeans 6 Regression

I tried to load and use NetBeans 6 Milestone 9 on rhea, the system with Ubuntu 7.04 and Beryl enabled. Unfortunately this version of NetBeans 6 will not work with Beryl enabled; the windows come up white, with no controls displayed. If I disable Beryl, then NetBeans 6 behaves correctly. Since rhea is more for cross-checking and testing, not development, it's not that big a deal. But it is something to keep an eye on, and yet another reason not to enable Beryl on europa.

Another World Wind Java Post

Geertjan Wielenga, one of the three authors of "Rich Client Programming", has posted another entry about using the Java version of World Wind in NetBeans. This time he's gotten it integrated into the NetBeans Platform. I like what's happening because it shows what can be accomplished when two very different open projects, both designed for reuse, are "mashed up" into something new and interesting. Geertjan and his friends over at Planet NetBeans are a great source of information, support, and ideas.

Monday, May 28, 2007

Kicking the tires on Nexenta

I can never leave anything alone, especially if it's an OS I've not played with before. I ordered a free copy of Sun's Open Solaris Starter Kit in mid-May, and took some time this evening to boot its three included distributions: Nexenta, Belenix, and Schillix. There are better reviews of these distributions if you google for them (especially Nexenta), so there's nothing special or new I can disclose. I can, however, provide an additional perspective you may find useful.

The three distributions come on a single DVD, and when the DVD boots you're presented with a GRUB screen that lets you pick among the three distributions, as well as variations within each. For example, there are six different ways to boot Nexenta; I chose the 32-bit version. I tried to boot all three, and stuck with Nexenta for further review because it behaved the best of the three.

The system Nexenta booted on was europa. This is the machine that runs Open Suse 10.2. It took quite some time for Nexenta to completely boot on this machine, but I attribute that long boot time to its alpha code state. There were also several crashes of the window manager, but after the second login it came up and behaved quite stably.

As the screenshot above shows, Nexenta's desktop environment is Gnome; the version it currently ships with is 2.14.1. The desktop layout and the selection of items on the menus are nearly identical to Ubuntu's. Because of that close similarity to Ubuntu, I had no trouble locating applications (such as this browser) and performing some simple tasks (such as writing up this entry).

Here's a short list of what I discovered.
  • Video: As you can see, it found and used my ATI 9700 Pro at 1792 x 1344 at 75Hz. This is the best resolution and refresh rate out-of-the-box I've experienced with any live boot distribution. Ubuntu, for example, chooses a higher resolution at 60Hz, which drives my eyeballs up a wall.
  • Network: It found the Intel gigabit card on this system, then found and used the nVidia nForce 2 built-in ethernet. That was amazing in and of itself. My past experiences with older versions of Solaris x86, especially with networking, were painful at best. Nexenta Just Worked.
  • Sound: I tried to play some audio but it didn't work. It could be missing driver support, or some piece of the Ubuntu-derived userland that wasn't installed. No big deal.
  • USB: I inserted my thumb drive, hoping to use it to capture some screen shots, but Nexenta would not automatically mount it. I looked for it under /dev, but I couldn't tell whether the Solaris-based kernel had found the USB stick as a device. Again, no big deal. I had enough of a RAM disk to capture and upload at least the screen capture you see above.
I'm intrigued with Open Solaris, and especially with Nexenta. I really like what I see and what I've touched so far, bugs notwithstanding. With all the controversy surrounding the Novell/Microsoft deal, I'm looking further afield now: beyond Linux, beyond the BSDs, even beyond OS X. I'm also intrigued by Ian Murdock's hiring by Sun, and I look forward to his contributions to these distributions. I am a big Sun fan, first through Solaris and SPARC, then later through Java. I look forward to the next six months with Nexenta, Open Solaris, and Sun. Who knows, I may be running Nexenta in 2008. I could sure do a lot worse. I could run Vista.

Beryl Benchmark with a Gigabyte 7600 GS

There's an interesting article on Phoronix titled "GPUs & Beryl: What is Needed?" The article covers the video card and driver combinations they've tested with Beryl, and it's quite the read. I wish it'd been available when I was working with Beryl and Compiz under Ubuntu 7.04. One feature brought out in the article was the Beryl Benchmark indicator, which the authors describe as "not an incredibly accurate benchmark." Nevertheless, it can provide a reasonable figure of merit when comparing different hardware platforms and combinations running the same software. And I'm also posting this because they didn't test with any 7-series video cards.


One minor note: to enable the benchmarking capability, bring up the Beryl Settings Manager, go to the Extras tab at the top, and enable the Benchmark plugin on the left side. Exit the Beryl Settings Manager. You then display the benchmark widget on the desktop with the key combination defined in the Beryl Settings Manager. On my system the combination included the Super key, which is the Windows key on my keyboard, and probably on yours as well.

As you can see above, my combination of hardware and software gives a reading of 75 frames/second. I found this rate to be consistent regardless of what applications I started or what activities I performed on the desktop (such as moving a window or starting an application like Google Earth).

My system setup is:
I'll throw in my two cents worth about Beryl and the eye-candy impact on video performance.
  1. Performance on older ATI-based hardware is horrible. The Gigabyte card replaced a 9600 SE video card (64-bit, 128MiB) precisely so I could run Beryl and Compiz at anything above a snail's pace. And my other machine, europa, running a 9700 Pro, works just fine with the ATI drivers as long as I don't attempt to bring Beryl up.
  2. nVidia-based cards are the way to go. After installing the Gigabyte card the Beryl desktop has been silky smooth, without artifacts or performance degradation. Again, I'm using the nVidia proprietary drivers, not the open source drivers.

Saturday, May 26, 2007

They're Here. And They're Nothing to Write Home About.

Remember that iconic scene from the start of Poltergeist, with Carol Anne sitting in front of a TV screen full of static and making that announcement in her high, sweet voice? That odd little feeling of dread? You don't know why, but the warning flags are going up in your mind. That's the way I feel about the official arrival of Dell's Ubuntu systems.

Why should I feel dread? After all, this is what I've been waiting for for quite some time: a big official vendor selling systems with Linux installed. Even my mixed experience with Ubuntu doesn't bias me against recommending the purchase of systems with Ubuntu installed, as long as the distribution has been properly installed and the hardware is fully supported.

Now that I've had a chance to look at Dell's initial offerings, I've come to the conclusion that two major expectations of mine will go unfulfilled: wide hardware support and a rich Linux experience. Dell has chosen to sell three very limited Linux machines: two desktops and a notebook. They've deliberately hobbled the hardware choices you can make within the machines, and just like they promised, they're shipping them with only one distribution installed, Ubuntu.

Compare Dell's offering with Emperor Linux, a company that has advertised within the pages of Linux Journal for a number of years. Emperor sells notebooks from manufacturers such as IBM (Thinkpad), Sony (Vaio), Fujitsu (Lifebook), and Dell. That's right, Dell. And the Dell systems they resell are Latitudes and Inspirons that range in price from $1100 to over $5000. These machines can be configured with the latest Core 2 Duos, up to 4GiB of DRAM, up to 160GiB hard drives, and video that ranges up to 1900 x 1280 resolution driven by nVidia FX video cards. What does Dell offer for video? The low-end, last-generation nVidia 7300 "budget" card. Period. And Emperor will install the distribution of your choice, including Ubuntu.

Dell's deliberately limited Ubuntu Linux offerings show cowardice in the face of Microsoft's displeasure, and they telegraph to the casual shopper that Linux is a very cheap (as in quality) second to Windows in terms of breadth and depth of hardware support, when it most certainly is not. If Emperor can sell you a portable powerhouse built around a Dell notebook with your choice of Linux distribution, then why can't Dell itself?

Anyone who thinks Dell's initial foray into the Linux PC market is a breakthrough is a fool. They're not going to get my business or my recommendation.

Wednesday, May 23, 2007

Atul666

There's a nice little blog called "SCO News Roundup" hosted (like mine) on Google/Blogspot. The author, atul666, does a right fine job running the place. His latest post (5/22) is about a new SCO 'opinion piece' disguised as fact (as so many of SCO's latest submissions seem to be) about what Novell really intended to do when it signed the APA deal with Santa Cruz Operation (i.e. OldSCO). It's a hoot. The author of this legal fan fiction, G. Gervaise Davis III, seems to have a long and tangled history with earlier incarnations of SCO stretching back to CP/M and DR DOS. Head on over and read atul666's blog. You'll be glad you did.

Lies, damn lies, and statistics

There are three kinds of lies: lies, damned lies, and statistics.
(Mark Twain, attributing it to Benjamin Disraeli)
I don't normally head over to Groklaw, but I did recently based on some other links, and I came across an article about a Microsoft-funded study on how developers don't want the GPL, version 3, to "police patents." If you dig a little deeper, the following facts about the study are exposed:
  1. It was conducted by email.
  2. 354 emails were sent out between February 28 and April 4, 2007.
  3. 332 reached their destination.
  4. 34, or slightly more than 10%, responded.
Now, PJ chose a particularly inflammatory (some might even go so far as to say trolling) title: "Only 11% of OS Targeted Programmers Willing to Help MS-Funded Study." And that's because Microsoft did indeed fund yet another study seeking to bolster their position with regard to the GPLv3. I don't see it quite that way. I see that only about 10% of the targeted group (34 of the 332 delivered, according to my simple math skills) bothered to respond, frankly for reasons unknown. And those 34 bear the heavy responsibility of representing the rest of us who either refused to respond or were never even contacted. Great responsibility indeed.

Do I feel pity that Microsoft's efforts were misconstrued yet again? Hell, no. Microsoft has a long history of lying, cheating, and outright theft in the accomplishment of its singular goal of total domination in every market it wants to play in. Microsoft went looking yet again to buy more support for its position with regard to the GPLv3. And Microsoft got sloppy in deciding to run with this particular study. Every Microsoft 'study' has looked pretty dodgy to my eyes, but this one seems the worst by far. This particular study is based on responses from only 10% of its sample group, and I'm assuming, based on my ancient statistics classes from some 30 years ago, that the full 332 (not 34) is the minimum sample size the study needed to have any relevance. So if you only got a 10% response from your relevant group, how can you really draw any conclusions?

The one conclusion that I can personally draw is that those who did respond are the ones who have an issue with GPLv3. Which means I can interpret the statistics to mean that, out of 332 potential respondents, 298 (90%) did not have a problem with GPLv3 and so did not bother to respond. I also wonder whether, when the initial request went out, the potential respondents were told that Microsoft was funding the study. Would more have responded if they had known? I would have, just to make sure my opinion was at least recorded, and I believe many others would have as well. But if it went out as a low-key request to answer some questions, then many would have read it and tossed it as one more thing to shove into an already busy schedule. And so it got dropped on the floor.

The PDF of the paper is here. I'm going to read it and see what the paper's authors really said, and read their own words about how they interpreted the results. I don't trust the eWeek article cited by Groklaw, and I've learned not to trust PJ's interpretation either. Just like I've learned to question just about everything Microsoft says.

Tuesday, May 22, 2007

Desktop Linux for general use will never succeed

There's a post on OSNews titled "Five Things the Linux Community Doesn't Get About Joe User", which is itself a link to another blog post "Five crucial things the Linux community doesn’t understand about the average computer user". I find it illuminating to read the comments from the OSNews link first, then go and read the original post.

Most of the comments divide into two camps: the "spot on" camp and the "we don't want no stinkin' Windows users" camp. The comments from the camp that doesn't want the unwashed Windows users are poorly written; I've discovered over time that the really vociferous Linux defenders are functionally illiterate and can barely communicate a defense (logical or otherwise) of their One True Love.

I'm now watching Dell getting ready to ship some of their machines with Ubuntu 7.04. The Dell blog entry has two bullet points at the tail end that are worth repeating here:
Software and Hardware Not Offered
  • For hardware options not offered with this release, we are working with the vendors of those devices to improve the maturity and stability of their associated Linux drivers. While this may not happen overnight, we do expect to have a broader range of hardware support with Linux over time.
  • At this time, we are not including any support for proprietary audio or video codecs that are not already distributed with Ubuntu 7.04. These include MPEG 1/2/3/4, WMA, WMV, DVD, Quicktime, etc. We are evaluating options for providing this support in the future.
These last two points from Dell underscore why Linux for the general user has been and continues to be a failure. Linux has never Just Worked after the install. A package manager has to be invoked, the bits have to be downloaded and installed, and after being warned about possible DMCA violations, you might (and I stress might) have the ability to play back audio and video content. And that's if you're lucky enough to download the correct bits. Make the mistake of getting the wrong ones, or not enough of the right ones, and you go through the same process again. Contrast that with Windows and Apple: people don't have to do that. They expect that when they buy the machine it's already set up to view and listen to digital content, and they're not going to put up with anything less. Couple that with the sour attitude from too many elite Linux users and you've got an environment ripe for continued failure.

The general user doesn't care about the OS any more. What they care about is what they can do with the hardware. The general user wants visual capabilities such as DVD playback, video streaming, and still-image viewing and manipulation from digital cameras. The audio portion has pretty much been swallowed up by portable MP3 players that don't run Linux. They want appliances that just work, not free software politics that ironically limit their choices in the name of free and open source.

Linux is going to remain a very small niche on the PC. Dell will make the grand effort to sell Ubuntu (and Suse on servers, which may turn out to be more successful), then drop it within the year due to very poor sales and high support costs. Then we'll have the vocal Linux zealots blogging and posting on forums about how Dell caved in and wasn't really serious about selling Linux, or how Dell chose the wrong distribution, or some other reason that doesn't focus on the real fact that Linux, as it's currently being designed and written, is inappropriate for the general desktop. It always has been, and it always will be.

Monday, May 14, 2007

Playing with Nasa's Java WorldWind on Suse

Geertjan has a really good first article on using Nasa's latest Java WorldWind client software wrapped up in a NetBeans project. Note that the client is a preview, but it's still quite usable to get started with. Geertjan looks to have used NetBeans 5.5. My project uses the following tools and environment:
  • NetBeans 6 Milestone 9
  • Java 1.6.0 Update 1
  • Suse 10.2
  • Latest ATI drivers (8.35.5 or later)
When I created my project I elected to start with the Java Desktop Application (New Project > General > Java Desktop Application). Most of what Geertjan specifies is still correct under NB6M9, but there are some subtle differences which are easy to figure out. There is one (minor) issue to be aware of: when you want to drag and drop the WorldWindGLCanvas bean onto the DemoFrame, you need to delete the central canvas that's already there. Once it's deleted, you drag the WorldWind bean into its place.


This looks to be a really interesting tutorial series.

Sunday, May 13, 2007

Suse is back

Well, that lasted what, a whole two weeks? I went completely over to Ubuntu 7.04 on europa and discovered that there are still enough rough edges and sharp corners that I missed Suse's capabilities and stability.

The final straw for me was DVD playback. I have no idea what happened, but both Totem and VLC started to refuse to play certain DVDs correctly. For example, Totem would not play the menu in "Madagascar" but would instead immediately play the first title it found. VLC wouldn't even see the disc. And then there was K3B having problems ripping DVDs. I never had any of those problems under Suse 10.2, and certainly not when Ubuntu was first installed. So out came the openSUSE 10.2 installation DVD and into the machine it went. In less than an hour I had openSUSE 10.2 re-installed. I'm really disappointed in VLC under Ubuntu, but I despise Totem under any Gnome-based distribution. Thank goodness for Kaffeine (and a properly working K3B).

So what about all that bullshit I posted earlier? Fine words indeed when you think you've got your back covered and you can continue doing what you did (and expected) before you switched. Unfortunately I couldn't with Ubuntu. And I wasn't in the mood to learn the finer nuances of Ubuntu and uncover what was causing my problems, especially when I already knew how to make it work under openSUSE. Lots of skills transfer between distributions, and lots of skills don't. It's the ones that don't that cause you grief.

Do I still feel 'betrayed' by Novell? Maybe betrayed is too bombastic a word, but I certainly don't feel comfortable using Novell. I've got a constant eye on Novell and its behavior as well as the advances of other distributions. I have no great love for current Novell and I will certainly try this again when the next releases of Fedora and Ubuntu hit the streets. Microsoft is too close to Novell and I don't trust Microsoft.

I did make some changes when I re-installed openSUSE 10.2:
  1. I did not install Gnome side-by-side with KDE. I installed KDE by itself.
  2. I did not install Beagle. I went through and removed everything Beagle-related. The desktop and system runs a lot smoother and faster.
  3. I installed the latest ATI drivers for the 9700 Pro right after the full install.
  4. I went here and set up all of my repositories (or the ones that really mattered), and stayed away from The Jem Report's "Hacking Suse 10.2". I'm holding my breath as I write this, but my repository issues have not come back. Updates have been reasonably fast and easy.
  5. I did not install Beryl. I've got enough eye candy on europa.
openSUSE is back and it's working the way I've come to expect. The future will take care of itself.

Monday, May 07, 2007

The AI revolution has been canceled due to lack of reality

(WARNING: Somewhere in the following chaotic post is a really good thesis struggling to be free. It may change drastically as I sort it all out.)

There's an interesting interview with Vernor Vinge on Computerworld about AI and how it will surpass human intelligence after 2020 ("AI will surpass human intelligence after 2020"). I certainly don't have Dr. Vinge's credentials. I have a lowly undergraduate degree in electrical engineering, and I was damn glad to get it and get out into the working world. But I do have nearly 40 years of hands-on experience with IT, and I can assure Dr. Vinge that as long as IT works the way it does, we've got nothing to worry about from AI exceeding our own intelligence.

This isn't to cast aspersions on what's currently out there. Quite the contrary. There's a lot of very good software and systems out there, and some of it looks nearly miraculous, coming close to validating Clarke's Third Law. But the bottom line is that it's still limited, and at some point it requires a human in the middle to operate successfully. And it will continue to do so well past 2020.

The Problem with the Future

I am a child of the '50s, and grew up reading science fiction from the three grand old men of that period: Asimov, Clarke, and Heinlein. Asimov gave me robots and a human galactic empire ("The Foundation Trilogy"), Clarke gave me vistas of the near and far future ("The Sands of Mars", "The City and the Stars"), and Heinlein gave me gritty reality in the near future ("The Moon is a Harsh Mistress", "Starship Troopers"). In all those futures travel around the solar system via rocket was assumed, and it was further assumed to be but mere decades away.

This all culminated with Clarke's "2001", both the book and the movie. I remember seeing "2001" in 1968. A year later Apollo 11 landed on the moon. Vision and fact seemed to be in lockstep. Then reality set in. Apollo continued up to 1972 and Apollo 17, when the program was canceled. As the Apollo program wound down, many of the scientists and engineers who helped put us on the moon were laid off. We had a laughable attempt at an orbiting space station called Skylab. Money was funneled toward the then-new Space Shuttle, which over the years was stripped down in capability into the system we have today. All we wound up doing after landing on the moon in 1969 was spending an inordinate amount of money orbiting the earth. We've forgotten so much that we're struggling to re-discover that lost treasure (in junk yards, even!) just to go back to the moon by 2020, let alone beyond it.

The problem with the future is that when it finally arrives it never looks anything like you originally envisioned. It turns out to be a lot harder and a lot more expensive than you ever anticipated, and it can be a real disillusionment and a powerful cynic generator.

Computers Are Dumb

Computers are fragile. Take away their power and they sit there like big dumb door stops. Remove their network connectivity and they consume inordinate amounts of power while doing very little of real interest. Connect them up and they become targets for computer viruses that want to turn them into botnets for sending spam and launching large-scale DDoS attacks.

The idea that our computers are going to grow ever more sophisticated, ever faster, ever more connected, and then hit some critical threshold and become equal or superior to us follows the same flawed thinking (and wishing) we used to follow with regard to ubiquitous space flight. In fact we'll have affordable space flight long before we have artificial intelligence, given that the reusable and affordable lifters, which require intricate engineering, are only now being built. We're going to need a fundamentally different approach to artificial intelligence before we can even begin to design such machines, and no, quantum computing ain't the answer either.

People Are Dumb

We have a very bad habit of falling in love with our technology, of elevating it to heights it does not deserve. Our current need to elevate our tools appears little different from our desires, thousands of years ago, to create idols from gold and to erect temples in which to place them and worship them. Computers should be amplifiers of human intellect and capability, not replacements for it.

The drive to create an AI seems symptomatic of a much deeper problem with America: we've given up and turned within. It turned out to be very difficult to push out into space, especially the way we wanted to in the 1960's. The solutions for affordable and sustainable rocket flight are a lot harder than we ever realized and our own politics are even more intractable, especially when you're trying to sell manned space flight to people who want to do little more than sit mindlessly in front of a television or twitch mindlessly in front of a video game.

And so, in disappointment, we've turned inward to create a world that will satisfy our need to succeed. We have our computers and the software we write to run on them. And these idiot savants do many clever things, and do them so much faster than we can, and we make the mistake of believing they're better than us because of it. But a faster idiot is still an idiot. Speed kills, especially if you're an idiot.

We may yet create artificial intelligence. Given enough time anything is possible. But to say that we'll have an AI that exceeds our intelligence by 2020, a mere 13 years away, is a fool's prediction at best. And to say it will occur after 2020 is disingenuous; even I can make such a prediction, along with space colonies on other planets and faster-than-light flight to the stars. I just won't say how much beyond 2020.

If we want better intelligence (because in the end that seems to me to be what the search for AI is really after), then I'd suggest we nurture it within ourselves; that we tackle our own limitations, both intellectual and ethical. There's the real challenge and the real payoff.

Sunday, May 06, 2007

No more Suse

Why did I install Ubuntu over Suse? The short answer is Novell's cross-licensing deal with Microsoft. The longer answer is watching Novell's behavior in the marketplace over the last two years with Suse. The much longer answer is that Ubuntu, and to a lesser extent Fedora Core (with RHEL), are what I can use and depend on in the future as an alternative to Microsoft. I don't need or want Suse any more. Novell has ruined Suse for me.

I want a free machine; free to do what I want with it, not just free-as-in-beer. Hardware costs money, and so does good software at times. If it means giving up 'consuming content' because a free OS won't support tomorrow's DRM, then so be it. I have what I want in the form of MP3s I've ripped from my CD collection, and I've started to rip all my favorite movies, especially the ones I find in the Walmart $5 bin. If it ever reaches the point where it's all out on HD DVD or Blu-ray and I can't watch it on a free-as-in-speech OS, then I hope I'll have enough older tech squirreled away that I can continue to view and enjoy all of the content I've saved up over the years.

I've purchased every version of Suse I've ever installed, starting with version 7.3 and all the way up to 10.2. And I mean every single release. I've always spoken highly of Suse based on my own personal experiences. But I won't spend another dime, and I will never again recommend Suse, either informally to my friends or formally where I work. I feel betrayed by Novell, and I will not reward that betrayal.

Spider-Man 3

So I waited until today (Sunday) to head over to the theater, and I hit the 9:45 A.M. showing (yes, that early in the morning) to be one of about a dozen movie-goers seeing the latest Spider-Man. When I got to the theater I was literally the only one in line that early. The lone guy selling tickets told me that the big crowds had hit Friday ("It was a mad house"). He said it was this empty on Saturday as well as this morning. I don't think that bodes well for continuing ticket sales. I think sales are going to crash pretty hard by next weekend, with or without another blockbuster release.

So I got in and picked one of the best locations in the theater, with lots of elbow room. I looked around and counted 11 other people scattered around an area capable of seating 400. I pulled out my cell phone and played Snakes while the ads and previews played, then put it away to watch the movie. I hate wasting 15-20 minutes sitting through that crap.

I have to say that the movie was quite frenetic, what with Spider-Man fighting Harry (The New Goblin), Venom, and Sandman, with a grand finale at the end pitting Venom and Sandman against Spidey and Goblin, with Mary Jane caught in the middle. The themes throughout the movie were about darker sides and what really motivates us to do what we do. And Pete finally forgives the man who killed his Uncle Ben and in the process brings that part of his life to a close.

Best part? Watching Bruce Campbell play a French maître d' (with a flawless accent) at a swanky New York restaurant. This marks Bruce's third appearance in the Spider-Man movie franchise where he's played a pivotal role. If it weren't for Bruce I think the movies would have been critical flops.

Like I said, lots of villains, sub-plots, and action. In short, it's a big comic book of a movie. Should you see it? Depends. I liked it, but what do I know. It was a damn sight better than the last three Star Wars movies. In case you missed it, Spidey 3 raked in $148 million this weekend in spite of nearly every critic's attempt to trash it.

It'll be interesting to see if Sony makes any more Spider-Man movies. It will also be interesting to see if any other summer 2007 blockbusters make as much money; I'm thinking Pirates, Fantastic Four, and Harry Potter in particular. Who knows. This may be one hell of a summer for movies.

Saturday, May 05, 2007

NetBeans and Eclipse release new milestones

Friday was an interesting day. NetBeans released NetBeans 6 Milestone 9 (NB6M9), and Eclipse released Eclipse 3.3 Milestone 7 (E3.3M7). It should be noted that NB6M9 is now considered a general preview release, just in time for JavaOne. I downloaded and installed both. The changes in NB6M9 are somewhat more significant than those in E3.3M7. What follows are some notes and impressions from initially installing NB6M9.

There are now three packages (or bundles) for NB6: Basic, Standard, and Full. With Basic you get just the IDE (Java SE development, the GUI Builder, and the Profiler). With Full you get Basic plus Java EE, Mobility, UML, SOA, Ruby, and Sun's Application Server. This is a fundamental change from (and a break with) earlier versions, where you downloaded the equivalent of Basic and then used the Update Center (Tools | Update Center) to pick up other functionality, such as UML or Ruby support, or you went to one of the supporting projects and downloaded that package (such as the Profiler or Mobility). The Update Center is gone starting with this milestone, so if you want any of those other capabilities my advice is to download Full and tailor your install.

I downloaded and installed NB6M9 under Ubuntu 7.04, where I also have Java 1.6.0 Update 1 installed. The 1.6.0u1 version of Java is significant in that, with it, NetBeans 6 uses the Gnome look-and-feel out of the box as its default look-and-feel, even for the installer.


One of the first dialogs to be displayed by the installer (a standard shell (sh) script) is a list of all the features available with the package. You tailor this with the 'Customize' button at the lower left. When clicked, the Customize button presents a dialog that allows you to choose what features you want installed.

As you can see I've deselected a number of features because I don't need them. I have no idea what would happen if I decided to add those features in the future. Would I re-run the shell script and add them back in? It's an experiment I need to perform. One problem I ran into with the customizer was the right side feature description panel.

As you can see above, it disappears under certain circumstances. In fact, there was considerable flickering in that panel along with text that seemed to be flowing across the panel. I could not widen the dialog enough to widen the panel, so I don't know if some sort of error was being displayed in the panel. But it didn't stop the customizer from working or interrupt the installation.

After a successful installation I went looking for the Update Center, and found it replaced (along with the module manager) by the Plugin Manager (Tools | Plugins).

The Plugin Manager is a significant and very welcome change over the earlier feature and module management tools in NetBeans. Everything is now in one location and far better organized. Because major features are now part of the download bundle the number of installable modules has decreased dramatically, which is also a welcome change. And finally (finally!) there is a decent description about the modules themselves. Plugin Manager is very polished and professional looking.

Once a module is installed it's a simple matter to find it and either deactivate it or uninstall it.

I have yet to try out existing features in NB6M9. I had stopped testing the milestones, specifically with milestone 8, because I'd reached a point where updates had rendered them useless. I look forward to working with NetBeans 6 milestones again, and hope this time that updates don't render it inoperative and force me to remove it.

Thursday, May 03, 2007

09 f9 11 02 9d 74 e3 5b d8 41 56 c5 63 56 88 c0


Dear AACS,

Fsck you. And yes, I do know how to use this.

Sincerely

Update

New AACS cracks cannot be revoked, says hacker

Only a few days after Corel issued a WinDVD update to close the hole opened by AACS hackers, the folks at the Doom9 forums sent word that they have found yet another way around the copy protection for high definition discs. This time, the method involved the Xbox 360's HD DVD add-on drive to capture the "Volume Unique Keys" as they were being read by the drive itself. Rather than just point out the crack, we're going to take a closer look at how this crack was accomplished, because one of the hackers involved in the crack says that it's more or less unstoppable.

The latest attack vector bypasses the encryption performed by the Device Keys—the same keys that were revoked by the WinDVD update—and the so-called "Host Private Key," which as yet has not been found. This was accomplished by de-soldering the HD DVD drive's firmware chip, reading its contents, and then patching it. Once that was done, the firmware was soldered back onto the drive.

Despite the technical difficulty of performing this hack, it does offer some advantages in the race to beat AACS copy protection. "They cannot revoke this hack," said forum member arnezami, who has been at the center of much of the AACS cracking recently. "No matter how many Private Host Keys they revoke we will still be able to get Volume IDs using patched xbox 360 HD DVD drives."

Get your XBox 360 HD-DVD player at Newegg.