February 2001 Column




On getting it right (and wrong)

Back in Shopper 144 (on sale in January 2000) I took a stab at predicting what the next year would hold for Linux. Of course, I got a lot of things wrong; however, I knew I was providing hostages to fortune when I made those predictions, so I'm not ashamed to stand behind them. ('Sides which, I got one or two things right.)

The interesting thing is that, over the medium term, the computer industry as a whole is very predictable. It's like providing a weather forecast for the Sahara: "very hot tomorrow, cold at night, rainfall about as likely as the Pope announcing his conversion to Islam." We know damn well that for the next twelve months Cisco will continue to dominate the router market (which is growing as more companies move towards b2b and b2c sales), that Intel will continue to dominate the microprocessor market, and that Microsoft will make money hand over fist while paying the lawyers to fend off the anti-trust suit via a long-drawn-out series of appeals. So why is the Linux field so unpredictable that someone with his eye on the ball can screw up radically when forecasting what the sector will look like in 12 months' time?

First, a look at what I predicted in Shopper 144. "Linux is fragmenting into three or more distinct segments, and these are going to be the scene of vicious fighting between companies who see them as useful markets" -- well yes, and the sky is blue. The first market sector I fingered was small to medium sized departmental servers, and I predicted that Red Hat would lead, despite heavy competition from SuSE and TurboLinux. (So far, so good -- that's actually pretty close to what's been happening.)

The second sector is the desktop; I predicted that Corel would make inroads in the sub-£500 market and that there'd be heavy competition in the office suite field -- "I'd therefore be unsurprised to see Applix bought by one of the bigger distributors within the next year and their core product given away for free in an attempt to expand in the critical desktop segment, much as StarDivision was bought by Sun in an attempt to provide a solution to the MS Office requirement for their users". What I got wrong here was the assumption that Corel would survive the transition to the Linux sector (they nearly didn't); I also missed the radical possibility that Sun might open source StarOffice and that Applix might retreat into the ASP field. This kind of stuff is difficult to predict accurately because it's hard to separate commercial propaganda ("Foo dot com announced today that they are going to sell their foo processing applications on Linux for thirty bucks, with a fifty dollar cashback offer, in an attempt to gain market share and make up their losses by increasing sales!") from commercial fact ("Foo dot com is nearly bankrupt and has been driven out of the foo market by the announcement of Microsoft's MicroFoo product (not yet released); they don't have the money to survive the transition").

The third market sector I stuck a pin into was embedded systems. I said, "Most of the embedded systems based on Linux through 2000 will probably be thin servers; things like the Cobalt Qube or Rebel.com's NetWinder. But I expect some radical surprises, like the cellular data-enabled touch-screen device Nokia demonstrated in 1999. It is not beyond the realm of possibility for Linux to mount a serious challenge to Symbian in the cellphone operating system market, at some time in the future."

Well, Cobalt was bought by Sun for a couple of billion dollars, because their low-end servers offered a way for Sun to get into that thin server market. And embedded Linux has taken off faster than a greased whippet. So I'll claim that one as a predictive success, sort-of -- although it required about as much foresight as not walking into a brick wall.

Anyway, so much for corporate wibble. What were the mistakes?

I predicted that KDE 2.0 would be out by early summer 2000. That Mozilla would also be out by summer 2000. That kernel 2.4 would ship before the end of the first quarter, and that Borland Delphi (aka project Kylix) would show up in 2000.

Every last one of those predictions was over-optimistic. Kylix has definitely been demonstrated, and is more or less in beta; but KDE 2.0 hung fire until November 2000, Mozilla was barely up to 0.6 as of the end of the year, and kernel 2.4 is still slouching towards an uncertain millennium by way of interminable point releases. It's as if the cool, free air of the open source world has mysteriously been replaced with treacle!

This actually highlights an interesting point; the progress of an open source project is not dependent on marketing milestones. Companies that make their living by selling products are stuck on a constant treadmill of releases, because once their product has been on sale for a while it will begin to look increasingly old, and in addition the early adopters will already have bought their copies. In the commercial software market, old software is often seen as dead software: customers (who want to feel that they're using a product that is supported) get uneasy and look elsewhere.

There are other, more exploitative reasons for commercial software to be re-released frequently; Microsoft's churn of office file formats springs to mind. (If you're in business you need to be able to read the latest versions of the Excel and Word file formats; if Microsoft release a new version, enough of your customers or suppliers are likely to buy it to ensure that you've got to jump on the bandwagon so you can continue to exchange files with them. This is an example of file formats being exploited to force users to pay out for regular upgrades -- and it also keeps competitors at arm's length because their products can no longer read the 'standard' file formats.)

The open source sector simply doesn't play by these rules. The eighteen-month release cycle is an alien intrusion; instead we get the "when it's ready" release cycle (which makes for more robust software, but longer waits between releases). And this is where my predictive screw-ups came in: in every case I was wildly optimistic about the time scale of an open source development, forgetting that the real impetus for rapid version upgrades comes from commercial considerations.

Finally, there's the absolute howler of an industry trend that I missed -- IBM's bid to get out from under Microsoft's thumb. How soon we forget: until 1994, IBM was the biggest shark in the aquarium that is the computer industry. Then it all went horribly wrong -- or rather, a whole bunch of mistakes came home to haunt Big Blue, culminating in a one-time loss larger than the GDP of most countries and a massed wailing and gnashing of teeth from the inmates of corporate data centres who had run them for the past three decades on the basis of the maxim that "nobody ever got fired for buying IBM".

IBM's initial mistake was to lose track of the PC market, resulting in the divisive (and ultimately futile) attempt to grab control back by way of the MCA bus, PS/2 machines, and OS/2 operating system. Their subsequent mistake, which compounded the first, was to drop the ball on convergence -- on building an infrastructure that would have allowed all their machines (from PCs to mainframes) to talk the same protocols, use the same user interface, and run compatible applications. When Microsoft picked up their ball (OS/2 release 3, renamed "Windows NT") and walked, the massed DP staff followed -- Microsoft became the new IBM.

Until 1999, IBM seemed to have been ignoring Linux. But Linux is very good at creeping into the corporate undergrowth. By the end of 2000, Linux had begun to surface in a key role at IBM -- as one of the two keystones of IBM's bid to achieve total convergence between all their systems (with one operating system running on everything, and one binary executable format -- Java -- running on every machine in their range). You can run Linux on a System/390 mainframe; you can recompile a Linux app trivially easily on an RS/6000 minicomputer running AIX 5L (the "L" is for "Linux compatible"); you can buy Linux on a ThinkPad laptop or a Netfinity server. There're even rumours that Linux for the AS/400 will surface before long. IBM is pushing Java, and has open sourced a Java compiler and runtime engine. The writing on the wall is clear: IBM is gearing up to achieve a long-term goal that eluded it in the early 1990s, and if IBM succeeds it will be able to offer a solution to every DP problem (and put a chicken in every consultant's pot).

There are other rumblings. Michael Dell standing up on a platform to give a keynote speech at a Linux expo was one of them; inconceivable in 1999, it turned into reality in 2000. Linux is becoming an item of interest to the mainstream computer industry, although open source projects are slower to deliver results than was once expected.

So what do I predict for the coming year?

More progress in the server arena. Possibly a 1.0 release of WINE, the Windows emulation environment (which now seems robust enough that attempts are being made to splice it into the KDE and GNOME desktops transparently). Considerable growth in the embedded sector is likely, and a number of PC manufacturers will be shipping Linux on their machines. Compaq are moving to Linux as a strategic platform for their high-end Alpha-based equipment, and we can expect to see Linux make big inroads into the commercial UNIX field, displacing just about everything except Solaris and the very high-end servers. It's also going to eat a lot of low-end Netware servers unless Novell do something about their shrinking market share.

There's a slow but visible trend for some closed source software to go open source; hopefully this will continue as developers discover that they can still make money off support contracts and documentation.

The biggest threat to Linux uptake, Microsoft's .NET infrastructure, has yet to materialise -- a number of different ASP services are promised, to be sold on a subscription basis, and it's possible that Microsoft will ultimately decide to make Linux a supported client platform (so you could use, say, Office.NET on Linux -- but your data and the application server would sit on a Windows 2000 machine). That's their vision of the future, and there's no sign of them changing it any time soon.

As for Apple ... we see a company selling technically elegant, cube-shaped computers. The computers run BSD UNIX with a cute graphical user interface on top. The CEO of this company is Steve Jobs. This is enough to cause serious déjà vu, for a decade ago exactly the same sight was on display -- only the company was NeXT. The really interesting question for 2001 is how Mac OS X will stack up against Linux. Is it possible that by the end of 2001 Linux will be in the #2 operating system slot, overall?

Staying SANE

I was originally going to devote the balance of this column to discussing scanner support on Linux. Linux is capable of importing images from a variety of capture devices, including flatbed scanners and digital cameras; there's a general architecture for doing this called SANE (short for Scanner Access Now Easy) but fate is conspiring to stop me writing about it. First, a network outage at Telewest stopped me getting to the SANE home page; then ... well, this photograph:

[Photo: one comprehensively fucked scanner]

demonstrates exactly why it is a good idea to unlock your scanner before trying to capture an image.

In self-defence, I should like to note that it is a good idea for scanner manufacturers to (a) indicate in the manual that unlocking the scanner before use is a Good Idea, (b) put the lock somewhere prominent rather than disguising it as part of the case or hiding it under the scanner lid's hinge (and omitting it from the diagrams in the manual), and (c) put a switch in the bloody thing so that the scanner won't tear itself to pieces if some baffled old hack, confused by modern technology, ignores all the portents and presses the "capture" button anyway. But I can quite understand why this is not the case, as such precautions might result in a downturn in sales of scanners to replace those that self-destruct messily on first use.

I then attempted to salvage the article by appropriating my partner's scanner; but she's in the throes of magazine production this week, and put forward such persuasive arguments (backed up with a six D-cell Maglite, a nail gun, and various muttered threats) that I think maybe I'll leave SANE for another month.
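
In the meantime, for anyone whose scanner is still in one piece, this is roughly what driving SANE from a script looks like. It's a minimal sketch assuming the python-sane bindings are installed (they've historically been shipped alongside the Python Imaging Library); option names and values vary from backend to backend, so treat it as illustrative rather than gospel:

```python
import sane  # the python-sane bindings -- an assumption about your setup

sane.init()                         # initialise the SANE library
devices = sane.get_devices()        # [(device_name, vendor, model, type), ...]
if not devices:
    raise SystemExit("no scanner found -- is the backend configured?")

dev = sane.open(devices[0][0])      # grab the first scanner SANE knows about
dev.mode = 'color'                  # option names/values are backend-specific
dev.resolution = 300                # dots per inch

image = dev.scan()                  # returns a PIL image object
image.save('scan.png')
dev.close()
```

(And no, none of the above will save you if the carriage is still bolted down.)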

Remember, folks: gadgets with springs, bits of wire, and electric motors are not your friends. Especially when they self-destruct messily on first use, two days before a deadline ...

Spinning a yarn

This leads me on to one of my perennial rant topics: web design. Web design -- the design of pages for the world wide web -- may not sound like a topic that belongs in a Linux column. But it does; because there's a lot of really bad web design out there, and this is particularly obvious if you surf the web using a non-standard browser (that is: anything other than Microsoft Internet Exploder 5.5 on Windows ME).

In the beginning there was Tim Berners-Lee, at CERN, with a whole raft of different types of computer. And Tim wanted to get data from the VAXen to the Ultrix boxes and thence to the Apollo workstations running DomainOS, and to his NeXTstation. So Tim invented a lowest common denominator protocol, and a simple text markup language, that could be displayed on just about anything more complicated than an etch-a-sketch and transmitted between any hosts more advanced than two tin cans and a wet piece of string.

Over the course of the following months, the web was discovered, and turned into something commercial (God only knows why). And two companies (both staffed by very bright people who should have known better) got into a war for dominance of the desktop; and their web browsers succumbed to what Larry Wall, inventor of Perl, calls "feeping creaturism", and most of us call "creeping featurism". Around the time some enterprising souls were experimenting with hooking Netscape up to control a coffee machine via JavaScript, some other folks at companies that should remain moderately nameless (such as Adobe or Macromedia) decided it would be a good idea to plug their favourite publishing and multimedia systems into the web.

Which is why the web, today, is a squiggly, mind-numbing mess of Lovecraftian proportions. HTML has been bent, spindled, and mutilated into a variety of shapes and sizes; Acrobat (PDF) and Shockwave presentations clog up corporate sites, and perfectly sane documents are rendered unnavigable by the replacement of straightforward hyperlinks with buggy JavaScript.

The commonest idiocy of the web designer is to assume that every computer will display a document the same way it appears on their own. This assumption is more or less accurate for PDF, which is a page layout format -- but HTML wasn't designed for that. The result is that graphic designers try to transfer their skills from paper to a new medium and make no allowance for systems that have perfectly capable graphical browsers -- able to display HTML 4.0 entities and process JavaScript or CSS1/2 style sheets -- but which fundamentally display a different number of dots per inch on the screen. Windows is a non-WYSIWYG platform to begin with; it displays screen fonts scaled up by roughly 30% for legibility (a hold-over from the original VGA display adapter -- Windows assumes 96 dots per inch where most other platforms assume 72). Thus, many web designers scale down their fonts, to the point where they're unreadable on a non-Windows platform.
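
To put numbers on it: a font size given in points maps to pixels via the display's assumed resolution. The arithmetic below is the standard point-to-pixel conversion; the little helper function is mine, purely for illustration:

```python
# A font size in points maps to pixels via the display's assumed resolution:
#   pixels = points * dpi / 72
# Windows assumes 96 dots per inch; classic Macs and many X servers assume 72.
def pixel_height(points: float, dpi: int) -> float:
    return points * dpi / 72

for points in (7, 9, 12):
    print(f"{points:2d}pt -> {pixel_height(points, 96):4.1f}px at 96dpi (Windows), "
          f"{pixel_height(points, 72):4.1f}px at 72dpi (X11/Mac)")

# A designer who shrinks body text to 7pt because it looks "right" on a 96dpi
# Windows box is serving seven-pixel glyphs to everybody else.
```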

Another common problem is the assumption that only people with one particular browser and operating system count. I've run across many web sites that display a simple message: "this site is best viewed using Netscape 4 or Microsoft Internet Explorer 4 -- click here to upgrade". This is particularly galling when I'm actually using Netscape 4 -- the sites are so badly designed that they can't recognize a Netscape-compatible browser running on a different operating system, and make no attempt to degrade gracefully on less fully featured browsers. Naming no names, Dixons PLC are one serious offender; another is Psion. This is as customer-friendly, in e-commerce terms, as a retail establishment announcing a dress code at the door -- you can't come in and shop unless you're wearing the clothing they expect to see you in.
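
The root cause is usually a lazy user-agent check. The sketch below is hypothetical -- I haven't seen anybody's server code -- but the User-Agent strings are in the genuine formats these browsers send, and the failure mode is exactly the one I keep tripping over:

```python
# Both of these are genuine User-Agent formats; both identify a
# fourth-generation, Netscape-compatible browser.
IE55_ON_WINDOWS = "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)"
NETSCAPE_ON_LINUX = "Mozilla/4.76 [en] (X11; U; Linux 2.2.18 i686)"

def naive_check(user_agent: str) -> bool:
    # The sort of test a badly built site performs: match one browser on
    # one operating system, reject everything else.
    return "MSIE" in user_agent and "Windows" in user_agent

def saner_check(user_agent: str) -> bool:
    # Better: test for the capability level actually required.
    return user_agent.startswith("Mozilla/4")

for ua in (IE55_ON_WINDOWS, NETSCAPE_ON_LINUX):
    print(f"naive={naive_check(ua)} saner={saner_check(ua)} {ua}")

# The naive check turns away Netscape 4 on Linux -- the very browser the
# site claims to support.
```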

And then there are those egregious multimedia objects. I'm going to single out Flash (Macromedia's web-based multimedia presentation system) for specific abuse, but all these criticisms apply to any attempt at using the web for multimedia. As Jakob Nielsen remarks, "About 99% of the time, the presence of Flash on a website constitutes a usability disease. ... Flash tends to degrade websites for three reasons: it encourages design abuse, it breaks with the Web's fundamental interaction principles, and it distracts attention from the site's core value." Nielsen should know; not only is he a design guru, but he's one who has studied -- and understands -- the web in a way that most designers don't. (You can find out more about him at useit.com.)

If anything, Nielsen understates the case against Flash. Firstly, Flash is useless to the visually handicapped. Worse, sites that use it usually don't bother providing navigation alternatives for users who don't have elaborate graphics systems or who are physically incapable of using such systems; a willingness to use Flash seems to be symptomatic of the idea that the web is a purely visual medium.

(But the web isn't used only by human beings with good colour vision. It's used by blind folks, and it's used by spiders and robots. If your content is Flash-ified, how's it going to get indexed? And if it isn't indexed, how will your readers find it?)

Secondly, there's bandwidth. I've never known a Flash-enabled website to use a Flash object smaller than 100KB. While cable modems and DSL might be gaining ground in the EU and USA, they're by no means standard yet -- and it'll be a long time before the biggest overseas audiences (China, India, South America, and so on) have this sort of bandwidth. As there are more English speakers in China than in the UK, this is not an audience to sneer at.
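
Do the arithmetic. This is a rough sketch, and the throughput figures are typical estimates rather than measurements, but the conclusion is hard to dodge:

```python
# Time to fetch a 100KB Flash object over typical links. Throughput figures
# are rough effective rates (assumptions on my part), not benchmarks.
OBJECT_BITS = 100 * 1024 * 8    # a 100KB object, in bits

links = {
    "33.6k modem": 30_000,      # ~30 kbit/s after protocol overhead
    "56k modem":   45_000,      # ~45 kbit/s on a good line
    "512k ADSL":  450_000,
}
for name, bits_per_second in links.items():
    print(f"{name:12s} {OBJECT_BITS / bits_per_second:5.1f} seconds")

# Call it half a minute on a modem -- before a single word of content loads.
```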

In a nutshell, much current commercial web design conspires to alienate or exclude the non-Microsoft user as much as the visually handicapped. This is partly due to clueless design staff, and partly due to clueless management (who equate lots of glitz with lots of work, which in turn deserves lots of the money they're forking out to pay for it).

Luckily, the tide is due to turn in 2001. WAP is a medium that makes a non-graphical web browser like Lynx look like the ultimate in feature-ridden bloatware; if you want your website to be usable on a mobile device (be it a Palm Pilot or a WAP phone) you need to adopt a minimalist approach, and this will ultimately force the extreme design school of website architecture into retreat. Aiding this trend will be requirements for websites to be accessible to the disabled -- the same principle that already accounts for lifts and ramps in shops should equally well require e-commerce websites to support non-graphical navigation and the browsers used by the blind. All of this is good news for Linux, as the excuse "but it can't browse the web properly" (which is a half-truth at best) will hold even less water than it does today.

