The Future of Computing

Last month I had the pleasure of being at Intel’s Microprocessor Research Labs (MRL) in Santa Clara for an Open House they were holding for the press and internal staff. MRL does research in technology likely to hit “the street” (retail channel) anywhere from three to eight years into the future. What was being shown was very impressive, to say the least.

To start off, MRL is working very hard to make sure Moore's Law continues its exponential march. The lab showed work it is doing, in partnership with other chip manufacturers, on "extreme ultraviolet" lithography. Taking over from traditional optical techniques, this will lead to sub-0.1-micron components. At the same time, 300 mm wafers will mean ever more chips can be cut from a single slice of silicon.

OK, so we're going to continue to have ever more processing power, more than most people know what to do with even now. What do you do when you can't buy a machine slower than a mainframe of only a few years ago? MRL has some suggestions, and as you might imagine, they're rather computationally intensive.

The "Office of the Future" (OotF) demo is a good example. Within a corner office cubicle, two high-resolution video projectors are arranged so that their displays appear adjacent to each other on the cubicle walls. A third projector is mounted above and projects down onto the desktop. A single computer drives all three displays, resulting in a 3D workspace; windows can be moved between them as desired.

Now, things get interesting. Instead of a mouse, a special 3D camera watches the user and their hand gestures. Computer vision software (this is where the heavy lifting for the CPU comes in) takes the camera data, recognizes what the user is doing, and translates this into instructions for the computer: scroll up, move window left, and so on.
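As a toy illustration of that last translation step, suppose the vision stage has already reduced each camera frame to a hand position; mapping the movement between frames to a command might look like the sketch below. The thresholds and command names are invented for illustration, not taken from Intel's demo.

```python
def gesture_to_command(prev, curr, threshold=20):
    """Map the hand's movement between two frames to an interface command.

    prev and curr are (x, y) hand positions in screen coordinates,
    where y increases downward. Movements smaller than the threshold
    are ignored as jitter."""
    dx = curr[0] - prev[0]
    dy = curr[1] - prev[1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return "idle"
    if abs(dy) >= abs(dx):
        return "scroll up" if dy < 0 else "scroll down"
    return "move window left" if dx < 0 else "move window right"

print(gesture_to_command((100, 100), (100, 40)))   # hand moved upward
```

A real system would, of course, spend most of its cycles on the recognition stage that produces those hand positions in the first place.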

For an added twist, Intel had two such setups at the open house. They were linked together by way of pinhole video cameras and microphones hidden behind a tiny hole in the middle of the wall displays. Thus, the two OotF users could video-conference between themselves in a very natural-seeming way; you could tell whether the other person was looking at you or at someone beside you.

Another demonstration was the "Voice Portal", which could understand spoken words from a user without prior training. It has a very large vocabulary, and can handle continuous speech (instead of each word having to be spoken distinctly). With the dream of a "Star Trek"-like interface one step closer, the more immediate uses include voice interaction for automated telephony and speech-to-text applications.

Multi-modal input is likely to become more common as CPU power allows more analysis to take place on the input data streams. To encourage such applications, Intel announced at the open house the availability of the Linux version of their Open Source Computer Vision Library. This library is intended to be a "substrate" upon which both CV research and commercial applications can be developed.
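To give a flavour of the kind of primitive such a library supplies, here is a toy Sobel edge detector in Python. It is purely illustrative and not part of Intel's library; real CV libraries provide heavily optimized versions of operations like this as building blocks for recognition.

```python
import numpy as np

def sobel_edges(image):
    """Compute per-pixel edge strength in a grayscale image.

    Slides the two 3x3 Sobel kernels over the image and returns the
    gradient magnitude; strong responses mark brightness boundaries."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = image.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = image[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    return np.hypot(gx, gy)  # gradient magnitude

# A synthetic image: dark on the left half, bright on the right.
img = np.zeros((8, 8))
img[:, 4:] = 255.0
edges = sobel_edges(img)
# The edge response concentrates at the brightness boundary;
# flat regions produce zero response.
```

Even this crude filter hints at why vision work is so hungry for CPU cycles: every pixel of every frame gets touched, many times over.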

Also announced was the Open Runtime Platform, which is an open source framework for building run-time environments, taking care of memory management, garbage collection and linking issues. Although not of any use to consumers, it's of huge interest to geeky developers. The technology will work its way down to the street, embedded in other products.
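Garbage collection, one of the chores such a runtime handles, can be sketched in a few lines. The classic mark-and-sweep algorithm flags every object reachable from a root, then reclaims everything else. This toy version is illustrative only; it is not how the Open Runtime Platform is actually implemented.

```python
class Obj:
    """A heap object holding references to other objects."""
    def __init__(self, name):
        self.name = name
        self.refs = []
        self.marked = False

def mark(obj):
    """Mark phase: recursively flag everything reachable from a root."""
    if obj.marked:
        return
    obj.marked = True
    for ref in obj.refs:
        mark(ref)

def sweep(heap):
    """Sweep phase: keep marked objects, discard the rest."""
    live = [o for o in heap if o.marked]
    for o in live:
        o.marked = False  # reset flags for the next collection cycle
    return live

# Build a tiny heap: a refers to b, while c is unreachable garbage.
a, b, c = Obj("a"), Obj("b"), Obj("c")
a.refs.append(b)
heap = [a, b, c]
mark(a)             # a is the root the program can still see
heap = sweep(heap)  # c is collected; a and b survive
```

Production collectors add generations, incremental pauses and much more, which is exactly why having a shared open framework appeals to runtime builders.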

Intel has always been a company to watch, having led the microprocessor market for many years. Although AMD can now claim to sell the fastest x86 processor, and startups like Transmeta are nipping at its heels, the research done by Intel's labs will continue to influence what we see in the marketplace for years to come.

Published in the Victoria Business Examiner.

Step Off the Upgrade Treadmill

Increasingly, the questions “Just how much money are we spending on desktop computing software?” and “Could this cost center be better managed?” are being asked. It seems there’s a new upgrade every year or so, at great expense, for operating system and applications both. Why is this necessary? Where is the benefit?

For some reason, the PC culture has developed the attitude that if you're not running the latest version of everything, you're doing something wrong or are too cash-poor to afford the latest. No matter that what is currently deployed may be doing the job required of it just fine. No consideration that maybe your workers really are more productive without any helpful suggestions from "Clippy".

But most feel they simply must upgrade, or they'll no longer be compatible with everyone else. It's called the upgrade treadmill. A new version of a software package is released with several bug fixes, a few interesting new features, and, oh, by the way, old versions aren't going to be able to read the new version's files. Thus, through network effects, you're forced to upgrade in order to read others' files.

It’s ludicrous. And it’s about to change, this time in the office suite arena.

StarOffice is a very capable set of programs, including a word processor (Writer), spreadsheet (Calc), presentation application (Impress), a vector drawing tool (Draw), a database management tool and a PIM scheduler, not to mention e-mail, news group and web client functions. Scripting (automation) is done using Java or JavaScript. StarOffice is available for Windows, Linux and Solaris, and is able to read and write complex Microsoft Word and Excel documents.

Last year, Sun Microsystems purchased StarOffice, and made it available for free download to anyone; commercial use is OK. On October 13th, the source code to the entire package will be released under an Open Source license (actually, two: the LGPL and the SISSL). Full details are at (URL: www.openoffice.org).

What does this mean? Well, anyone who wants a full-featured office suite can go to the Open Office site, follow the links, and download one for free. It's a big download, but it won't expire, and it can read and write files compatible with people using the latest versions of many products. Sun has just taken $850 out of Microsoft's pocket, and given it back to the consumer.

The Open Source side of things means development on StarOffice will quickly accelerate, likely to match that of other Open efforts like Linux, Samba and Apache. The Gnome Foundation, formed last month (URL: www.gnome.org), is going to take the StarOffice code and integrate it with the Gnome Desktop, making it more modular and adding CORBA interfaces. Sun is also providing 50 engineers to contribute to the effort full time.

To most consumers, this move by Sun is a potentially huge win. But not to all: it will be interesting to see what Corel does with its own WordPerfect product in light of this development. At the very least, Corel will want to contribute file filters, as StarOffice does not currently read WordPerfect files. Corel may decide to Open Source WordPerfect as well, but only time will tell if they have that much foresight and concern for their users.

In the longer term, there's a shift towards storing data in XML format files. Users must be aware, however, that XML is not a data interchange panacea: vendors can still refuse to document their formats fully. XML is just a framework for defining schemas, and it can be abused and made just as incompatible as any binary format.
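The point can be made concrete with a toy example: two vendors can both emit perfectly well-formed XML for the same record, yet neither can read the other's without a purpose-built filter. The element names below are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Two well-formed XML documents describing the same invoice,
# using made-up, mutually incompatible schemas.
vendor_a = "<invoice><total currency='CAD'>850</total></invoice>"
vendor_b = "<Bill><Amt cur='CAD' value='850'/></Bill>"

def read_total_a(doc):
    """A reader written against vendor A's schema only."""
    return ET.fromstring(doc).find("total").text

print(read_total_a(vendor_a))  # vendor A's own file parses fine
# read_total_a(vendor_b) would fail: vendor B has no <total> element,
# even though both documents are perfectly valid XML.
```

Interchange comes from agreeing on (and documenting) the schema, not from the angle brackets themselves.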

For StarOffice users, responsive development means new filters will appear quickly to read newly incompatible formats from other vendors. If you use another product, ask the vendor why the incompatibility exists. What benefit does it provide you, the customer, rather than them, the vendor? Just who's paying whom?

Or maybe, just maybe, step off the treadmill. It’s free!

Published in the Victoria Business Examiner.