Look out for Windows XP

The great Beast of Redmond is at it again, throwing their monopoly weight around. (Raise your hand if you’re surprised.)

The latest, greatest, gotta-have-it version of Windows, XP, is on the way, due at the end of October. Beta releases of the new system have been around for a while now, and include a new Internet Explorer, version 6. This parallels the release of Office XP.

Some of XP’s new abilities are starting to raise eyebrows. Take, for example, “Smart Tags”. These are hyperlinks which show up automatically in documents created in Office XP applications: hovering the mouse over a tagged word pops up a list of associated links. This list, of course, is controlled by Microsoft.

This wouldn’t be a problem, of course, except that the feature is also being evaluated for inclusion in IE6. For web authors, this is a scary proposition: every web page suddenly becomes full of links off to the sites and products of Microsoft and its partners.

The ramifications of this are wide, and reaction to the feature has been almost universally negative. Web authors are horrified that their (copyrighted) content will be modified in this way. There will be a META tag which disables the feature, but many authors are discussing bringing a lawsuit for copyright infringement instead.
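For authors who would rather opt out than litigate, the escape hatch Microsoft has documented for the beta is a single META tag in the page’s header:

    <meta name="MSSmartTagsPreventParsing" content="true">

Of course, this puts the burden on the author of every page on the web, which is precisely what has them so annoyed.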

Cookies have also received attention in the new IE6, which implements the new Platform for Privacy Preferences (P3P) protocol. By default, only cookies which carry a machine-readable “Compact Policy” (CP) acceptable to “Microsoft’s internally set standards” are allowed.
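For the curious, a Compact Policy is just a short string of tokens sent in an HTTP header alongside the cookie. A site’s response might include something like the following (these particular tokens are illustrative, not a real site’s policy):

    P3P: CP="NOI DSP COR NID CUR OUR NOR"

Each token summarizes one clause of the site’s privacy policy, and IE6 compares the set against its defaults before deciding whether to accept the cookie.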

To no one’s great surprise, this means competitors to Microsoft’s MSN and other online properties, such as the advertising service DoubleClick, find their cookies rejected. DoubleClick claims that its cookies will be working by the time the final release of XP is out. It will be interesting to see just how many companies are able to “make friends” with IE6, and thus survive.

Another new “feature” in Windows XP is the tying of the Windows installation to a particular computer, with mandatory online activation to ensure it isn’t pirated. This has raised privacy concerns, since it means Microsoft knows a lot more about you, and it also makes things like motherboard swap-outs much more difficult.

And, speaking of motherboards, any computer destined for Windows XP should probably get an upgrade first. The minimum recommended system is a Pentium III with 64 megs of RAM, although realistically you’ll need 256 megs and a fast processor to be comfortable.

Oh, one last thing: say goodbye to MP3s. Windows XP won’t play or record them by default, and adding third-party software to do so has been made quite difficult. The technology presented instead, of course, is Microsoft’s own Media Player. To hell with what the users want; Microsoft is delivering what the media companies demand: copy-protected content.

Looking at this latest release of the Windows platform, you’d think the antitrust lawsuit against Microsoft had never happened. And, in fact, it is generally assumed that the US Court of Appeals will reverse, or seriously limit, Judge Jackson’s decision against Microsoft any day now.

Competitors continue to lobby both federal and state governments, in the US and abroad, pointing out the never-ending anti-competitive actions of Microsoft. Even if the Bush administration chooses not to appeal to the Supreme Court, the state Attorneys General currently involved with the MS trial have vowed to continue.

When will this all be settled? Obviously, not for years to come. As the slow wheels of justice turn, the range of consumers’ options for mainstream solutions continues to shrink. The good news is that consumers are starting to realize they don’t necessarily need another upgrade, so there’s some question as to how quickly XP will be adopted.

But for those who just gotta have it, the treadmill eagerly awaits.

Published in the Victoria Business Examiner.

Fear not the command line!

Computers are hard to use. This is true no matter how long you’ve been using them, or which type you use. The reason is simple: computers are stupid — as dumb as bricks. They have to be told exactly what to do, and how to do it.

This is often more difficult than it might at first seem. Programmers have been known to shout “DWIM!” at their terminals in frustration when some language or process doesn’t interpret a command or sequence the way they intended. DWIM stands for Do What I Mean.

The debate as to the best way to deal with this complexity has been raging for many years, with two main camps: the Graphical User Interface (GUI) champions vs. the command line (console) holdouts. The GUI supporters believe the best way to make computers easy to use is to present everything using graphical metaphors, with What You See Is What You Get (WYSIWYG) being a big selling point.

The console jockeys, on the other hand, believe instead that the best way to interact with a computer is by typing instructions (commands), and receiving the results back in text form. Most command line users come from Unix backgrounds, where the entire operating system is designed to be controlled in this way. GUIs under Unix are available, but always sit on top of, rather than being, the operating system.

Admittedly, for simple tasks, graphical metaphors are great: point and click, drag and drop, highlight and alter. Such environments have made computers accessible to those who would otherwise be overwhelmed by the technical complexity of operating one. Provided the programmers have predicted what operations the users are going to want to perform, this type of abstraction can ease everyday work.

But there are limits. Some types of communication do not map well to graphical forms, and thus not all options can be presented. Computer programming itself, for example, doesn’t map to a two-dimensional representation very well. Visual Basic notwithstanding, when creating software of any complexity, good old text files are the way code is usually expressed, containing human-readable C or C++.

There’s also the problem of users wanting to perform an operation which the GUI programmers didn’t envision. As an example, given a directory full of source code files, how would you find out how many lines of code they contain? Lines of code is a very common metric for software developers.

On either Windows or a Mac, this question cannot be answered without loading all the files into a software development environment which happens to have the ability to report this statistic. Basically, special software is required because those creating the GUI never anticipated the question. You, the user, are out of luck.

On the other hand, if you’re running a Unix (or Windows with the Cygwin tools) command line shell, the following sequence will give you the total number of lines contained in all .c and .h files: “cat *.c *.h | wc -l”. This is actually a compound of two commands: cat simply concatenates all the files onto standard output (STDOUT), and the vertical-bar pipe symbol directs that output to the wc command, which counts the lines (as requested by the -l option).
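Those same building blocks recombine to answer variations on the question. A couple of sketches, illustrative rather than the only way to do it:

    wc -l *.c *.h                # per-file line counts, plus a grand total
    cat *.c *.h | grep -c ';'    # a rough count of C statements instead

The pipe is the key idea: each program does one small job well, and the shell glues them together.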

As you can see, with a total of 20 keystrokes it is possible to answer a common question which simply cannot be answered in a GUI environment. The only downside is that you need to know the appropriate commands to run, and what the available options are. The console is quick to intimidate, but those who spend a bit of time learning the most common commands quickly discover how powerful it can be.

Unfortunately, many do not invest the time, and so never learn how to work in a console environment. While this is fine for regular end-users, it can be a serious shortcoming for those in technical roles — there may be times when an interactive console is the only access one has to a machine. It is common, for example, for Unix administrators to do work on machines in different cities, or even different countries. This is almost always done by way of a text console.
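These days that console is usually reached over the network, via telnet or, increasingly, the encrypted secure shell (ssh). Logging in to a hypothetical server in another city takes nothing more than:

    ssh admin@server.example.com

after which every command typed runs on the far machine, exactly as if the administrator were sitting in front of it.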

As pointed out by Neal Stephenson in his essay In the Beginning was the Command Line (available at www.cryptonomicon.com/beginning.html or most bookstores), those who are able to directly interact with a computer via its command line interface are the most likely to understand what is going on inside. Sure there’s an investment to learn the commands, but with that investment comes understanding.

An exclusively GUI user, on the other hand, may or may not understand what their various mouse clicks are actually doing. Is the GUI really doing users a favor when it lets them mis-configure a server with only a few clicks? Is hunting around a control panel, trying different buttons and configuration options until things work, really the job description of a modern system administrator?

Until computers actually begin to understand commands like DWIM, the command-line interface will remain an important skill for any serious computer operator. Point and click isn’t going away anytime soon, but neither is the good old command line.

Published in the Victoria Business Examiner.