Thursday, February 21, 2008

IBM, the PS/2 and Why PCs are as they are...

Herewith an IM exchange from a few years back in which I opine as to when, and how, IBM lost control of PC design and standards. "Tristram" is Darren, a Macolyte par excellence.

Myself: Greetings.

Tristram: How goes it?

Myself: I am having new iMac hallucinations.

Tristram: As am I.. however, my need to keep my finances in some semblance of order has quelled the uprising. I DO think the thing will sell at a solid rate, however.

Myself: Oh hell yes.

Myself: And I am angrier than ever at the so-called computer makers on the PC side. Looking at the iLamp, I mean, iMac, one realizes how little design and engineering (and thought of any kind) goes into PCs. They really have become mere office supplies, like staplers and fax machines and the like.

Tristram: Pretty damn close thereto, that's true. Function far outstrips form. But for so many PC users, that's just fine.

Myself: All too true. If ya think about it, we really shouldn't speak of "Dell computers" or "Compaq computers" apart from servers or high-end workstations (which actually are designed from the ground up). Apart from those specialty machines, most "brand" PCs are nothing of the kind. They are wholly interchangeable. Someone else makes the mobo, the chip, the drives, etc. Dell or whoever just assembles them (badly).

Tristram: Well, Dell is actually said to be fairly competent. I don't know about the individual MAN, Dell, but they say his company puts together a fairly passable unit.

Myself: Botheration. However, I think Steve is onto something. Apple has to be, well, different. They can't win the "office equipment" battle, because of issues of cost and scale that are - at least for now - insurmountable.

Myself: When I show my PC pals the new iMac pictures, they openly smile. "That's so....great..." is the usual response. And these are longtime, cynical computer guys.

Tristram: Wow. That's amazing to hear... and what did they think of the original iMac three years ago, or the Cube a year and a half ago, just out of curiosity?

Myself: They disliked the lack of a floppy drive and no onboard SCSI port - which even I had to admit was a very fair criticism - a lot of these guys had expensive SCSI devices, and how much would it have cost, really, to slap a 25-pin SCSI-2 connection on the side, next to the USB ports? They loved the iMac's design, though.

Tristram: Well, that is comforting. So they are able to discern what is good and what is chaff.

Myself: As for the Cube, they raved about the design but thought it was pricey (esp sans monitor) and wondered about its market prospects. One said, "You can't have design for design's sake. The Cube should win all sorts of awards... but will it SELL?"

Tristram: But in neither case did they scoff or laugh at the bold designs.

Myself: No. They, like me, love design. We're all ID disciples. The highest achievement of any tool is transparency; do the job without getting in the way of the job.

Tristram: I agree.. thus the appeal of the entire Mac solution.

Myself: Agreed. Although, of course, versatility imposes complication. No way 'round that. But what I like about Apple is that they keep plugging.

Tristram: Yes.. that's definitely the case.. as these machines become more capable, they become increasingly complex and unstable.

Tristram: Until the advent of the "Modern OSes" that we now have, where a best-of-all-worlds situation can (theoretically) exist. We'll see.

Myself: Right. PCs are horrible in this regard, although WinXP is a big step up. But we must remember that PC development was marketing driven once IBM lost any kind of control (circa 1986-87).

Myself: From that point, security and stability didn't matter. Make 'em cheap, build a lot of them, and sell as many as possible. Oh - and maintain backward compatibility at all costs, as a selling point so folks won't have to repurchase their software libraries when they upgrade. Marketing, marketing, marketing.

Tristram: Right. To that end, meet the parallel port.

Myself: The tech-heads at the OEMs knew then, and know now, how shoddy most of the design was, and how screwy the software was. But their hands were tied. They loudly complained, but they weren't in charge.

Myself: I consider the 86-87 period a watershed in PC tech and system design, not the intro of the AT platform in 84 or the 386 or even the price collapse engineered by Compaq in 1991/2. I have a strange reason, but I think it holds up.

Tristram: Do tell. Most less-informed tech-heads would put the focus on a later time, perhaps, or earlier, I would think. But I'm sure you have a cogent rationale behind this.

Myself: Ok, here it is: in 1986, IBM realized they had utterly lost control of the PC. Dirt-cheap, 100% PC-compatible mobos were arriving from the East (Taiwan, esp), Intel would sell the chips to anyone, and - of course - Microsoft couldn't license DOS fast enough.

Myself: Because IBM had used off-the-shelf parts to make the PC (for reasons of cost and speed of design back when they first rolled it out), they had no control at all over the hardware. The only real "IBM" part was the BIOS, which Compaq cracked - followed by many others. So, what to do?

Myself: IBM's answer was: when you're losing the game, change the rules.

Tristram: Most Kirk-like of them.

Myself: (Trivia aside: IBM's cost per unit for the original PCs was VERY low. They actually ran a smooth, lean operation as far as PCs went. The problem was the enormous, and I mean enormous, overhead of the rest of IBM. But IBM could - and did - build the actual PCs as cheaply as Compaq or anyone else did.)

Myself: Anyway... IBM decided to "restart" the PC market with a new series of machines. These were IBM all the way, designed in-house and proprietary (and patented) from snout to tail. No off-the-shelf serial ports or third-party chips here.

Myself: They secured exclusive licenses from Intel and fixed the machines so they would only run IBM's PC-DOS, not the general Microsoft MS-DOS. In order to shoot the aftermarket card makers through the head, IBM redesigned the mobo and introduced MicroChannel 32-bit architecture.

Myself: IBM, in other words, was going out of its way to make a computer that was NOT compatible with the original PC/XT/AT series.

Tristram: Right. Interesting. A whole new beast that ran their OS well... Mac-like in the sense that they controlled the complete product.

Myself: Exactly. Take THAT, clone-makers! IBM also planned to roll out OS/2 as the preloaded OS - but most of them wound up running PC-DOS because OS/2 wouldn't be ready for several more years. (The OS/2 fiasco deserves its own conversation...)

Myself: So, they called these new machines the Personal System/2 (PS/2) and rolled them out in 1987 with massive publicity. They even offered a "cloning license": for a steep price, you could start an OEM company and build specialty PS/2s (license revocable at will by IBM).

Myself: Some of the PS/2s were marvelous. The PS/2 Model 80, as I recall, could be disassembled like Lego bricks for service. The PS/2s introduced the 3.5" diskette, VGA graphics, the now-familiar PS/2-style mouse and keyboard connectors, and a host of other things. There was some solid design behind these things.

Tristram: I remember the PS/2 concept, but I didn't know what was behind it all.

Myself: But they were expensive and proprietary, and the performance results were mixed. Had the things been barnburners, the computer world might be very different. But they weren't. Your average Compaq DeskPro 386/20 could match them, cost a third less, and was industry standard for expansion boards and drives. And that was the killer.

Myself: For the first time in PC-dom, the term "industry standard" not only did not refer to what IBM was doing - it was opposed to what IBM was doing. IBM had actually "Apple'd" itself into opposing the Intel/AT mobo/MS-DOS platform juggernaut; they were knocked flat.

Tristram: Hmm. Interesting in the extreme. Amazing that these cobbled-together machines, these bastardized Frankenstein's monsters of the computing world, could actually DO anything with something approaching, and even exceeding, the aplomb of these finely-tuned machines.

Myself: The kiss of death came when it was discovered that the PS/2 series had trouble running versions of Lotus 1-2-3 and Flight Simulator (these were the "reference" software programs used to gauge PC-compatibility) because of video issues.

Tristram: Or Apple's equally-finely-tuned machines.

Myself: Yup. And the PS/2's fate was sealed. A number of PC-clone makers formed the so-called Gang of Nine and publicly swore they would have nothing to do with the PS/2 licensing, MicroChannel or any of it. Shortly afterward, they created a 32-bit "new" - but backward compatible - expansion card and slot design called "EISA."

Myself: IBM just fell apart after that, and would not recover until the mid-90s. A fair number of PS/2s were sold to the government and large businesses (no surprise there, eh?) but as far as PC-land went, IBM became irrelevant.

Tristram: Hmph. Well... that's fascinating stuff. I suppose the combined might of a thousand companies pursuing performance and compatibility from a hundred different angles was enough to overcome the singular excellence of IBM.

Myself: Right. Exactly. And that's the problem. With IBM off the throne, leadership passed to... nobody. The only 'leadership' was a committee of hundreds (the cloners, Intel and Microsoft, and the 3rd party software and hardware makers) whose only possible collective decision was this:
    Don't have a repeat of the PS/2. Don't take chances. No radical design advances. Maintain compatibility at all costs.

Myself: And we've been there, more or less, ever since. Thus endeth the story. Hope it wasn't too boring.

Tristram: Not at all. Very, very interesting. Though the end result is pretty boring. Thankfully, software design hasn't been quite so stagnant through it all.

Myself: No, thank God for software. And, to be fair, there is something to be said for enduring standards. Alas, I don't see any way out of it. The PC market is so massive that advances are slow, grudging and troublesome.

Tristram: Right. The question becomes - when will the industry MANDATE a change? When will the needs of the customer dictate that something needs to change?

Myself: The "cool" stuff that happens seems to occur only when the expanding PC market bumps into something that already exists. Internet, digital video and audio... these things existed "outside" the PC before you could do them ON a PC. Which is why I think Apple has got it right. Computers are no longer some enclosed world of apps and games and printers and the like... They have to "mesh" with consumer electronics and other communication technologies.

Myself: And the thing about Apple is, they seem to really care about "meshing" properly. At the recent Keynote, Jobs mentioned what a pain in the ass digital photos can be, with different apps to capture, edit and organize photos.

Tristram: Right. Such is the case in many fields of digital lifestyling.

Myself: This is so true. Everyone knows it. But the PC folks won't move on it because - God forbid - that would mean presenting a new standard for integrating these tasks and taking a risk. The PC attitude is one of resignation. "Yeah, it's a pain, but it works, more or less." I'm guilty of this myself.

Myself: The Steve said another interesting thing: we would find it absurd to read, but not want to (or be able to) write. Yet in this video age, we are nearly all spectators, not creators. Dammit, cameras and good editing software are essential to our time. Why NOT have a zillion people making their own movies? Lots of folks have stories to tell. Okay, most of it will be crap - but so what? They're just electrons. Erase it and try again.

Tristram: So true. The key, however, becomes informing the masses that this is something that can be done, should be done, and is now available. And to let them know WHERE it's at, before the PC world stumbles into something approaching parity.

Myself: Right. And call me an elitist, but quality costs money. Macs are, on balance, pricier than PCs because of market share, economy of scale, etc. So be it. But you still get one HELL of a computer for the money, and the excellence of design and ease of use DOES make a difference. And at $1299, the new iMac entry-level is a bargain.

Tristram: Longevity counts, as well.

Myself: Yes, that, too. Buying a Mac is like buying a car. Three years on, it will still be there.

Myself: Speaking of: after the "Megahertz Myth," Apple needs to take head-on the "upgrade" myth. Most people - the solid majority - should NOT have to buy a new computer every 18 mos or two years. That is, pardon my Swahili, bullshit.

Myself: Yes, some will always need the "latest and greatest." But for most folks, the iMac you buy now should serve you faithfully for years. This whole despicable scheme of building (and buying) crap computers because, hell, in 12 months I'm just going to toss it anyway and get a new one, has got to go. This was marketing brainwashing by companies that based their business models on massive volume sales of soft/hardware and therefore had to convince us there was something "wrong" with us if our machines were older than 2 years.

Myself: My 3-year-old iMac runs like a clock. Only thing I ever did was add RAM.

Tristram: True enough. The built-in obsolescence thing has been eaten up by the buying public. They really believe in it. Of course, on the PC side, sometimes the two-year obsolescence myth holds true!

Myself: "Oh no, the new 10Ghz MetaPentium liquid-nitro-cooled PCs are out with particle-accelerated video cards and 20 GB of RAM! I must have one! How can I run Word and get email without it! Where's my credit card?!"

Tristram: Not to mention the old couple that came into the Hauppauge Apple store in mid-November, purchased a Dual-800 G4 with not one, but TWO Cinema Displays and an additional video card to drive the second beast, all expressly meant for... you guessed it... word processing and Internet surfing.

Myself: Wow. Maybe with that rig, they can get RealPlayer to work.

Tristram: Hey - let's not predict wildly here.

Myself: True, true. Speaking of, I wish Apple would release these Keynotes and other QuickTime goodies as downloadable files (for those of us with broadband). No matter what I try, they always look like crap.

Tristram: Same here. That WOULD be nice, yes....

Myself: Akamai servers, my butt. In other exciting news: I got x-rayed this morning, and after one of the shots, I hear the tech (behind the screen) say, "Oh, crap." File that under
    Things You Don't Want To Hear.
It turned out his pen was out of ink.

Tristram: Most definitely! Do you think you'll make it?

Myself: If I use a different pen, yes.

Apple's Dilemma II (2006)

While I am a Mac fan, I have little patience for overheated rhetoric about Apple vs Microsoft as some kind of sacred struggle between Vision and Compromise. I mean, really.

Apple, at this point, has little choice but to emphasize design and the effective integration enabled by their proprietary platforms because it's all they have left. What else are you going to sell the Mac on? Price/performance? The software library? Please.

I think the Mac advocates' strongest points are these:

1) The open architecture of the PC is a mixed blessing. Yes, you can customize and modify PCs in every possible way, but this also opens the doors for every kind of functional and compatibility problem. We PCers take a perverse pride in our tinkering abilities, but most of us have to admit that we are "technical" by necessity, not desire, because the PC platform is a house of cards compared to the Mac - although things are a lot better now than under Win3.1/95/98.

2) The PC architecture serves a "lowest common denominator" interest far more than the Mac's. "Backward compatibility at all costs" is almost a design rule with PCs - and woe to those who truly try to innovate or improve. Not even IBM could get away with it: when they tried to "bump up" the PC to the next level with the PS/2 architecture, control of the industry was ripped from their hands by the Cloners - a committee of companies led by Compaq dedicated to s - l - o - w progress, if any, in how computers are used.

(Look at how long ancient DOS-code persisted in Windows 95 and 98, at the cost of terrible instability and insecurity in those OSes. Think that was an accident? Lazy programmers? Nope. Legacy software and hardware required it.)

We can't be too harsh, though. Companies that spend literally thousands or tens of thousands on hardware and software do not want vendors telling them every 18 months or so, "Ok, everyone out of the pool. We're changing everything." Ironically, Apple's small market share actually frees them to take chances that no other OEM would dare. Look at the way Apple TWICE successfully migrated their entire business across hardware (68k -> PowerPC) and OS (MacOS -> OSX) lines. I have a hard time seeing a PC cloner doing that - assuming they'd have the nerve to even try.

3) Gear-savvy PC folks who come into discussions like this - including me - with their scratch-built, ultra-cheap PCs and hands-on knowhow, are not typical of computer users. Computer users are getting less 'techie' with every passing year. People can debate the desirability of this, but it's a fact. The fact that you can "roll your own" PC means very little to most PC users. For them, the quality of goods and ease of use presented by the Mac are highly compelling.

Side note: What kills the deal, as I said before, is the PRICE. Even if they can't do the homebrew thing, they know people who can. I have built a dozen PCs for friends and coworkers in the past few years and easily half of them were considering a Mac. They just can't look past the price. You can lecture them all you want about interface elegance and iTunes and whatnot, but they see giving me $1000 to build them a good PC vs spending $1300 for the cheapest iMac or $2500 for the cheapest PowerMac.

4) The "software gap" has everything to do with market share and nothing to do with the technical merits of the PC architecture. Half-Life would have run just fine on the Mac, but wasn't released for business reasons. Similarly, cutting-edge hardware folks (like those 3d accelerator makers) roll their stuff out on the PC first simply because there are more of them, and they'll sell more cards.

5) Microsoft's power has got to be reduced. I don't begrudge them the credit they are due. They've done very good work and - for the most part - earned their success. (Thanks, Steve.)

But Microsoft has reached a point in their corporate growth where innovation of any kind takes a distant second place to smashing all potential competition - not by making and selling better stuff in a "May the best man win" way, but through anti-competitive (and illegal) measures. This was the conclusion of the Court after an exhaustive investigation (and was common knowledge in the computer and software industry before the trial).

There is also the little fact that they were found guilty in court of breaking the law to maintain their monopoly - during which case the company was caught presenting faked 'evidence' regarding certain aspects of Windows and IE.

Of course, the actual punishment was a joke. Proof again that if you're big and rich enough, you can pretty much ignore the law.

Microsoft's current goal - which the management of the company makes no secret of - is to remove control of the desktop operating system from the end user.

This will be accomplished in two ways:

First, the gradual shift from a 'retail purchase' to a 'subscription' model. You won't just buy Windows once for your PC, you'll rent it and pay and pay and pay...

This allows Microsoft to keep the Windows "cash cow" going. They knew all along that high volume OEM and retail OS sales and upgrades wouldn't last forever, that at some point most homes and offices wouldn't be buying new computers as often as they were in the late 80s - early 90s. The subscription model is their way of making you buy Windows all over again at regular intervals, whether you need to or not.
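
To put a number on "pay and pay and pay," here's a back-of-the-envelope sketch in Python. The prices are invented for illustration - they are not actual Microsoft figures - but the shape of the curve is the whole point:

    # Hypothetical comparison: buy the OS once vs. rent it yearly.
    # Both prices are made-up illustrations, not real Microsoft pricing.
    ONE_TIME_PRICE = 199.00      # pay once, use indefinitely
    YEARLY_SUBSCRIPTION = 60.00  # pay every year, forever

    for years in (1, 3, 5, 10):
        paid = YEARLY_SUBSCRIPTION * years
        print(f"{years:>2} year(s): one-time ${ONE_TIME_PRICE:.2f}, "
              f"subscription ${paid:.2f}")

    # Output:
    #  1 year(s): one-time $199.00, subscription $60.00
    #  3 year(s): one-time $199.00, subscription $180.00
    #  5 year(s): one-time $199.00, subscription $300.00
    # 10 year(s): one-time $199.00, subscription $600.00

Under these made-up numbers, the subscriber passes the one-time buyer during year four and never stops paying. That's the cash cow, automated.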

Second, changes in the Windows OS architecture and functionality will make it increasingly difficult (if not impossible) to run Windows off-line as a stand-alone product. Your PC will become less a PERSONAL computer and more of a USER TERMINAL dependent on centralized data services.

This is the real motivation behind their enthusiasm for a wired world and increased deployment of broadband Internet access, as well as the new Windows activation scheme.

All this would be less worrisome if Microsoft wasn't the only show in town, but it is. Even J. P. Morgan and Standard Oil never enjoyed the level of market control that Redmond now holds.

Microsoft is not evil. It's simply a company trying to protect and maximize its revenue sources. I understand that. But there are some fights it must not be allowed to win. Apple is key to this. But for Apple to make a real difference, it needs to move beyond producing boutique computers for the Pottery Barn set and take a real bite out of Windows' market.

Apple's Dilemma I (2006)

Let's talk money.

What kills the Mac for a lot of people - regardless of what an ID showcase it becomes - is the sticker shock.

Forget the Mac Mini and iMac for a moment and look at the Mac Pro - you know, Apple's REAL desktop computer.

The entry-level model is $2499.00 without a monitor. Add $599 for Apple's entry-level monitor, the 20" Cinema Display, and we're at $3098.00 - call it $3100 - without any other purchases. Three thousand dollars - for Apple's entry-level equipment.

Sure, it's nicely put together and the included software is very impressive. But for three grand I could put together an Athlon or Pentium machine fast enough to travel through time... AND have my software options increase tenfold, if not more.

What amazes me is how successful Apple has been in convincing its loyal customers that it is David in the struggle vs Goliath (IBM initially, then Microsoft) when - as others have pointed out - Apple is the North Korea of computing environments.

The 'closed' nature of the Mac, then and now, is no accident. Many books about Apple and admissions from its own ranks have made it clear that Apple uses the Mac's proprietary technology to maintain hardware profits as much as for any kind of "quality control" over the "entire user experience," as Mr. Jobs recently said.

Historically, Apple became too accustomed - one might say, too addicted - to making money off the Mac's hardware exclusivity to ever let it go, even when letting go might have meant long-term growth in the market share and user base for the Mac OS.

Since this also meant that Apple had to eat almost every dime of R&D for the Mac, it could not profitably license the Mac OS to "cloners" because they, not encumbered by Apple's overhead, would cut Apple's throat in a price war unless Apple "taxed" them heavily for each machine sold, in which case, the clones become nearly as costly as Apple-branded Macs, in which case, WHAT'S THE DAMNED POINT?!

(Indeed, this is exactly what happened with Power Computing and others when Apple DID briefly license the Mac OS.)

This double-edged sword of hardware control is also why none of the many "port the Mac OS to the (name of chip)" projects in the 80s and 90s ever came to anything.

There has been progress, to be sure. Apple has increasingly made its peace with "PC" technical standards in things like monitor plugs, hard drives, and RAM modules. ADB is gone. SCSI is gone (mostly).

The "killer app" for the Mac OS is the OS itself and always has been. THAT is the 'Mac Experience,' not a MoMA case design. Apple must, in the end, give up its fetish for hardware control if it ever wants to be more than a marginal player, and must get the price of entry for the Mac OS platform down to a competitive level with Wintel computers. Unless that happens, nothing else will matter.

The bottom line for me is: Apple builds clever and pretty computers that cost about twice that of the competition's machines of equivalent power and have a fraction of the software support.

My price comparison concerns the Mac Pro for a reason: because the iMac is a closed system, it is not a fair comparison to a mainstream PC. This does not mean the iMac or Mac Mini are BAD, or that the quicksilver nature of PC technology and mutability of PC subsystems are unqualified blessings. They can actually be massive pains in the backside, as I encountered recently when assembling a new Athlon/XP system from Microsoft Approved reference parts that just refused to work (I wound up replacing the mobo).

However, if by "computer" we mean a system where you can make significant changes or upgrades to the key systems (CPU, internal fixed disks, video cards, etc.) then we're not talking about something like the iMac. It and the Mini are great products, but "frozen." It's more like buying a laptop - or even a piece of stereo equipment or a television.

This premium pricing also affects the success and spread of OSX. OSX is GREAT - although its adoption by Apple was an admission that Apple had dug a hole with the old Mac OS that it couldn't escape. But as long as you can only get OSX on such an expensive platform, it will remain marginalized.

We can - broadly - sum up the scene thusly: Apple makes better computers, but the Wintel computers are cheaper, support more software, and are GOOD ENOUGH.

Therein lies Apple's dilemma: its foes can "build up" through continual refinement and enhancement of Windows - and with XP and Vista, they largely have - while Apple cannot "price down" to match them.

If you want a Mac that's not a closed system, you have to start - to START - at damn near $3000. The charm of iMovie burns away pretty quickly in the face of that.

With market share, of course, comes software - or the lack of it. At a certain point, developers will simply not bother to do Mac versions of their mainstay titles. Yeah, we can all run Office. So what? In most specific business applications, the Mac is simply absent. You could easily spec out a new company's entire network - servers, desktops, laptops, network attached storage, etc. - and ignore the Mac entirely.

Yes, even Mac strongholds like Photoshop work perfectly well on Windows boxes that are - even when you adjust for huge sticks of RAM and the zippiest video cards - cheaper than Macs.

Allow me an example from my own company. We do residential and commercial security system and camera monitoring. There has been a huge shift over the past several years from large, expensive (and finicky) custom-built systems to a smaller "LAN" architecture to automate the handling of video and alarm signals.

The companies that write and maintain the software essential to running these systems - on both the central/server and dispatcher/workstation side - would laugh out loud if asked, "Um, does it run on a Mac?" It's Win2000, XP or (sometimes) Linux. That's it.

No doubt anyone reading this could give you dozens of examples, right off the top of the head, of software markets where the Mac either never existed at all, or has vanished. And, once again, it's not because the Mac technology is bad.

PCs, Macs & Hassles

What is it with this fetish of endless hardware tweaking among PC users? Some so-called high performance PCs remind me of those hyper-modified cars that are showpieces of technology but are hardly even driven because the engineering is so unstable.

Also, I think we have already reached the point of diminishing returns in all but the most demanding software. Sure, Photoshop eats computers for lunch, but I've still got a Pentium 3 in my office with "only" 512MB of RAM that runs WinXP and MS Office2K like a champ. It also handles email and web surfing just as well as any nitro-cooled, 733t h4xx0r monstrosity you could find.

Sure, I've got a game computer "dragster" but it's a freak machine. I don't do serious work on it and if it blows up, I won't lose anything valuable. (I back up my config files and saved games routinely.)

A computer that is cheap and fast but also unstable and high-maintenance is not a good machine for important work.

Another thought: one of the editors of MacToday (Scott Kelby, I think) once made the point that when you browse the shelves of Mac books, they are almost entirely devoted to actually DOING things with your Mac. That your Mac is working properly is assumed.

The PC library, on the other hand, is chock full of diagnostic, repair and troubleshooting volumes. At least half the PC books at my local Borders are devoted to the problem of PCs simply not working.

An observant eye will also notice how many of the top-selling PC software titles, year after year, deal with simply keeping the various versions of Windows running or negotiating hardware hassles.

Ask any computer support person how often they've had to respond to emergency calls for Macs constantly crashing or going haywire - even in a Mac-heavy environment. This might account for the MIS community's opposition to Macs; their very existence presumes offices filled with computers that need constant attention... look at a Mac office and, 99% of the time, you'll see some MIS guy reading Wired all day long and a layer of dust over his stack of "emergency recovery" disks.

The Left Hand of Dorkness

I have an Xbox 360 video game console. I am left-handed. This is a problem.

It is a problem because every single video game controller made for this console is designed for right-handed people.

The controller is designed with the assumption that you will use the right thumbstick to look/aim and the left to move. This is fine for right-handers. Left-handers, however, prefer (in some cases, absolutely need) to move with the right stick and look/aim with the left - just as we do with computer mice, pens, brushes, scissors, handguns, etc.
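
For the technically inclined: a "Southpaw" option is close to trivial to implement - it's just a remap layer between the controller and the game logic. Here's a minimal sketch in Python; ControllerState and its fields are my own invention for illustration, not any real Xbox API.

    # Sketch of a "Southpaw" remap: swap the roles of the thumbsticks
    # and triggers before the game logic ever sees the input.
    # ControllerState is a hypothetical stand-in, not a real XB360 interface.
    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class ControllerState:
        left_stick: tuple    # (x, y) deflection, each -1.0 to 1.0
        right_stick: tuple
        left_trigger: float  # 0.0 (released) to 1.0 (fully pulled)
        right_trigger: float

    def southpaw(raw: ControllerState) -> ControllerState:
        """Mirror the layout: the sticks trade jobs, the triggers trade jobs."""
        return replace(
            raw,
            left_stick=raw.right_stick,
            right_stick=raw.left_stick,
            left_trigger=raw.right_trigger,
            right_trigger=raw.left_trigger,
        )

Run every frame of input through one function like that and a game becomes left-hander-friendly. That is the entire engineering burden the industry keeps refusing to carry.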

While certain A-level games like Halo 1/2, Gears of War and Call of Duty have what are called "Southpaw" control options, where the triggers and sticks can be switched, many other games do not have anything. In my own Xbox 360 experience so far, Lost Planet, Saints Row, Dead Rising and most Xbox Live Arcade games including Doom and Assault Heroes make no effort to accommodate left-handers.

Add to this the inexcusable fact that nobody makes a left-handed XB360 controller; not Microsoft, not MadCatz, not Logitech, not Nyko, not GameStop. This drives me almost to the point of rage; Microsoft is designing every kind of gizmo imaginable to interface with the XB360's wireless support, even one to enable the console controllers to work with Windows, but I can't get a goddamned left-handed controller? Another victory for the PC, I guess.

Indeed, this is an old problem. With the notable exception of Nintendo's Wii, video game console systems have always punished me for being left-handed. I have had to grin and bear (grip and bear?) the lack of left-hand support for 30 years now and I have had enough. Why shouldn't I and the millions of other left-handers have a proper controller? I've actually sent letters and made calls to Microsoft & peripheral makers about this. Their responses ranged from blasé dismissal to incredulity that anyone would even ask for such a thing.

Of course, Microsoft could simply produce (or authorize another company to produce) a proper mouse & keyboard set for the XB360, which would solve this whole left/right handed thing once and for all. Microsoft could also make it mandatory that game developers include alternate control options in their XB360-licensed games. Don't hold your breath.

Well, screw them all. I want some left-handed hardware. I want an actual, for-real Southpaw Controller. If I can't buy one, I'll make one or pay someone to do it for me.

Pursuant to this, I found and downloaded a guide to modifying the XB360 wired controller for left-handed use. Since my using a soldering iron is expressly forbidden by the New York State fire code, I contacted the author and asked if I could pay him to do one for me. It turns out that he gets a lot of these requests, so for $60 (including the cost of the controller) he switched the wires on the triggers, thumbsticks and thumbstick buttons. The ABXY buttons and D-Pad are trickier, so I asked him not to bother.

It works like a charm. I tried it out with XB360 Doom, Saints Row and Dead Rising. All good so far. This is the best $60 I ever spent. Well, except for... uh, never mind.

The fellow's name is Ron Alexander. With his permission, I am posting his email address here:

rondcrasher (at) yahoo (dot) com

(Email written thusly to avoid harvesting by spammers.)

This is a stopgap measure, of course. I still want an official, authorized, professionally-made Southpaw Controller. We left-handers should have them. We've waited long enough.

Monday, February 11, 2008

Animal Rites

You know what PETA, Veganism and the Animal Rights movement strike me as? Ideological one-upsmanship among Progressives. People join them and espouse their rhetoric because it raises their status among the NPR crowd.

Think about it: All good, progressive-thinking men - excuse me, PEOPLE - already embrace the "struggle" against sexism, racism and whatnot. Nothing special there. So why not take it to the next level and extend your Liberation Politics beyond humanity? NOW we're talking.

Why settle for White Guilt when you can lament your entire species?

Anyone can march at some protest for equal rights for gay people, but you're well beyond that. You march for equal rights for NON-people!

Why confine yourself to discussions of Privilege and Patriarchy when you can make deep statements about the oppression caused by most animals not having opposable thumbs?

Also, since the production of cheap and plentiful food - like pretty much everything in the modern world - operates on the vast economies of scale and technological efficiencies made possible by modern industry... you get to be mad at Big Evil Corporations for yet another reason!

Finally - and this is really sweet - animals are the ideal "victims" against whose "oppression" you can "struggle." It's the Liberation Politics version of job security - like a college professor getting tenure. Unlike those Eastern European proletarian ingrates who finally told the international Left to stick Marxism up its collective ass, dolphins and factory-farm chickens will never tell you to go screw yourself.

(Side note: I'd be interested to know how many hardcore PETA-types have ever refused critical medical treatment on the grounds that the drugs and/or surgical procedures to be used had been developed with animal testing. I'd bet you could fit them all in my car.)

I suspect that many liberationists - not ALL, mind you, but many - really don't give a rat's ass about those on whose behalf they are supposedly fighting. Like all moral crusaders, they're out to feel better about themselves.

There is also a heavy element of self-aggrandizement and the sense of being part of some kind of Consciousness Elite. This leads them to ever-more-extreme positions, in order to remain in the avant-garde.

For these folks, it's just no fun anymore to focus on human suffering. There's a "been there, done that" feel to the whole thing.

Anti-racism, for example, has succeeded to a degree where it is the dominant thought-mode - more people agree than disagree - and therefore it no longer distinguishes one as a member of the Consciousness Elite.

Which does not for a moment mean that the fight for racial equality is over, only that it's just not as politically sexy as it once was. How, as a good revolutionary, can you épater les bourgeois when they hate the Klan as much as you do?

So you move onward and outward to animals or "the Earth," adopting lifestyle politics and nutritional practices which most people will never be able to follow. Presto! Romantic marginalization and revolutionary purity is yours again! You're a rebel, maaan!

Good Night and Good Riddance

I tried to sit through a showing of George Clooney's Good Night, and Good Luck. I really did. I had to leave when it became apparent the film was just another example of the American Left's historical mythology.

I should have expected as much. Honestly, the Left has such a hold on Hollywood that when I hear some director blathering on about how his next film is going to be "political" or about "issues," I pretty much assume it will be Left propaganda.

Don't hold your breath waiting for a balanced (or even honest) film about McCarthy, et alia. Remember what recently happened when Kazan finally got his due recognition? The white-lipped anger and venom on display? Decades later, when even the Russians have admitted McCarthy was not jumping at shadows, the American Left cannot admit the truth.

I mean, really... would it kill Hollywood to make a movie about McCarthy that acknowledges the reality of Soviet espionage and subversion?

McCarthy & Co. made their share of mistakes, to be sure, but the enemy they were fighting was real and dangerous. Following the end of the Cold War, mountains of espionage and intel documents became available from the former USSR. That, plus now-declassified materials from the USA (such as the Venona transcripts), shows that if anything, Americans weren't paranoid enough about subversion and infiltration.

McCarthy was not perfect - not even the best man for the job, since he cast his net too wide and wasn't patient enough. But he did quite a bit of good. First, and most obviously, the mere act of going on the offensive against Red infiltration put the Soviets on the defensive. They were forced to become much more careful, cut certain recruitment operations altogether, and did in fact lose a number of key operatives to McCarthy and HUAC's investigations.

I find it interesting that "McCarthyism" is a voodoo word coined by the very same American Leftists who spent the entire Cold War bending over backwards to avoid confronting the true nature of the Soviet Empire - a political system that raised repression, paranoia and persecution to an art form.

From the first, McCarthy was loudly criticized by his foes - both CPUSA fellow-travelers and patriotic Americans who honestly thought he was wrong. People against him - common citizens as well as political figures - made speeches, wrote books and newspaper articles, went on TV and radio and readily availed themselves of every inch of their 1st Amendment freedoms.

How many were "silenced?" How many doors were kicked in at midnight, the entire family being dragged away for "questioning?" How many hundreds and thousands of journalists, political opponents, artists, scholars and intellectuals were marched away to re-education/work camps - many to never return? How many were tortured or summarily executed?

Even at the height of "McCarthyism," you could slam Joe McCarthy in a New York Times op-ed, published in your own name, go home that night and sleep like a baby. That's what Edward Murrow did, and if George Clooney considers such efforts to be "courage" on the level of some Russian telling Lavrenti Beria to kiss his ass, he is simply delusional.

I'm also getting tired of hearing about all these supposed "victims" of McCarthy and the anti-Communist effort in general. Who were/are these people? How often did the spotlight fall on good Americans whose words and deeds gave no grounds whatsoever to raise suspicion?

Is the existence of actual sedition among CPUSA members, leaders and sympathizers incidental or irrelevant to how McCarthy and the anti-Communist movement more generally are to be viewed?

Or, let me put it this way: Why is the term witch hunt used so often in conjunction with McCarthy or anti-Communism unless the intent is to suggest that there were as many seditious Communists as there were actual witches in Salem - that is, none?

Why is this? Witch hunt, we hear, over and over again. Could it be that the American Left - and especially their Arts & Crafts contingent in Hollywood - doesn't want anyone to look too closely at just how taken they were with Communism, up to and sometimes including placing themselves in service to the USSR?

If the CPUSA had been a bona fide domestic reform movement, if the anti-communists really were chasing phantoms, that would be one thing. Then I, too, would demonize McCarthy and his allies.

But that's simply not true. The problem was real.

(For how real it was in Hollywood, check out Red Star Over Hollywood: The Film Colony's Long Romance With The Left by Ronald Radosh.)

Witch hunt, my ass. There were no witches in Salem, but there sure as Lenin were Commies in Hollywood and the State Dept.

Every serious discussion of this subject must begin with the recognition that the CPUSA was a subversive fifth-column movement controlled by a hostile foreign power. That is an objective fact.

Having recognized this, the question is (and was): What to do about it? The Bill of Rights is not a suicide pact; our society has both the right and need to defend itself from those committed to its destruction.

It fits the romantic self-image of the American Left to see those investigated by HUAC, the FBI and McCarthy as martyrs to Cold War hysteria and the eee-vil Right Wing, but the facts do not support that story and never did. Of course, Hollywood is the land of dreams, the American mythology factory, and the Leftists who populate it do their best to keep slicing the baloney.

On a more general note, I would call out both the American Right and Left for their respective blind spots regarding 20th century Communism.

The Right saw Commies behind every tree and was tone-deaf to the real social and political injustices that made Communism so appealing to many suffering and oppressed people.

The Left, on the other hand, bent over backwards to avoid noticing just what all that rhetoric about workers and peasants and liberation actually resulted in once the Reds took over a country.

Many on the Left spent the decades from the October Revolution right up until the whole thing collapsed in the late '80s hoping against hope that the USSR or some other Red State (heh heh) would incubate the long-awaited socialist utopia and lead humanity into a new and better age.

Marxist Communism is a remarkably "religious" ideology in this way, and those Americans who went beyond hoping and decided to lend a hand - Hiss, Fuchs, the Rosenbergs, etc. - were not villains; they were True Believers. They honestly thought they were on the right side of history and human morality. (All the same, this doesn't excuse treason or change the propriety of their punishments.)

Nothing dies as hard as a dream, and it should come as no surprise that many on the Left (in Hollywood and elsewhere) were, and are, loath to consciously and publicly confront the fact that the Bolsheviks and Maoists took them for a ride, used them and made a mockery of their democratic, egalitarian and emancipatory ideals.

That's why episodes like l'affaire Rosenberg are still so touchy for some. It's bad enough to have to admit that they were spies, after all, but to also acknowledge that they were naive stooges of a horrible tyrant is just more than some people can bear.

Movie Hate: War of the Worlds, 2005

I broke my movie boycott to see this thing. L Ron would have wanted me to. Observations follow:

(1) Next time, if there is one, I am bringing earplugs or an iPod to the theater. I could not believe the volume of the pre-show screen adverts, loathsome in-theater radio station yammering and coming attractions. My ears were ringing before the film proper even began. Am I the last person who remembers when people could sit in a theater before the film started and just talk to each other, read a magazine and get an early start on some popcorn munching? Ye gods.

(2) To the fellow who sat behind me: I am tall (6'3"). I am wide. I am a visual obstruction. You should have chosen a better seat. Sorry, but there it is.

(3) Ah, the opening. Good Morgan Freeman voice-over. I think he's James Earl Jones' heir apparent for Wise Black Voice now that Ossie Davis has left us. (PS - I would trade the lives of every Gen-X, nose-ringed clown in the theater that night for Ossie to have one more year. What a loss; gifted actor, remarkable man and class act.)

(4) Ah, the film... I was afraid of this. There's an old saying in journalism: Don't bury the lede. It means keep your focus on the point of the story, not the incidental or peripheral. This film sent its lede to the bottom of the Marianas Trench. One might think that the conquest of Earth by aliens (not actually Martians this time) would be the point of the movie, being a pretty important thing, after all. Nope. Here it's merely the setting for a familial rapprochement between an immature father and his children - both of whom are the kind of insufferable "movie kids" I would have doused with A-1 Steak Sauce and thrown to the advancing alien war machines as a peace offering.

(5) Cruise is pretty good, although stopping myself from yelling out Xenu or Katie Holmes jokes caused me actual, physical pain. Truly, I suffer for art.

(6) The aliens are not from Mars, as mentioned previously. Okay. No harm done. And kudos to Spielberg for his old skool use of tripod walking machines instead of the floating manta-rays of Pal's 1953 version - as wonderful as Pal's were.

If the changes had only stopped there. Alas, the 2005 story has the aliens delivering their pilots to machines buried deep in the earth "over a million years ago," as one character observes, rather than the ships arriving in meteor-like LCMs (Landing Craft, Martian).

This revision makes an absolute hash of the invaders' defeat from a microbial Achilles' Heel, since it's inconceivable that such an advanced race, having visited Earth before to bury their machines, would be caught unawares by the basics of Terran biology.

Pal's (and Wells') versions explained this by having the Martians view us from afar, but not actually experience our ecology until the invasion. Wells, in fact, was making a direct reference to the tropical diseases which struck European armies during their colonial and imperial conquests in Asia and Africa.

Spielberg's story has character dialog making pointed references to how well the aliens have "planned" this, yet once they arrive, all it takes is a few deep breaths of our air and a few swallows of water before they're all dropping dead.

Tripod driver Xerghon: Sir, do you smell that?

Tripod commander Gharzhek: Smell what? That's just Terran air.

Xerghon: I don't feel so good...

Gharzhek: Stop goldbricking and get this thing moving. We're due in TEE-neck, whatever that is.

(Xerghon coughs, falls over backwards and dies.)

Gharzhek: What the...?!

(Gharzhek vomits his innards across the control panel, then dies.)

(7) Along those lines, how did hundreds of house-sized war machines, buried a few hundred feet under the ground, go undetected by 20th Century human technology? The film takes place in New Jersey, which was (and is) part of the great post-WW2 buildup of America's east coast. Highways, commercial and residential surveying, tunnel excavations for sewers and utility lines... I doubt there's a single square mile of that area which hasn't been explored, mapped, thermal-imaged, sonar'd and whatnot.

(8) George Pal's version alternated scenes of government and military leaders making plans with the personal story of Dr Forrester. By keeping the story locked on Cruise's character and his brats, Spielberg loses the larger view to the detriment of the story. The aliens are handily conquering the world... and then they all just die.

Bottom Line: As befits a Spielberg film, the technical quality is outstanding. And there are a few scenes - such as the hellish, apocalyptic spectacle of a burning train roaring through a railroad crossing - that are simply marvelous.

Alas, there's too little of that and too much family drama. Near the end I wanted the aliens to disintegrate them all so the narrative could move on to a more interesting viewpoint.

A few other things:

- How did Cruise's son survive? Do the alien war machines have a Great Walls of Fire attack which can obliterate a battle line of armored vehicles but is too weak to take out one moody teen in a hoodie?

- The "All Tom's Children" material was bad enough, but why did the writers have to hit rock bottom by including the I must! / You can't! moment between Tom and his son?

Dad (yelling into son's face): You can't go! I won't let you!
Son (yelling into dad's face): I have to go!

Dad: I can't lose you again!
Son: I have to see! I have to fight!

Dad: No!
Son: Yes!

I mean, COME ON. Does it get more clichéd than this?! When you have characters literally yelling out their motivations to the audience and each other, are you even telling a story anymore?

Finally, was there a scene cut wherein it's explained that the red vines were part of the aliens' (orig. Martians') ecosystem, transplanted here in order to remake the environment to their liking? Note to screenwriters: not everyone read Wells' book.

PC, In Brief

The central tenet of PC is that people are basically ticking bombs of racism, sexism, homophobia (add your favorite "ism" here), and this bestial inner nature can only be controlled through sanitizing every aspect of intellectual, cultural and aesthetic expression, so as to remove any triggers that might set them off.

Of course, the great unwashed cannot be expected to exercise internal control. Therefore, self-appointed, elect groups of Sensitive People must enforce external control and clean up everything that might negatively affect the consciousness of the masses.

A complementary PC belief is that the members of certain designated "Victim" groups - such as racial or religious minorities - have mental states so porcelain-fragile that great measures must be taken to ensure their emotional comfort and the integrity of their self-esteem.

Thus, when one such Victim declares that something is "offensive," the complaint must be treated with the gravest sincerity and resolved with all haste. This need to avoid "offense" and "insensitivity" trumps any and all concerns about freedom of expression.

There's more to PC, of course, but this is the essence of it.

The Good War

First, read this post by the formidable Brian at Peeve Farm.

I've thought about this a bit myself. Here's my (somewhat edited) response to Brian:

I'm of the opinion that the modern Left loves to reverently invoke WW2 because it's a safe way to claim they're not completely out to lunch when it comes to national defense and related issues. See, they can say, we were all for fighting Hitler!

It also doesn't hurt that the Nazis were/are the perfect foes for the Left, whose hive mind is notoriously emotion-ruled. Hitler and the gang were downright demonic - right out of Central Casting - and made to order.

The Nazis were also, let's not forget, the target of the "anti-fascist" united front preached by the CPSU and picked up by the American Left, echoes of which can be heard to this day in the Leftist habit of calling anyone to the right of them a "fascist." This is just plain weird, as the Bolsheviks were at least as fascistic as the NSDAP. Yet how many Leftists, past or present, would equate the Hammer & Sickle with the Swastika as a symbol of evil and oppression?

Of course, when Stalin and Hitler kissed and made up (with the corpse of Poland as a token of mutual friendship), most of the American and European Left turned on a dime and, well, so much for the united front.

I can't help wondering... if WW2 had never happened, and a Nazi-Bolshevik entente had gone on to swallow up most of Eurasia into totalitarian darkness, would the contemporary American Left still use "Nazi!" as their epithet of choice? Would "progressives" regard the Final Solution with the same almost-total silence and willful ignorance displayed whenever the subject of Communist mass-murder came up? Hmm. Topic for another day, I suppose.

There is also the issue of phony courage, something near and dear to many "progressives." Hating the Nazis and celebrating their destruction was the safest political stance one could take in the post-1945 era. In the past 200 years, no regime has ever been as comprehensively liquidated as that of the Third Reich. Even the most recent pre-Hitler European conqueror - Napoleon - escaped with his life, and one of his Field Marshals founded a royal dynasty in Sweden!

The Swastikettes, on the other hand, were hanged, shot, driven to suicide or chased to the ends of the Earth. GOOD, says I. But the point is that dancing on the grave of Nazism and idolizing the war that dug that grave was a pretty risk-free activity after WW2; no matter what you said about the Nazis, there was very little chance that Otto Skorzeny was going to show up at your door with a Luger.

On the other hand, being comprehensively opposed to tyranny - you know, like a true classical liberal - was another kettle of fish entirely. As we both know, a regime just as bad as Hitler's survived the war and its blood-soaked leaders mostly died in bed (unless killed by each other or by Stalin, the worst snake in the nest).

Add to this the decades-long, collective admiration and apologia by "progressives" for the Soviet State - for most of this period, 'Kremlin' was to 'Leftist' as 'Mafia' was to 'gambling' - and the right-thinking Leftist had a real dilemma.

Solution? More WW2 nostalgia, please. Big Red was an ally, the evil Nazis were the foe, and everything was ideologically comfortable.

Let's also not forget to notice the prevalence of the European theater over the Pacific in such nostalgia - even though we fought the Japanese for longer than the Germans. The German Nazis, being Caucasians (indeed, fancying themselves the Whitest of the White) pass the PC test whereas the Asian Japanese make for a somewhat less comfortable foe for the Sensitive.

There's also that messy Atom Bomb thing. And in terms of snappy uniforms, it's no contest.

A Peace of Their Mind

Just finished watching one of those current events chat shows wherein some German whined - as per usual - about how much more "advanced" Europeans are over the crazed cowboys here in the USA.

He said, at one point, that after WW2 Germany and Europe had "learned" the value of peace, diplomacy, etc. And this is why they stayed out of Iraq and Afghanistan, he claimed.

Oh really?

If a man who's spent several days repeatedly punching himself in the face finally decides - after two black eyes, a broken nose, half his teeth knocked out and serial concussions - to stop punching himself in the face, is this a moral epiphany or merely a belated survival response?

The post-WW2 Great European Age of Peace came about because the Europeans, after wrecking themselves in back-to-back wars, were too depleted and exhausted to have another go.

There is also the little fact that two extra-European powers jumped in and pretty much took over the entire European polity - the USA from the West, the USSR from the East. From 1945 to 1989, Europe's destiny was wholly in the hands of foreign masters in Washington and Moscow. During this period it didn't really matter WHAT the Europeans wanted.

Europe spent this period serving three uses:

1) Strategic buffer between the American and Soviet Empires (with Germany itself being the actual 50 yard line).

2) Junior member in economic and financial partnerships with whichever foreign master a particular European country had fallen under.

3) Social laboratory for the American and Soviet systems to rebuild in their respective images, to see which side could create a more prosperous postwar order - no points for guessing who won that contest!

Needless to say, this extended period of servitude has mind-warped Europe something fierce, to the point where many of them are consumed with opposing America before all other goals - including winning the war on Jihadism.

Some Europeans remind me of teenagers who want to take dangerous and stupid risks just because Daddy America says not to.

They also love the false bravado that comes from ostentatiously "standing up" to the USA, safe in the knowledge that we're not going to do anything more than roll our collective eyes regardless of what they say about our President or country.

So you'll understand if I am less than dazzled by their moral enlightenment.

Oh, one more thing about "peace":

Choosing nonviolence is only morally admirable if violence is a viable option to begin with. If the smallest, weakest kid on the playground solemnly declares, "I will not be a bully," it doesn't mean a goddamned thing.

Show me the big, strong kid who sincerely thinks it's wrong to push his classmates around - NOW we've got something meaningful; now we're talking about conscience and ethics.

The simple fact is that most of the nations who loudly and pompously proclaimed they wouldn't help the USA in knocking over Saddam or the Taliban were in no position to help anyway, even had they been spoiling for a fight.

When countries whose ability to project serious military force is pretty much ZERO tell us they won't fight with us - does that really mean anything, or is it just empty posturing?

Yeah, the Europeans "choose" not to fight - just like sheep "choose" not to attack wolves.

"Peace," my ass. On the other hand, why let them have all the fun?

Here's my contribution to "Peace": I will not walk into the nearest biker bar and loudly announce, "Only assholes ride motorcycles!"

Not because I'd be beaten into the ground, but... you know, for "Peace."

Bewarez

On the matter of piracy and computer games...

First, I'm not talking about "Abandonware" - old, off-the-market games that you couldn't buy even if you wanted to. As much as it pains me to say this, being a little naughty is just about the only way to get a chance to play some of the all-time classic titles. (Consider the case of the classic economics/strategy game M.U.L.E., which has been unavailable for 20 years. Even if you found and bought a copy, you'd need an Atari 800 or C64 to run it. And after 20 years, how many of those single-sided floppy disks have survived? How many COULD survive?)

I am talking about the mass piracy of current retail products. Let's assume that BananaSoft's hot new game, BananaQuest, arrives in stores for $50. If some h4xxor clown rips the disc image and FTPs/P2Ps it to his favorite warez site, where 500 of his closest friends grab a copy, that's a loss of twenty-five thousand dollars. Now, the 733t d00dz reading this will immediately object that I am assuming everyone who grabbed a pirate copy would have been willing to buy the game at retail, rather than giving it a pass altogether. Therefore, they'd respond, we can't know how much money BananaSoft has actually lost.

To quote the ancient Greek philosopher Plato, "Bullshit." While we cannot know for certain exactly how many pirates would become purchasers if they had to bite the bullet and lay down $50, I feel safe in saying most would. Computer gamers, after all, want to play computer games, and a signature title like BananaQuest would be irresistible to the solid majority of them. This argument - "I pirated the game so I wouldn't have to buy it, but if I had to buy it I might not have wanted it after all, but I did want it so I pirated it" - is just nonsense. It's a facile rationalization to smokescreen theft.

Theft? Yes, theft. The folks at BananaSoft put time and effort into BananaQuest and recover their expenses by charging people for the finished work. If you take their work without compensating them, you have stolen their time and effort. It's that simple; a child could understand it.

There is also the detrimental effect on the larger World of Gaming... I was chatting a few months back with several people on a Direct Connect hub who professed to be ardent Sega Dreamcast fans. At one point, I remarked that while the console's death was a shame, at least the firesale prices at the end let us all stock up on titles, extra controllers, etc.

They were shocked to hear that I actually bought games at any price. Their entire Dreamcast libraries were warezed - every last game. They had dozens of pirated games each, which they offered to share with me. Now, I know Sega made it laughably easy to pirate DC discs, but even so... When I commented that massive piracy might have caused - or at least hastened - the demise of the DC, they became furiously defensive and asked me if I was "some kind of cop."

The next day, the Dreamcast pirates got the mods of that Direct Connect hub to banish me. According to the email I got, I am a "fascist" for opposing no-limit, no-questions-asked warezing. Sieg Heil, baby.

The following is a list of defenses of piracy raised in that discussion, and my responses to them.

Piracy is not like theft, because you're not stealing a physical object. It's just computer code.


Yes. The special, non-corporeal nature of software makes duplication and distribution very easy, since there are no physical materials involved (as with shoplifting books) and there is no quality or functionality loss over successive generations of copies (as with copying video tapes). The warez folks seem to believe that, since piracy is so damned easy and carries so little individual risk, it occupies a different ethical category than walking out of a music store with the latest Moby CD in your pocket. This is a self-serving lie. It's the Might Makes Right mentality; in this case, the idea that people who are technically clever enough to pirate games are entitled, as some kind of class privilege, to do it.

Okay, let's get this clear: you do not "own" the intellectual property in your computer games, songs, books, videos, DVDs and whatnot unless you hold the copyright. What you have is a limited user license which, under copyright law, allows you to do certain things and not do other things. Generally speaking, personal use and archival copies are okay but mass copying and distribution are not.

Many software stores don't allow returns - I have to pirate stuff so I can "preview" it and not get burned.


The harsh No Returns or Same Title Only policies of most stores are a direct response to rampant piracy. Merchants were forced to protect themselves against people buying a game, copying it, and then returning it for a refund. Some stores still accept returns because they prefer to keep people happy and will eat the 'hit' from the inevitable piracy. That's up to the store.

This is where the nature of digital media comes into play. If I buy a book on Friday and return it on Monday for a refund, the bookseller knows there is very little chance that I spent those three days making dozens of copies of it - copies which in turn could be used to make dozens, if not hundreds, more, all of them perfect reproductions of the source.

With analog audio/video sources, it's a little easier - I don't need a printing press and truckloads of paper, as with duplicating books - but it still takes some doing. I'd need certain equipment and know-how to mass-replicate them, and even then there is a noticeable reduction in quality between generations of audio and video tape. There is also the time element: unless you have access to high-speed duplication equipment, copies take a lot of time. The gradual loss of quality and the "hassle" of it all serve as brakes upon mass duplication.

With digital media, it's another story. Huge numbers of copies can be made and distributed very quickly and the fidelity to the source material is PERFECT. Little kids can do it. And that's why merchants have lowered the boom.
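
A quick toy model makes the contrast plain. This is my own illustration, not anything from the industry - the 5% per-generation loss figure is invented purely for the example:

    # Toy model: analog copies lose a slice of quality with every generation;
    # digital copies are bit-for-bit identical at any depth.
    def analog_quality(generation, loss_per_copy=0.05):
        # Multiplicative decay: each copy keeps 95% of its parent's quality.
        return (1.0 - loss_per_copy) ** generation

    def digital_quality(generation):
        # A copy of a copy of a copy is still, exactly, the original.
        return 1.0

    for g in (1, 3, 10):
        print(f"generation {g}: analog {analog_quality(g):.0%}, digital {digital_quality(g):.0%}")

Ten generations of tape leaves you at roughly 60% of the source; ten generations of bits loses nothing at all. That's the brake digital removed.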

So what about customers who get stuck with bad games? Well, when a game is "bad" in the sense of being incomplete and barely functional as a software product (like the infamous Ultima 9), then Consumer Rights do come into play. A similar case can be made for games which are advertised and sold on features they don't actually contain (like the infamous Outpost).

If a game is simply "bad" in the sense of boring, clichéd and terrible gameplay, then... well, that's life. Be a more careful shopper. Besides, the release of any major game is accompanied by a withering barrage of publicity and critical scrutiny from the print and online gaming press. Following the release of Morrowind, for example, I counted three dozen detailed reviews on various gaming sites. Within a month of its release, there was more Morrowind information and evaluation than you could possibly read.

Another thing to consider: the better a game is, the more people want to play it, and the more inclined they will be to pirate it (assuming the "piracy is no big deal" attitude prevails). One of the most heavily pirated games in recent memory has been Warcraft 3, a quality title that's certainly worth the $60 Blizzard charges, put out by a company that is as "pro-gamer" as you can get.

Blizzard is a "good" company making a "good" game. Has this made a difference? No. Warcraft 3 was boosted right and left, which totally gives the lie to the notion that warezing is some kind of consumerist guerrilla action.

Nor is this a new problem; piracy is as old as computer gaming. CD burners have been around and affordable for several years now. Before that, pirates relied on the few who had burners to rip the discs and split them into a series of compressed floppy-disk-sized images, and the titles were moved around in that form. Prior to the availability of broadband Internet access, private BBSs and face-to-face "swap meets" were used to move the goods.

Games on floppy disks were widely pirated - indeed, one of the most notorious pirate operations on the East Coast of the USA - Pirate's Cove, on Long Island - specialized in floppy disk games for the Atari 400/800 and Apple II/II+. Even then, piracy was not a matter of "customer revenge" against bad games or retailers. The most pirated games were the best ones - M.U.L.E., Starflight, Archon, the Ultimas, etc.

Piracy is not that widespread / that big of a deal.


It's rather hard to quantify something which is illegal and covert by nature. Still, given the prevalence of CD-burners, broadband access for moving large files and ISOs, and easy-as-pie P2P setups like Direct Connect, and given people's proven willingness to get things without paying for them - especially if there is virtually no risk of getting caught - I think it's safe to say that song, movie, TV show and game piracy is widespread.

An important thing to consider is how EASY pirating has become. The whole scene used to be quite "clubby," but with the advent of broadband, burners, etc. it has exploded. Now anyone with a cable modem or DSL line who knows how to set up Direct Connect can run amok in Pirateland like a kid in a candy store. The essential nature of the activity has not changed. What has changed is the ease and scope of it.

Napster, the World HQ of music theft, was a good example. Napster didn't become popular because a few audiophiles were trading handfuls of obscure songs within their little hobby circles. The appeal of Napster was that multitudes of people were "sharing" mountains of MP3s - most certainly including current, commercially-available music.

As for the cost of game piracy, let's do some math. Suppose I run a peer-to-peer service - like a Direct Connect hub - and I "share" Warcraft 3 with my fellow gamers. Word gets out, and one person a day downloads it over a three-month period, until I need the drive space and remove the files. (These are very conservative figures for a hot new game, by the way.)

Blizzard is asking $60 for Warcraft 3. $60 x 31 downloads/month x 3 months = $5580. Five and a half grand is not a trivial amount.
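
And for the skeptics from a few paragraphs back who insist that not every downloader is a lost sale: fine - make the conversion rate a knob and turn it yourself. Here's a sketch of my own; the 50% figure is an assumption for illustration, not data from anyone:

    # Lost-revenue estimate for the Warcraft 3 example above.
    # conversion_rate = fraction of downloaders who would have bought at retail.
    def estimated_loss(price, downloads_per_month, months, conversion_rate=1.0):
        return price * downloads_per_month * months * conversion_rate

    print(estimated_loss(60, 31, 3))       # 5580.0 - every download a lost sale
    print(estimated_loss(60, 31, 3, 0.5))  # 2790.0 - even if only half would have paid

Even at a stingy 50% conversion, one hub sharing one game costs nearly three grand.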

Games are too expensive.


If Blizzard wants to charge $100 for Diablo 3, that's up to them. If I don't like it, I don't have to buy it. If I say, "Well, they need me, Champion of Fair Pricing, to teach them a lesson!" and go around pirating it, I am nothing but a thief. I've been a computer gamer since the Apple ][+, and I can tell you that game prices have always hovered in the $30-50 range, and often higher. $50 for a game that can give you dozens of hours of entertainment is not highway robbery. Am I supposed to feel sympathy for those poor warez kids, who just happen to have powerful computers, high-speed CD-R drives, broadband Internet access, various video game consoles and the leisure time to consume (and pirate) endless amounts of anime, music, video games, Simpsons episodes and whatnot? We're not talking about people stealing food and medicine. Enough said.

Movie Hate: The Incredible Hulk

... or, for me, the Inflatable Bulk.

A quick list.

I hated the actor who played Bruce Banner.

The Bulk looked like he'd wandered in from a video game.

The Gamma Dogs looked even faker than that and were a stupid idea.

Too long. Way too long.

The final fight between the Bulk and his transforming father-monster was shot so darkly that you could barely make out what was going on. People in the theater were looking at each other and whispering, "Is something wrong with the film?" (Yes, I know it was a night scene... still, what a directorial cock-up. Pure error.)

Where did this scheming bio-tech corporate yuppie come from - you know, the one wanting to "patent" the Bulk's genetics? Were the writers trying to "soften" General Ross away from being the Bulk's fanatical foe by showing someone even worse? It didn't work. It sucked.

What kind of asstacular stupidity was that "final visit" scene between the Banners? OK, the film sets up that Dad wants one final visit with his son. Fine. But would the dozens of soldiers and cops just stand there while the elder Banner rants about power and destruction and overthrowing the government? Would they just let him scream abuse at his son, when Bruce's rage is the very thing they're trying to prevent?

Even in a normal prison with a normal inmate, that kind of thing would result in large corrections officers shouting "VISIT'S OVER!" and hustling you out of the room. This is beyond bad writing. This is what critic Roger Ebert called an Idiot Plot (wherein people have to be idiots to advance the plot).

Why I Am Not Religious

I find all religious systems objectionable. There's no convincing proof for any of them, and the closer you examine their historical roots, the more obvious their human concoction becomes.

I suspect we humans made all the religions up because life has three unavoidable aspects that just, well... suck. These three things are the S.U.M. of human unhappiness:

Suffering

Into every life a little shit must fall. Maybe you'll get a little, maybe a lot. Some people drown in it. Whatever the case, it will happen. You will suffer.

Uncertainty

You don't know what's going to hit you next. You never will, and there's not a thing you can do about it. Life is full of surprises, they say, and most surprises are bad.

Mortality

Even if you do manage a relatively pain-free life, with a minimum of bad surprises, in the end you get the same parting gift as the rest of us: You get to die. In fact, there's a high likelihood that your death will be preceded by years of physical and mental decay.

Now, keep the S.U.M. in mind and check out the world's religions, past and present. Notice how they seem tailor-made to counter the despair of the three things listed above? This is no accident.

Suffering? Sure, you'll suffer - but it's okay. There's a reason. And Something knows the reason and keeps a handle on it.

Uncertainty? Yes, there is uncertainty - but it's okay, because Something knows what's going on, and what is to come.

Mortality? Well, you are mortal - but it's okay, because Something will not let you just cease to be. Oh no, there's more to come...

We are unhappy and we want to be happy. So we invent ways to make ourselves happy and soothe our troubled minds. That's it, folks. From a caveman invoking the spirit of the saber-toothed tiger for a good hunt, to the most intellectually sophisticated modern theologian taking existential comfort from an abstracted Creator, that's all religion is: people telling themselves, "It's not as bad as it seems."

Naming Your Sons

Boys should have manly names that lend themselves easily to short nicknames, like "Mike," "Bill," "Joe" and "Alex." Avoid showoff literary or mythological references - "Heathcliff," "Castor," "Roland," etc. Just because you went into decades of debt in student loans to sit around for four years at a pricey college and read books is no reason to take it out on your child.

Don't name a boy "Sean" or "Connor" if the most Celtic thing in his life is Irish Spring.

Olde Englishe names are also stupid. I mean, "Percival?" People who do this should be hitte wyth plankes.

If you're an 'ethnic' White, use English spellings. "Henry," not "Heinrich." "Steven," not "Etienne." "Charles," not "Karl."

If you're Black and want to Keep It Real, at least talk to someone who actually speaks an African language before you name your son after the Swahili word for fencepost. Also, avoid deliberately misspelling the name - Daymon, Danyel etc. - because you think it makes him "special." It doesn't. It makes him look like his parents can't spell.

If you do want your son to feel special, do not name him exactly after his father and then "Jr" or Roman-Numeral him. A 10-year-old kid named "Richard Theodore Warburton IV" might as well have HIT ME branded on his forehead as far as his peers are concerned.

As for religious names - all Old Testament names are grandfathered in, as are Saints' names if you're Catholic, but explicitly religious names should be otherwise avoided unless you're a devotee of the religion and the religion endorses such things. Whatever the case, anyone naming his son "Siddhartha Williams" should hear the sound of one hand slapping him.

Retro Lament

The recent purchase of a used Star Control 2 game has got me in Retro Mode. By "retro," I don't mean using a hotrod gaming PC powerful enough to control a Space Shuttle launch to play Choplifter. I mean "Retro" in the sense of quality enduring through time.

It really frosts my flakes to consider how many of the truly great games of yesteryear - real classics like M.U.L.E., Star Control 2, Archon, Starflight, the better Ultimas, etc. - have vanished into the mists for most gamers.

Vanished is certainly the term. We're not talking about the dusty bottom shelf of the local EB or overlooked stacks of $10 jewel cases. Many of the classics are practically unobtainable.

Take M.U.L.E. Published in 1983 for the Atari 800 (and later ported to the Commodore 64), this excellent game never appeared on either the PC or the Mac. I'd wager that only 2 or 3 in 100 current gamers have even heard of it. We're not talking about an average title, either. M.U.L.E. is one of the best computer games ever. Yet, at this point, it might as well have never existed.

Even those old enough (or retro-devoted enough) to know about M.U.L.E. cannot legally play it unless they have a working Atari 8-bit computer or a Commodore 64. Sure, there are some of those still knocking about. But look, I don't care how elite a vintage computer collector you are; these things were not built to last decades.

There is also the fragility of the magnetic game medium itself to consider. The day is coming - and it's not as far off as we would like to think - when a working Atari 800 will be as rare as a Duesenberg. At that point, those who still possess and maintain them will be either outright professional collectors or gaming's version of manuscript-scribbling monks. Either way, they're not going to let you anywhere near their precious antiques.

Well then, we think, thank goodness for emulators. Problem solved, right? I beg to differ. I don't want to start a nasty (and endless) fight over the propriety of emulators and abandonware. All I will say is that the whole thing is a very shaky basis - legally and ethically - to stake the preservation of classic games on.

What's more, we have to appreciate how small the emulator audience is. Hardcore PC and console gamers devoting time and money to keeping older games and tech alive, spending hours (and small fortunes) scouring eBay for mint-in-box copies of Suspended - that's fine as far as it goes, but it doesn't go very far. Most gamers are oblivious to all this. For them, when a platform (or an entire generation) crosses the line into 'dead tech,' it's gone.

Aside from making curmudgeonly Ancient Gamers like me feel old, the historical amnesia caused by rapid technological obsolescence has a detrimental effect on game design overall. Even the best games have a very short 'life' - except in the hands of collectors and die-hards - before they are shoved off the stage to make room for the Next Big Thing.

This not only prevents a lot of people from enjoying (or even knowing about) great games like M.U.L.E. but also prevents the formation of a collective, creative memory for the gaming community. For the bulk of current gamers, any titles older than 5 years might as well have never been made. With the classics out of sight and mind, gamers are left with the same old ideas served up again and again because they seem new each time.

I would also argue that this is a factor in gaming's protracted adolescence - more specifically, the fixation on violence, puerile sexuality and "attitude." Having no accessible past, gaming's collective mind is confined to the Perpetual Now. It literally cannot grow up. Perhaps I'm overanalyzing this. But consider: in most creative endeavors, quality and worth are proven over time. With computer games, that which is old is simply... old. Forgotten. The "good" stuff is always what's about to happen, not that which has endured over time and transcended the circumstances of its creation.

What flat-out puzzles me is why more gaming companies don't see what a gold mine of ideas, stories and great gaming hooks resides in these old worthies. How many games have you played recently that were technically and visually dazzling but utterly empty of good writing, interesting characters and that ever-elusive 'fun?'

Why not revisit, say, Starflight? Here's a space exploration game from 1986 that's smart, funny and genuinely involving. A good development team could port this thing in their sleep. All the tricky writing stuff - character design, background story, dialog, surprise ending, you name it - is already there. A splash of new art, an update for DirectX support, and you're done. A classic lives again.

When I win the lottery, I'd like to buy the rights to the golden oldies, assemble some good programmers, and re-release them for Windows and the Mac. I'll bet I could do a good job on Sundog for less than what Romero's Ion Storm spent on office furniture.

I, Customer Auxiliary

The other day I was talking with my merchant friend James Katona of Men at Arms Hobbies in Middle Island, NY - where you all should be shopping - who remarked that I had been coming there for so long, I was more than a mere purchaser of goods.


We knocked that around a bit, and concluded that I was one of his Customer Auxiliaries.


We defined Customer Auxiliaries as old(er), longtime shoppers for whom the store in question is as much a clubhouse as a retail business. They know the stock and product lines inside and out. They're almost always on a first-name basis with the staff, and many are revolving-door or seasonal employees (I myself have worked at certain stores to help out during busy times - always for barter or just to be a pal). They are the go-to folks for organizing game tourneys at the shop and other such things.


Being a CA can be a sweet deal. Owners appreciate the help and publicity - aside from being rabidly loyal customers, CAs are always evangelists for the shops they frequent - and reward them with discounts, set-asides and running "tabs" when they're short of cash (although a good CA does not abuse this consideration).


The downside, if there is one, is that the clubhouse mentality can corrode into clannishness and snobbery, which both annoys the owner and gets in the way of good customer service. This has happened several times at the comic shop I frequent, where I've had to drag people aside for a little chat on proper etiquette when dealing with those newer to the hobby than oneself.


I choose to be a Customer Auxiliary rather than work at these places because I already have a (better-paying) job. But even if employment IS what you seek, consider the CA route. Be a good one and I guarantee you will be noticed.


Fair warning, though: you might be looking at a significant time expenditure, and it's a true labor of love. You can't fake CAdom and you can't insist on it - it just happens.


One more thing... in my almost-30 years of hobbydom, I have never met a female CA. Not once.


Interesting, that.

Tech Support Sucks

It's been my experience that calling tech support is an exercise in futility. There seem to be only a handful of people at any given "support" center who actually know what they're doing, and good luck getting one of them on the phone.

I cannot recall how many times I've been given troubleshooting suggestions that were either totally irrelevant or just plain wrong.

Nothing makes my day like having a computer problem, calling tech support long distance on my phone bill, waiting half an hour - or longer - to talk with someone and then realizing I know more than he does.

What people need to remember is that these support centers are often run by a contracted company - not by Dell, HP or whoever - and that they get paid less for a return call on the same 'incident' than for a new call on a new problem.

Their explicit objective, therefore, is to take as many new calls as possible and dispose of each one as quickly as possible - the old "churn and burn." The last thing they want to do is take the time and effort to seriously and methodically troubleshoot your technical problem.

This is why so much tech support "advice" comes down to R & R (Reformat drives & Reinstall software). They know this will take a long time and force you to get off the phone so they can take (and bill) the next call. Also, neutron bomb measures like R & R probably will solve most problems, so you won't be calling back (even though, if you think about it, you never did learn what was wrong or how to avoid it in the future).

Now, increasingly, we have the problem of tech support being farmed out to non-English-speaking countries. On a recent call, I spoke to Vishnapranyahursadanapol (I think), who seemed like a very nice guy - SEEMED like, I say, because I literally couldn't understand one thing he said.

I have an ear for languages and I'm VERY comfortable with accents, folks. If I can't understand someone, then he just cannot speak clear English. Period.

An Internet Forum Bestiary

Here are some personalities (perhaps "poses" would be a better word) that I am tired of seeing online.

The Rebels

Rebels think themselves wise or good simply because they take a position contrary to whatever the majority of a given group thinks about something, when their thinking is nothing but the photo-negative of conformity. Look for this whenever someone tells you he's a "free thinker," "alternative" or whatever. Remember: Even if you do march to a different drummer, you're still marching.

The Instantly Cool

ICs imagine that coolness (733tn3zz, "cred," etc.) is something that can be purchased or otherwise acquired from some third party, or by aping the beliefs, dress, manner or lifestyle of some subculture. It has apparently never occurred to them that the minute - I mean the very second - you consciously mold yourself into something that is not authentically you in pursuit of approval from others, coolness is gone. A costume is a costume, whether it's a grey flannel suit or the pierced/punk thrift-store look.

The Xenophiles

Xenos admire something in direct proportion to how "different" it is from what they're used to, reflecting very little (or not at all) on whether it's actually any good, or even whether they actually like it. They are, as W. S. Gilbert put it in The Mikado: "The idiot who praises, with enthusiastic tone / All centuries but this, and every country but his own..."

The Internet Nostalgists

INs wax nostalgic endlessly about the "good old days" before the Internet "went capitalist" and "sold out" by allowing commercial activity. To hear these people (old, bitter Unix snobs, mostly) talk about this supposed Golden Online Age, you'd think it was some utopia of Cool, Tech-Savvy People doing Smart and Clever Things - sort of a cross between the Bloomsbury Group and Xerox PARC. Well, I was there, in college in the late '80s, in the pre-Web, pre-spam, pre-public days of the NSF/Internet - which was supposedly so exalted over the hoi polloi and their vulgar wasteland of dial-in BBSes, GEnie, etc. And folks... then as now, the 'Net community was a very mixed bag. All the personal and social pathologies were already evident.

The Politically Incorrect

PIs are a subset of Rebels; people who think that hypersensitive "PC" sanctimony on the part of others authorizes or even requires them to talk like actual racists, misogynists, gay bashers, etc. (I'm not denying the blight of "PC" on American culture; it's all too real. But we're not going to get past it by confirming the worst fears of the "PC" more-sensitive-than-thou types, even if we are kidding as we do it.) "Look! I'm being gratuitously offensive! People are, like, UPSET with me! I'm a RADICAL!"

The Pseudo-Illuminati

The Pseudos are obsessed with having the Inside Track on something - Hell, anything - because it makes them feel "special." This manifests itself in two principal ways:

1) Obscurantism: The more marginal and "weird" something is, the better. The more people who know about it, the more it should be shunned. (Reference the earlier Xenophiles, as well as the frequently-observed phenomenon of people hating any music band that gains an appreciable audience.)

2) Arrogant Exclusion: How many of us have run into these "If you don't know, I'm not going to tell you" types? Nothing makes them happier than to litter their conversations with allusions to books, movies or songs that few people have heard of. Yet if you ask, "That sounds interesting, where can I check that out?" they either go stone silent or change the subject.

The reason, of course, is the same one behind calling them Pseudo-Illuminati; they're not really In The Know at all. They've never made any original aesthetic or intellectual discoveries... they got it all second-hand. Someone else told them about this book, that movie or some odd style of music, which means - GASP! - that they, too, were once Out Of The Loop.

Honestly, I don't know who they think they're fooling. We all learn about pretty much everything second-hand. Take a song, for example - unless you wrote the damned thing yourself, you heard about it from others who got there first. "No biggie," as we used to say. Ah, but for the Pseudos, it is a biggie because it reveals them to be just like the rest of us rather than Ascended Masters of Hipness. They would rather die than admit there was a point in their lives - perhaps quite recent - when they had no idea who Warren Zevon was, had never seen a David Lynch movie and thought H. R. Giger made wristwatches.

At the Movies

At what point, exactly, did people become incapable of distinguishing public movie theaters from their own living rooms?

The trash. The cell phones and pagers. The non-stop talking. And just try to get the theater staff to do something about it. I feel compelled to point out that as a paying theater customer, I should not have to police the goddamned place.

This sorry state of affairs has two causes:

1) Theater owners simply don't care if you have a lousy time, as long as no one makes any noise about it. Once they've got your $9, they couldn't care less about your movie-watching experience. Multiplex theaters are little better than human cattle yards; get 'em lined up, sell 'em snacks, stuff 'em in the seats for 90 minutes and then move 'em out to make space for the next herd.

2) People are getting worse. There is almost no such thing as a well-mannered group anymore. I'm not saying we all have to love each other, but apparently the idea that we should not harass or interfere with each other has also gone by the boards.

Whenever I confront people in theaters and ask them to keep it quiet or control their children - I'm always civil - they respond with undisguised shock; the very IDEA of a stranger asking them to control their actions or exercise any level of awareness and consideration of others in the theater leaves them sputtering with self-righteous anger.

Personally, I blame the 1960s. Why? Well, the 60s established several toxic ideas, still with us today:

There is something unhealthy and oppressive about self-restraint and public manners.

Young people are wiser, more ethical and just plain BETTER than older people.

Being "old fashioned" or a "square" is worse than being the most antisocial libertine.

Now, some of these ideas are centuries old - Rousseau's "l'homme naturel," etc. - but it was in the 60s that they took control of American culture and the public mind, and we are seeing the results... at the movies.

My Computers (A Work In Progress)

2003 - Homebrew AthlonXP @ 2 GHz

2002 - Homebrew AthlonXP @ 1.0 GHz

1998 - Apple iMac @ 233 MHz

1997 - Gateway Pentium 2 @ 400 MHz

1996 - Homebrew Pentium 1 @ I forget

1994 - Homebrew 486 @ 80 MHz

1992 - FastMicro 486 @ 33 MHz

1987 - Apple IIgs

1986 - Apple IIc

1983 - Apple IIe

1980 - Atari 800


Notes:

1. Technically, the Apple IIgs was my college roommate's machine. I think it was, byte for the buck, the finest computer Apple had produced up to that time.


2. The Homebrews were constantly mutating, which is why I only gave the CPU speed. (Although I know how to do it, I am NOT comfortable swapping CPUs. When the time comes, I get a whole new mobo+CPU preinstalled and move the drives.)


3. I loathed the IBM PC series. While I recognized their importance to the "computer biz," they were crude, ugly and badly designed. Among my geek circle, it was a point of honor NOT to have an IBM PC.


4. I remained an Apple fan up to the IIgs. The Macs impressed me, but they were so damned expensive! When I went looking for a new computer late in 1991, I had the choice of spending $1800 on the FastMicro (i486 / 8 MB RAM / 200 MB HDD) or something like four grand (!) for a Mac of equivalent power and capacity. I knew it was all over. Like a lot of people, I had simply been priced out of Apple's market. So I bought my first PC. Later, though, I bought an iMac out of pure curiosity, and enjoyed it.

Why Windows Won

The other day, a couple of Windozing friends asked me, in effect, "If the Mac was/is so great, what happened? Why did Windows win?"

Well...

Windows only really began to matter when version 3.0 was released in 1990. Before that, it was important, but hardly a standard. With 3.0, people finally had a good, capable GUI for the DOS + Intel platform. Combined with ever faster and cheaper Intel processors and the Clone Wars (which began when Compaq slashed prices across the board to prevent upstarts like Gateway 2000 from doing to them what Compaq itself had done to IBM), users could afford "byte for the buck" computing power that would have been unbelievable only a few years before.

After IBM shot itself in the head with the PS/2 and OS/2 debacle, the "leadership" of the PC industry fell to the Clonemakers, who then went at each other tooth and claw, creating a frenzy of price/performance competition that remains with us today.

I've been using Windows since version 2.0. Nobody I know has ever actually considered it superior to the Amiga, Mac, Unix, NeXTStep, or what have you...

I think Windows triumphed for these reasons:

1) The only REAL competition, Apple (sorry, Amiga fans), took itself out of the running by sticking to a controlled, proprietary architecture and moving the Macs "upscale" when Intel computers were becoming more affordable by the day.

2) MS-DOS. Yes, I know. I didn't like it either. But for vast numbers of people, Microsoft's command-line mediocrity WAS computing (Microsoft didn't even invent it, but that's another story). Since Windows worked "on top of" DOS, you could have a GUI and not have to buy a new computer - hell, you could even keep your old DOS software and run it outside of the Windows environment.

3) OS/2. The worst fiasco in the history of software, it took IBM off the playing field for years. Read Paul Carroll's book BIG BLUES to learn the whole sorry tale. So, let's look at Microsoft in the early '90s. Their two chief GUI competitors are Apple and IBM. Apple is playing a whole different game, and IBM has torpedoed its own ship. MS-DOS, their existing OS, runs the vast majority of the world's personal computers. And their GUI, Windows, has - at version 3 - finally found its legs.

Windows inched toward the Macintosh GUI for years; with 3.0, the resemblance became impossible to miss. Certainly Apple thought so, because they promptly sued Microsoft over it - and lost, after years of drawn-out courtroom wrangling. Bill Gates (and his counterparts at Intel) placed their bets on the idea that if your competition makes a gorgeous machine that costs a fortune, and you make an adequate machine that multitudes of people can afford, you will win.

People were willing to pay a premium for the Mac's capabilities in the days of DOS and early Windows, when it was the only good GUI show in town. But as Windows improved and prices dropped to the point where a capable Wintel computer cost less than HALF of what a medium-level Mac did, it became harder and harder to justify spending the price of a good used car on a Macintosh.

In the office environment, the situation was even worse. With the exception of desktop publishing, businesses had no need for a GUI, and they sure as hell weren't going to pay what Apple was asking. On top of that came the ascension of Novell NetWare to the LAN OS throne - with a command-line interface straight out of DOS. The Macintosh, and GUIs in general, were irrelevant to NetWare. You could configure Macs for NetWare's IPX protocol and attach them to NetWare LANs, if you wanted to. But that's the crux of it. Why WOULD you have wanted to?

Apart from aesthetics, why would the average 1990 business user buy a Mac when they could get a Wintel (or even DOS) box that did what they needed a computer to do for a fraction of the Mac's price, ran more software, was infinitely configurable and, on top of that, was by then firmly established as the industry standard?

Some argue that, were it not for the killer Mac+PageMaker+LaserWriter combination opening a whole new personal computing arena (desktop publishing), Apple would have been ground into hamburger in the late '80s. I wouldn't go that far, but it's sobering to remember that the Mac, originally, was not a sales hit. Only with DTP - the Mac's killer app, so to speak - did Macs really begin to move.