Thursday, February 21, 2008

IBM, the PS/2 and Why PCs are as they are...

Herewith an IM exchange from a few years back in which I opine as to when, and how, IBM lost control of PC design and standards. "Tristram" is Darren, a Macolyte par excellence.

Myself: Greetings.

Tristram: How goes it?

Myself: I am having new iMac hallucinations.

Tristram: As am I.. however, my need to keep my finances in some semblance of order has quelled the uprising. I DO think the thing will sell at a solid rate, however.

Myself: Oh hell yes.

Myself: And I am angrier than ever at the so-called computer makers on the PC side. Looking at the iLamp, I mean, iMac, one realizes how little design and engineering (and thought of any kind) goes into PCs. They really have become mere office supplies, like staplers and fax machines and the like.

Tristram: Pretty damn close thereto, that's true. Function far outstrips form. But for so many PC users, that's just fine.

Myself: All too true. If ya think about it, we really shouldn't speak of "Dell computers" or "Compaq computers" apart from servers or high-end workstations (which actually are designed from the ground up). Apart from those specialty machines, most "brand" PCs are nothing of the kind. They are wholly interchangeable. Someone else makes the mobo, the chip, the drives, etc. Dell or whoever just assembles them (badly).

Tristram: Well, Dell is actually said to be fairly competent. I don't know about the individual MAN, Dell, but they say his company puts together a fairly passable unit.

Myself: Botheration. However, I think Steve is onto something. Apple has to be, well, different. They can't win the "office equipment" battle, because of issues of cost and scale that are - at least for now - insurmountable.

Myself: When I show my PC pals the new iMac pictures, they openly smile. "That's so....great..." is the usual response. And these are longtime, cynical computer guys.

Tristram: Wow. That's amazing to hear... and what did they think of the original iMac three years ago, or the Cube a year and a half ago, just out of curiosity?

Myself: They disliked the lack of a floppy drive and an onboard SCSI port - which even I had to admit was a very fair criticism. A lot of these guys had expensive SCSI devices, and how much would it have cost, really, to slap a 25-pin SCSI-2 connector on the side, next to the USB ports? They loved the iMac's design, though.

Tristram: Well, that is comforting. So they are able to discern what is good and what is chaff.

Myself: As for the Cube, they raved about the design but thought it was pricey (esp sans monitor) and wondered about its market prospects. One said, "You can't have design for design's sake. The Cube should win all sorts of awards... but will it SELL?"

Tristram: But in neither case did they scoff or laugh at the bold designs.

Myself: No. They, like me, love design. We're all ID disciples. The highest achievement of any tool is transparency; do the job without getting in the way of the job.

Tristram: I agree.. thus the appeal of the entire Mac solution.

Myself: Agreed. Although, of course, versatility imposes complication. No way 'round that. But what I like about Apple is that they keep plugging.

Tristram: Yes.. that's definitely the case.. as these machines become more capable, they become increasingly complex and unstable.

Tristram: Until the advent of the "Modern OSes" that we now have, where a best-of-all-worlds situation can (theoretically) exist. We'll see.

Myself: Right. PCs are horrible in this regard, although WinXP is a big step up. But we must remember that PC development was marketing driven once IBM lost any kind of control (circa 1986-87).

Myself: From that point, security and stability didn't matter. Make 'em cheap, build a lot of them, and sell as many as possible. Oh - and maintain backward compatibility at all costs, as a selling point so folks won't have to repurchase their software libraries when they upgrade. Marketing, marketing, marketing.

Tristram: Right. To that end, meet the parallel port.

Myself: The tech-heads at the OEMs knew then, and know now, how shoddy most of the design was, and how screwy the software was. But their hands were tied. They loudly complained, but they weren't in charge.

Myself: I consider the 86-87 period a watershed in PC tech and system design - not the intro of the AT platform in '84, or the 386, or even the price collapse engineered by Compaq in 1991-92. I have a strange reason, but I think it holds up.

Tristram: Do tell. Most less-informed tech-heads would put the focus on a later time, perhaps, or earlier, I would think. But I'm sure you have a cogent rationale behind this.

Myself: Ok, here it is: in 1986, IBM realized they had utterly lost control of the PC. Dirt-cheap, 100% PC-compatible mobos were arriving from the East (Taiwan, esp), Intel would sell the chips to anyone, and - of course - Microsoft couldn't license DOS fast enough.

Myself: Because IBM had used off-the-shelf parts to make the PC (for reasons of cost and speed of design back when they first rolled it out), they had no control at all over the hardware. The only real "IBM" part was the BIOS, which Compaq cracked -followed by many others. So, what to do?

Myself: IBM's answer was: when you're losing the game, change the rules.

Tristram: Most Kirk-like of them.

Myself: (Trivia aside: IBM's cost per unit for the original PCs was VERY low. They actually ran a smooth, lean operation as far as PCs went. The problem was the enormous, and I mean enormous, overhead of the rest of IBM. But IBM could - and did - build the actual PCs as cheaply as Compaq or anyone else did.)

Myself: Anyway... IBM decided to "restart" the PC market with a new series of machines. These were IBM all the way, designed in-house and proprietary (and patented) from snout to tail. No off-the-shelf serial ports or third-party chips here.

Myself: They secured exclusive licenses from Intel and fixed the machines so they would only run IBM's PC-DOS, not the generic Microsoft MS-DOS. And to shoot the aftermarket card makers through the head, IBM redesigned the mobo and introduced the 32-bit MicroChannel architecture.

Myself: IBM, in other words, was going out of its way to make a computer that was NOT compatible with the original PC/XT/AT series.

Tristram: Right. Interesting. A whole new beast that ran their OS well... Mac-like in the sense that they controlled the complete product.

Myself: Exactly. Take THAT, clone-makers! IBM also planned to roll out OS/2 as the preloaded OS - but most of them wound up running PC-DOS because OS/2 wouldn't be ready for several more years. (The OS/2 fiasco deserves its own conversation...)

Myself: So, they called these new machines Personal System/2 (PS/2) and rolled them out in 1987 with massive publicity. They even offered a "cloning license": for a steep price, you could start an OEM company and build specialty PS/2s (license revocable at will by IBM).

Myself: Some of the PS/2s were marvelous. The PS/2 Model 80, as I recall, could be disassembled like Lego bricks for service. The PS/2s introduced the 3.5" diskette, VGA graphics, the now-familiar PS/2-style mouse and keyboard connectors, and a host of other things. There was some solid design behind these things.

Tristram: I remember the PS/2 concept, but I didn't know what was behind it all.

Myself: But they were expensive and proprietary, and the performance results were mixed. Had the things been barnburners, the computer world might be very different. But they weren't. Your average Compaq DeskPro 386/20 could match them, cost a third less, and was industry standard for expansion boards and drives. And that was the killer.

Myself: For the first time in PC-dom, the term "industry standard" not only did not refer to what IBM was doing - it was opposed to what IBM was doing. IBM had actually "Apple'd" itself into opposing the Intel/AT mobo/MS-DOS platform juggernaut; they were knocked flat.

Tristram: Hmm. Interesting in the extreme. Amazing that these culled-together machines, these bastardized Frankenstein's monsters of the computing world, could actually DO anything with something approaching, and even exceeding the aplomb of these finely-tuned machines.

Myself: The kiss of death came when it was discovered that the PS/2 series had trouble running versions of Lotus 1-2-3 and Flight Simulator (the "reference" programs used to gauge PC compatibility) because of video issues.

Tristram: Or Apple's equally-finely-tuned machines.

Myself: Yup. And the PS/2's fate was sealed. A number of PC-clone makers formed the so-called Gang of Nine and publicly swore they would have nothing to do with the PS/2 licensing, MicroChannel or any of it. Shortly afterward, they created a new 32-bit - but backward-compatible - expansion slot and bus design called "EISA."

Myself: IBM just fell apart after that, and would not recover until the mid-90s. A fair number of PS/2s were sold to the government and large businesses (no surprise there, eh?), but as far as PC-land went, IBM became irrelevant.

Tristram: Hmph. Well... that's fascinating stuff. I suppose the combined might of a thousand companies pursuing performance and compatibility from a hundred different angles was enough to overcome the singular excellence of IBM.

Myself: Right. Exactly. And that's the problem. With IBM off the throne, leadership passed to... nobody. The only 'leadership' was a committee of hundreds (the cloners, Intel and Microsoft, and the 3rd party software and hardware makers) whose only possible collective decision was this:
    Don't have a repeat of the PS2. Don't take chances. No radical design advances. Maintain compatibility at all costs.

Myself: And we've been there, more or less, ever since. Thus endeth the story. Hope it wasn't too boring.

Tristram: Not at all. Very, very interesting. Though the end result is pretty boring. Thankfully, software design hasn't been quite so stagnant through it all.

Myself: No, thank God for software. And, to be fair, there is something to be said for enduring standards. Alas, I don't see any way out of it. The PC market is so massive that advances are slow, grudging and troublesome.

Tristram: Right. The question becomes - when will the industry MANDATE a change? When will the needs of the customer dictate that something needs to change?

Myself: The "cool" stuff that happens seems to occur only when the expanding PC market bumps into something that already exists. Internet, digital video and audio... these things existed "outside" the PC before you could do them ON a PC. Which is why I think Apple has got it right. Computers are no longer some enclosed world of apps and games and printers and the like... They have to "mesh" with consumer electronics and other communication technologies.

Myself: And the thing about Apple is, they seem to really care about "meshing" properly. At the recent Keynote, Jobs mentioned what a pain in the ass digital photos can be, with different apps to capture, edit and organize photos.

Tristram: Right. Such is the case in many fields of digital lifestyling.

Myself: This is so true. Everyone knows it. But the PC folks won't move on it because - God forbid - that would mean presenting a new standard for integrating these tasks and taking a risk. The PC attitude is one of resignation. "Yeah, it's a pain, but it works, more or less." I'm guilty of this myself.

Myself: The Steve said another interesting thing: we would find it absurd to read, but not want to (or be able to) write. Yet in this video age, we are so much all spectators, but not creators. Dammit, cameras and good editing software are essential to our time. Why NOT have a zillion people making their own movies? Lots of folks have stories to tell. Okay, most of it will be crap - but so what? They’re just electrons. Erase it and try again.

Tristram: So true. The key, however, becomes informing the masses that this is something that can be done, should be done, and is now available. And to let them know WHERE it's at, before the PC world stumbles into something approaching parity.

Myself: Right. And call me an elitist, but quality costs money. Macs are, on balance, pricier than PCs because of market share, economies of scale, etc. So be it. But you still get one HELL of a computer for the money, and the excellence of design and ease of use DOES make a difference. And at $1299, the new entry-level iMac is a bargain.

Tristram: Longevity counts, as well.

Myself: Yes, that, too. Buying a Mac is like buying a car. Three years on, it will still be there.

Myself: Speaking of: after the "Megahertz Myth," Apple needs to take head-on the "upgrade" myth. Most people - the solid majority - should NOT have to buy a new computer every 18 mos or two years. That is, pardon my Swahili, bullshit.

Myself: Yes, some will always need the "latest and greatest." But for most folks, the iMac you buy now should serve you faithfully for years. This whole despicable scheme of building (and buying) crap computers because, hell, in 12 months I'm just going to toss it anyway and get a new one, has got to go. This was marketing brainwashing by companies that based their business models on massive volume sales of soft/hardware and therefore had to convince us there was something "wrong" with us if our machines were older than 2 years.

Myself: My 3 year old iMac runs like a clock. Only thing I ever did was add RAM.

Tristram: True enough. The built-in obsolescence thing has been eaten up by the buying public. They really believe in it. Of course, on the PC side, sometimes the two-year obsolescence myth holds true!

Myself: "Oh no, the new 10Ghz MetaPentium liquid-nitro-cooled PCs are out with particle-accelerated video cards and 20 GB of RAM! I must have one! How can I run Word and get email without it! Where's my credit card?!"

Tristram: Not to mention the old couple that came into the Hauppauge Apple store in mid-November, purchased a Dual-800 G4 with not one, but TWO Cinema Displays and an additional video card to drive the second beast, all expressly meant for... you guessed it... word processing and Internet surfing.

Myself: Wow. Maybe with that rig, they can get RealPlayer to work.

Tristram: Hey - let's not predict wildly here.

Myself: True, true. Speaking of, I wish Apple would release these Keynotes and other QuickTime goodies as downloadable files (for those of us with broadband). No matter what I try, they always look like crap.

Tristram: Same here. That WOULD be nice, yes....

Myself: Akamai servers, my butt. In other exciting news: I got x-rayed this morning, and after one of the shots, I hear the tech (behind the screen) say, "Oh, crap." File that under
    Things You Don't Want To Hear.
It turned out his pen was out of ink.

Tristram: Most definitely! Do you think you'll make it?

Myself: If I use a different pen, yes.

Apple's Dilemma II (2006)

While I am a Mac fan, I have little patience for overheated rhetoric about Apple vs Microsoft as some kind of sacred struggle between Vision and Compromise. I mean, really.

Apple, at this point, has little choice but to emphasize design and the effective integration enabled by their proprietary platforms because it's all they have left. What else are you going to sell the Mac on? Price/performance? The software library? Please.

I think the Mac advocates' strongest points are these:

1) The open architecture of the PC is a mixed blessing. Yes, you can customize and modify PCs in every possible way, but this also opens the door to every kind of functional and compatibility problem. We PCers take a perverse pride in our tinkering abilities, but most of us have to admit that we are "technical" by necessity, not desire, because the PC platform is a house of cards compared to the Mac - although things are a lot better now than under Win3.1/95/98.

2) The PC architecture serves a "lowest common denominator" interest far more than the Mac's. "Backward compatibility at all costs" is almost a design rule with PCs - and woe to those who truly try to innovate or improve. Not even IBM could get away with it: when they tried to "bump up" the PC to the next level with the PS/2 architecture, control of the industry was ripped from their hands by the Cloners - a committee of companies led by Compaq dedicated to s - l - o - w progress, if any, in how computers are used.

(Look at how long ancient DOS-code persisted in Windows 95 and 98, at the cost of terrible instability and insecurity in those OSes. Think that was an accident? Lazy programmers? Nope. Legacy software and hardware required it.)

We can't be too harsh, though. Companies that spend literally thousands or tens of thousands on hardware and software do not want vendors telling them every 18 months or so, "Ok, everyone out of the pool. We're changing everything." Ironically, Apple's small market share actually frees them to take chances that no other OEM would dare. Look at the way Apple TWICE successfully migrated their entire business across hardware (68k -> PowerPC) and OS (MacOS -> OSX) lines. I have a hard time seeing a PC cloner doing that - assuming they'd have the nerve to even try.

3) Gear-savvy PC folks who come into discussions like this - including me - with their scratch-built, ultra-cheap PCs and hands-on knowhow, are not typical of computer users. Computer users are getting less 'techie' with every passing year. People can debate the desirability of this, but it's a fact. The fact that you can "roll your own" PC means very little to most PC users. For them, the quality of goods and ease of use presented by the Mac are highly compelling.

Side note: What kills the deal, as I said before, is the PRICE. Even if they can't do the homebrew thing, they know people who can. I have built a dozen PCs for friends and coworkers in the past few years and easily half of them were considering a Mac. They just can't look past the price. You can lecture them all you want about interface elegance and iTunes and whatnot, but they see giving me $1000 to build them a good PC vs spending $1300 for the cheapest iMac or $2500 for the cheapest PowerMac.

4) The "software gap" has everything to do with market share and nothing to do with the technical merits of the PC architecture. Half-Life would have run just fine on the Mac, but wasn't released for business reasons. Similarly, cutting-edge hardware folks (like those 3d accelerator makers) roll their stuff out on the PC first simply because there are more of them, and they'll sell more cards.

5) Microsoft's power has got to be reduced. I don't begrudge them the credit they are due. They've done very good work and - for the most part - earned their success. (Thanks, Steve.)

But Microsoft has reached a point in their corporate growth where innovation of any kind takes a distant second place to smashing all potential competition - not by making and selling better stuff in a "May the best man win" way, but through anti-competitive (and illegal) measures. This was the conclusion of the Court after an exhaustive investigation (and was common knowledge in the computer and software industry before the trial).

There is also the little fact that they were found guilty in court of breaking the law to maintain their monopoly - during which case the company was caught presenting faked 'evidence' regarding certain aspects of Windows and IE.

Of course, the actual punishment was a joke. Proof again that if you're big and rich enough, you can pretty much ignore the law.

Microsoft's current goal - which the management of the company makes no secret of - is to remove control of the desktop operating system from the end user.

This will be accomplished in two ways:

First, the gradual shift from a 'retail purchase' to a 'subscription' model. You won't just buy Windows once for your PC, you'll rent it and pay and pay and pay...

This allows Microsoft to keep the Windows "cash cow" going. They knew all along that high volume OEM and retail OS sales and upgrades wouldn't last forever, that at some point most homes and offices wouldn't be buying new computers as often as they were in the late 80s - early 90s. The subscription model is their way of making you buy Windows all over again at regular intervals, whether you need to or not.

Second, changes in the Windows OS architecture and functionality will make it increasingly difficult (if not impossible) to run Windows off-line as a stand-alone product. Your PC will become less a PERSONAL computer and more of a USER TERMINAL dependent on centralized data services.

This is the real motivation behind their enthusiasm for a wired world and increased deployment of broadband Internet access, as well as the new Windows activation scheme.

All this would be less worrisome if Microsoft wasn't the only show in town, but it is. Even J. P. Morgan and Standard Oil never enjoyed the level of market control that Redmond now holds.

Microsoft is not evil. It's simply a company trying to protect and maximize its revenue sources. I understand that. But there are some fights it must not be allowed to win. Apple is key to this. But for Apple to make a real difference, it needs to move beyond producing boutique computers for the Pottery Barn set and take a real bite out of Windows' market.

Apple's Dilemma I (2006)

Let's talk money.

What kills the Mac for a lot of people - regardless of what an ID showcase it becomes - is the sticker shock.

Forget the Mac Mini and iMac for a moment and look at the Mac Pro - you know, Apple's REAL desktop computer.

The entry-level model is $2499.00 without a monitor. Add $599 for Apple's entry-level monitor, the 20" Cinema Display, and we've hit $3,098 without any other purchases. Three thousand dollars - for Apple's entry-level equipment.

Sure, it's nicely put together and the included software is very impressive. But for three grand I could put together an Athlon or Pentium machine fast enough to travel through time... AND have my software options increase tenfold, if not more.

What amazes me is how successful Apple has been in convincing its loyal customers that it is David in the struggle vs Goliath (IBM initially, then Microsoft) when - as others have pointed out - Apple is the North Korea of computing environments.

The 'closed' nature of the Mac, then and now, is no accident. Many books about Apple and admissions from its own ranks have made it clear that Apple uses the Mac's proprietary technology to maintain hardware profits as much as for any kind of "quality control" over the "entire user experience," as Mr. Jobs recently said.

Historically, Apple became too accustomed - one might say, addicted - to making money off the Mac's hardware exclusivity to let it go, even if letting go might have meant long-term growth in the market share and user base for the Mac OS.

Since this also meant that Apple had to eat almost every dime of R&D for the Mac, it could not profitably license the Mac OS to "cloners" because they, not encumbered by Apple's overhead, would cut Apple's throat in a price war unless Apple "taxed" them heavily for each machine sold, in which case, the clones become nearly as costly as Apple-branded Macs, in which case, WHAT'S THE DAMNED POINT?!

(Indeed, this is exactly what happened with PowerComputing and others when Apple DID briefly license the Mac OS.)

This double-edged sword of hardware control is also why none of the many "port the Mac OS to the (name of chip)" projects in the 80s and 90s ever came to anything.

There has been progress, to be sure. Apple has increasingly made its peace with "PC" technical standards in things like monitor plugs, hard drives, and RAM modules. ADB is gone. SCSI is gone (mostly).

The "killer app" for the Mac OS is the OS itself and always has been. THAT is the 'Mac Experience,' not a MOMA case design. Apple must, in the end, give up its fetish for hardware control if it ever wants to be more than a marginal player, and must get the price of entry for the Mac OS platform down to a competitive level with Wintel computers. Unless that happens, nothing else will matter.

The bottom line for me is: Apple builds clever and pretty computers that cost about twice as much as the competition's machines of equivalent power and have a fraction of the software support.

My price comparison concerns the Mac Pro for a reason: because the iMac is a closed system, it is not a fair comparison to a mainstream PC. This does not mean the iMac or Mac Mini are BAD, or that the quicksilver nature of PC technology and the mutability of PC subsystems are unqualified blessings. They can actually be massive pains in the backside, as I encountered recently when assembling a new Athlon/XP system from Microsoft-approved reference parts that just refused to work (I wound up replacing the mobo).

However, if by "computer" we mean a system where you can make significant changes or upgrades to the key systems (CPU, internal fixed disks, video cards, etc.) then we're not talking about something like the iMac. It and the Mini are great products, but "frozen." It's more like buying a laptop - or even a piece of stereo equipment or a television.

This premium pricing also affects the success and spread of OSX. OSX is GREAT - although its adoption by Apple was an admission that Apple had dug a hole with the old Mac OS that it couldn't escape. But as long as you can only get OSX on such an expensive platform, it will remain marginalized.

We can - broadly - sum up the scene thusly: Apple makes better computers, but the Wintel computers are cheaper, support more software, and are GOOD ENOUGH.

Therein lies Apple's dilemma: its foes can "build up" through continual refinement and enhancement of Windows - and with XP and Vista, they largely have - while Apple cannot "price down" to match them.

If you want a Mac that's not a closed system, you have to start - to START - at damn near $3000. The charm of iMovie burns away pretty quickly in the face of that.

With market share, of course, comes software - or the lack of it. At a certain point, developers will simply not bother to do Mac versions of their mainstay titles. Yeah, we can all run Office. So what? In most specific business applications, the Mac is simply absent. You could easily spec out a new company's entire network - servers, desktops, laptops, network attached storage, etc. - and ignore the Mac entirely.

Yes, even Mac strongholds like PhotoShop work perfectly well on Windows boxes that are - even when you adjust for huge sticks of RAM and the zippiest video cards - cheaper than Macs.

Allow me an example from my own company. We do residential and commercial security system and camera monitoring. There has been a huge shift over the past several years from large, expensive (and finicky) custom-built systems to a smaller "LAN" architecture to automate the handling of video and alarm signals.

The companies that write and maintain the software essential to running these systems - on both the central/server and dispatcher/workstation side - would laugh out loud if asked, "Um, does it run on a Mac?" It's Win2000, XP or (sometimes) Linux. That's it.

No doubt anyone reading this could give you dozens of examples, right off the top of the head, of software markets where the Mac either never existed at all, or has vanished. And, once again, it's not because the Mac technology is bad.

PCs, Macs & Hassles

What is it with this fetish of endless hardware tweaking among PC users? Some so-called high performance PCs remind me of those hyper-modified cars that are showpieces of technology but are hardly even driven because the engineering is so unstable.

Also, I think we have already reached the point of diminishing returns in all but the most demanding software. Sure, PhotoShop eats computers for lunch, but I've still got a Pentium 3 in my office with "only" 512MB of RAM that runs WinXP and MS Office2K like a champ. It also handles email and web surfing just as well as any nitro-cooled, 733t h4xx0r monstrosity you could find.

Sure, I've got a game computer "dragster" but it's a freak machine. I don't do serious work on it and if it blows up, I won't lose anything valuable. (I back up my config files and saved games routinely.)

A computer that is cheap and fast but also unstable and high-maintenance is not a good machine for important work.

Another thought: one of the editors of MacToday (Scott Kelby, I think) once made the point that when you browse the shelves of Mac books, they are almost entirely devoted to actually DOING things with your Mac. That your Mac is working properly is assumed.

The PC library, on the other hand, is chock-full of diagnostic, repair and troubleshooting volumes. At least half the PC books at my local Borders are devoted to the problem of PCs simply not working.

An observant eye will also notice how many of the top-selling PC software titles, year after year, deal with simply keeping the various versions of Windows running or negotiating hardware hassles.

Ask any computer support person how often they've had to respond to emergency calls for Macs constantly crashing or going haywire - even in a Mac-heavy environment. This might account for the MIS community's opposition to Macs: their very existence presumes offices filled with computers that need constant attention. Look at a Mac office and, 99% of the time, you'll see some MIS guy reading Wired all day long, a layer of dust settling over his stack of "emergency recovery" disks.

The Left Hand of Dorkness

I have an Xbox 360 video game console. I am left-handed. This is a problem.

It is a problem because every single video game controller made for this console is designed for right-handed people.

The controller is designed with the assumption that you will use the right thumbstick to look/aim and the left to move. This is fine for right-handers. Left-handers, however, prefer (in some cases, absolutely need) to move with the right stick and look/aim with the left - just as we do with computer mice, pens, brushes, scissors, handguns, etc.

While certain A-level games like Halo 1/2, Gears of War and Call of Duty have what are called "Southpaw" control options, where the triggers and sticks can be switched, many other games do not have anything. In my own Xbox 360 experience so far, Lost Planet, Saints Row, Dead Rising and most Xbox Live Arcade games including Doom and Assault Heroes make no effort to accommodate left-handers.

Add to this the inexcusable fact that nobody makes a left-handed XB360 controller: not Microsoft, not MadCatz, not Logitech, not Nyko, not GameStop. This drives me almost to the point of rage; Microsoft is designing every kind of gizmo imaginable to interface with the XB360's wireless support, even one to enable the console controllers to work with Windows, but I can't get a goddamned left-handed controller? Another victory for the PC, I guess.

Indeed, this is an old problem. With the notable exception of Nintendo's Wii, video game console systems have always punished me for being left-handed. I have had to grin and bear (grip and bear?) the lack of left-hand support for 30 years now and I have had enough. Why shouldn't I and the millions of other left-handers have a proper controller? I've actually sent letters and made calls to Microsoft & peripheral makers about this. Their responses ranged from blasé dismissal to incredulity that anyone would even ask for such a thing.

Of course, Microsoft could simply produce (or authorize another company to produce) a proper mouse & keyboard set for the XB360, which would solve this whole left/right handed thing once and for all. Microsoft could also make it mandatory that game developers include alternate control options in their XB360-licensed games. Don't hold your breath.

Well, screw them all. I want some left-handed hardware. I want an actual, for-real Southpaw Controller. If I can't buy one, I'll make one or pay someone to do it for me.

Pursuant to this, I found and downloaded a guide to modifying the XB360 wired controller for left-handed use. Since my using a soldering iron is expressly forbidden by the New York State fire code, I contacted the author and asked if I could pay him to do one for me. It turns out that he gets a lot of these requests, so for $60 (including the cost of the controller) he switched the wires on the triggers, thumbsticks and thumbstick buttons. The ABXY buttons and D-Pad are trickier, so I asked him not to bother.

It works like a charm. I tried it out with XB360 Doom, Saints Row and Dead Rising. All good so far. This is the best $60 I ever spent. Well, except for... uh, never mind.

The fellow's name is Ron Alexander. With his permission, I am posting his email address here:

rondcrasher (at) yahoo (dot) com

(Email written thusly to avoid harvesting by spammers.)

This is a stopgap measure, of course. I still want an official, authorized, professionally-made Southpaw Controller. We left-handers should have them. We've waited long enough.

Monday, February 11, 2008

Animal Rites

You know what PETA, Veganism and the Animal Rights movement strike me as? Ideological one-upsmanship among Progressives. People join them and espouse their rhetoric because it raises their status among the NPR crowd.

Think about it: All good, progressive-thinking men - excuse me, PEOPLE - already embrace the "struggle" against sexism, racism and whatnot. Nothing special there. So why not take it to the next level and extend your Liberation Politics beyond humanity? NOW we're talking.

Why settle for White Guilt when you can lament your entire species?

Anyone can march at some protest for equal rights for gay people, but you're well beyond that. You march for equal rights for NON-people!

Why confine yourself to discussions of Privilege and Patriarchy when you can make deep statements about the oppression caused by most animals not having opposable thumbs?

Also, since the production of cheap and plentiful food - like pretty much everything in the modern world - operates on the vast economies of scale and technological efficiencies made possible by modern industry... you get to be mad at Big Evil Corporations for yet another reason!

Finally - and this is really sweet - animals are the ideal "victims" against whose "oppression" you can "struggle." It's the Liberation Politics version of job security - like a college professor getting tenure. Unlike those Eastern European proletarian ingrates who finally told the international Left to stick Marxism up its collective ass, dolphins and factory-farm chickens will never tell you to go screw yourself.

(Side note: I'd be interested to know how many hardcore PETA-types have ever refused critical medical treatment on the grounds that the drugs and/or surgical procedures to be used had been developed with animal testing. I'd bet you could fit them all in my car.)

I suspect that many liberationists - not ALL, mind you, but many - really don't give a rat's ass about those on whose behalf they are supposedly fighting. Like all moral crusaders, they're out to feel better about themselves.

There is also a heavy element of self-aggrandizement and the sense of being part of some kind of Consciousness Elite. This leads them to ever-more-extreme positions, in order to remain in the avant-garde.

For these folks, it's just no fun anymore to focus on human suffering. There's a "been there, done that" feel to the whole thing.

Anti-racism, for example, has succeeded to a degree where it is the dominant thought-mode - more people agree than disagree - and therefore it no longer distinguishes one as a member of the Consciousness Elite.

Which does not for a moment mean that the fight for racial equality is over, only that it's just not as politically sexy as it once was. How, as a good revolutionary, can you épater la bourgeoisie when they hate the Klan as much as you do?

So you move onward and outward to animals or "the Earth," adopting lifestyle politics and nutritional practices which most people will never be able to follow. Presto! Romantic marginalization and revolutionary purity is yours again! You're a rebel, maaan!

Good Night and Good Riddance

I tried to sit through a showing of George Clooney's Good Night, and Good Luck. I really did. I had to leave when it became apparent the film was just another example of the American Left's historical mythology.

I should have expected as much. Honestly, the Left has such a hold on Hollywood that when I hear some director blathering on about how his next film is going to be "political" or about "issues," I pretty much assume it will be Left propaganda.

Don't hold your breath waiting for a balanced (or even honest) film about McCarthy, et alia. Remember what recently happened when Kazan finally got his due recognition? The white-lipped anger and venom on display? Decades later, when even the Russians have admitted McCarthy was not jumping at shadows, the American Left cannot admit the truth.

I mean, really... would it kill Hollywood to make a movie about McCarthy that acknowledges the reality of Soviet espionage and subversion?

McCarthy & Co. made their share of mistakes, to be sure, but the enemy they were fighting was real and dangerous. Following the end of the Cold War, mountains of espionage and intelligence documents became available from the former USSR. Those, plus now-declassified materials from the USA (such as the Venona transcripts), show that if anything, Americans weren't paranoid enough about subversion and infiltration.

McCarthy was not perfect, and not even the best man for the job: he cast his net too wide and wasn't patient enough. But he did quite a bit of good. First, and most obviously, the mere act of going on the offensive against Red infiltration put the Soviets on the defensive. They were forced to become much more careful, shut down certain recruitment operations altogether, and did in fact lose a number of key operatives to McCarthy's and HUAC's investigations.

I find it interesting that "McCarthyism" is a voodoo word coined by the very same American Leftists who spent the entire Cold War bending over backwards to avoid confronting the true nature of the Soviet Empire - a political system that raised repression, paranoia and persecution to an art form.

From the first, McCarthy was loudly criticized by his foes - both CPUSA fellow-travelers and patriotic Americans who honestly thought he was wrong. People against him - common citizens as well as political figures - made speeches, wrote books and newspaper articles, went on TV and radio and readily availed themselves of every inch of their 1st Amendment freedoms.

How many were "silenced?" How many doors were kicked in at midnight, the entire family being dragged away for "questioning?" How many hundreds and thousands of journalists, political opponents, artists, scholars and intellectuals were marched away to re-education/work camps - many to never return? How many were tortured or summarily executed?

Even at the height of "McCarthyism," you could slam Joe McCarthy in a New York Times op-ed, published under your own name, then go home that night and sleep like a baby. That's essentially what Edward R. Murrow did on national television, and if George Clooney considers such efforts to be "courage" on the level of some Russian telling Lavrenti Beria to kiss his ass, he is simply delusional.

I'm also getting tired of hearing about all these supposed "victims" of McCarthy and the anti-Communist effort in general. Who were/are these people? How often did the spotlight fall on good Americans whose words and deeds gave no grounds whatsoever to raise suspicion?

Is the existence of actual sedition among CPUSA members, leaders and sympathizers incidental or irrelevant to how McCarthy and the anti-Communist movement more generally are to be viewed?

Or, let me put it this way: Why is the phrase "witch hunt" used so often in conjunction with McCarthy, or with anti-Communism generally, unless the intent is to suggest that there were as many seditious Communists as there were actual witches in Salem - that is, none?

Why is this? Witch hunt, we hear, over and over again. Could it be that the American Left - and especially their Arts & Crafts contingent in Hollywood - doesn't want anyone to look too closely at just how taken they were with Communism, up to and sometimes including placing themselves in service to the USSR?

If the CPUSA had been a bona fide domestic reform movement, if the anti-communists really were chasing phantoms, that would be one thing. Then I, too, would demonize McCarthy and his allies.

But that's simply not true. The problem was real.

(For how real it was in Hollywood, check out Red Star Over Hollywood: The Film Colony's Long Romance With The Left by Ronald Radosh.)

Witch hunt, my ass. There were no witches in Salem, but there sure as Lenin were Commies in Hollywood and the State Dept.

Every serious discussion of this subject must begin with the recognition that the CPUSA was a subversive fifth-column movement controlled by a hostile foreign power. That is an objective fact.

Having recognized this, the question is (and was): What to do about it? The Bill of Rights is not a suicide pact; our society has both the right and need to defend itself from those committed to its destruction.

It fits the romantic self-image of the American Left to see those investigated by HUAC, the FBI and McCarthy as martyrs to Cold War hysteria and the eee-vil Right Wing, but the facts do not support that story and never did. Of course, Hollywood is the land of dreams, the American mythology factory, and the Leftists who populate it do their best to keep slicing the baloney.

On a more general note, I would call out both the American Right and Left for their respective blind spots regarding 20th century Communism.

The Right saw Commies behind every tree and was tone-deaf to the real social and political injustices that made Communism so appealing to many suffering and oppressed people.

The Left, on the other hand, bent over backwards to avoid noticing just what all that rhetoric about workers and peasants and liberation actually resulted in once the Reds took over a country.

Many on the Left spent the decades from the October Revolution right up until the whole thing collapsed in the late '80s hoping against hope that the USSR or some other Red State (heh heh) would incubate the long-awaited socialist utopia and lead humanity into a new and better age.

Marxist Communism is a remarkably "religious" ideology in this way, and those Americans who went beyond hoping and decided to lend a hand - Hiss, Fuchs, the Rosenbergs, etc. - were not villains; they were True Believers. They honestly thought they were on the right side of history and human morality. (All the same, this doesn't excuse treason or change the propriety of their punishments.)

Nothing dies as hard as a dream, and it should come as no surprise that many on the Left (in Hollywood and elsewhere) were, and are, loath to consciously and publicly confront the fact that the Bolsheviks and Maoists took them for a ride, used them and made a mockery of their democratic, egalitarian and emancipatory ideals.

That's why an episode like l'affaire Rosenberg is still so touchy for some. It's bad enough to have to admit that they were spies, after all, but to also acknowledge that they were naive stooges of a horrible tyrant is just more than some people can bear.