I first became interested in computers in the 1970s, back when cars were big, hair was long, gas was cheap, and beer cost 25 cents a glass.
I was working at the Champaign News-Gazette as a wire editor while going to school at the University of Illinois. Most newspapers back then were produced on typesetting machines that cast type out of molten lead. Computers were big hulking things that required special air-conditioned rooms and trained operators. We were about to get one, our first newsroom editing system, to drive our fancy new phototypesetters. So I took a computer science course taught by Daniel Slotnick, a pioneering supercomputer designer.
What I discovered was that the priesthood of mainframe computing was about to get its world turned completely upside down.
The agent of this change was the microcomputer. There were only a few around. There was a Commodore PET, the first all-in-one microcomputer, in one of my computer labs, surrounded by terminals connected to the CDC Cyber-9000 mainframe. There were some early Apple IIs, of course, and a really cool Cromemco in a campustown store window, drawing colorful patterns randomly on a TV screen. A revolution was in the air.
The geek who was wire-wrapping connections to assemble our newsroom computer snorted derisively. Toys, he said. Computers will never be reliable enough for regular people to use.
We all know how that turned out, but many of us don't know how much chaos ensued.
It was not Bill Gates versus Steve Jobs in those days. Not by a long shot. There were hundreds of companies with competing systems and software. Zilog was the big dog before Intel. CP/M was the hot business OS before DOS. Atari, Radio Shack, and a fleet of companies started in dormitory rooms were in the game. It was years before any sort of clarity emerged.
Now we're right back there. Tablets are going to change the world. But we don't really know who, or what, or how. The iPad is just an opening round. Watch for Android, WebOS (HP), ChromeOS, QNX (Blackberry), Meego (Intel/Nokia), and maybe even something from Microsoft if they quit doddering around like Abe Simpson. Some players (Canonical/Ubuntu) have already dropped out. Others will join in. And then there's the whole Apps versus Web question.
As so often is the case, this is a revolution long in coming. I saw my first tablet around 1995 at a newspaper conference in Berlin. It was built on a chip called the Acorn RISC Machine and ran software called RISC OS. The descendants of that chip power pretty much every smartphone today, as well as the iPad. With so much time for these ideas to cook, you'd think they'd be done by now, but it doesn't work that way. Instead, we find today that we know less about the immediate future than we did five or eight years ago.
Some things we should have learned from previous revolutions and skirmishes:
You can't learn by sitting on the sidelines. Of course you should watch what everyone else is doing, but there is no substitute for getting into the action.
Don't rush into a permanent commitment. If you put all your efforts into the iPad, and it turns into a minor player (as so often happens to proprietary technologies), you're going to be hurting. Be prepared to play the field.
Great tech does not always win. If it did, we'd be watching movies on Betamax and computing with Amigas. Adequate open systems tend to beat great closed systems.
Overnight change doesn't happen overnight. The usual mistake is to overestimate the short term and underestimate the long term. You get all excited that this will be the Year of the Tablet, and it turns out to be the Year of Astounding Hype. So you turn your back and get clobbered by the real wave.
You're doing it wrong. No matter what you do, it will be wrong -- criticized from every direction and ultimately crushed by something new. Be OK with that. Live to learn.
There's more than one way to do it. At the moment, there's a lot of excitement about iPad apps that transport print experiences into a digital framework. There's also a lot of excitement about completely new information experiences that don't even vaguely resemble old products. Both can be right, for different people and situations.