image credit: U Penn Library Exhibit, "John W. Mauchly and the Development of the ENIAC Computer"
It's a common complaint that any mention of virtual worlds has ebbed in the popular media, and one reason given has often been the shift to mobile devices and tiny screens. Certainly that describes my students' preferences for online devices: about 90% of the e-mail I get from students comes from their phones.
I have met stiff resistance from colleagues wedded to desktop and laptop computers when suggesting that we need to make mobile computing the focus for our efforts with virtual worlds and more. For some historical precedent about this, consider an argument put forward by John Markoff in What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry. In this, Markoff clearly realizes, as Tim Wu did a few years later in The Master Switch, that some technologies overturn entire industries and ways of communicating:
Indeed, the hallmark of each generation of computing has been that its practitioners have resisted each subsequent shift in technology. Mainframes, minicomputers, PCs, PDAs--at the outset of each innovation, the old guard has fought a pitched battle against the upstarts, only to give in to the brutal realities of cost and performance.

As Moore's Law makes our hand-held devices more powerful, I suspect this will happen again. For the latest shift, it will mean that something the size of a smart phone will be our primary computing device on the go or, when attached to virtual keyboards and easily accessed monitors, nearly everywhere else. Here's a picture from the year 2023:
You enter your office and look at something like a large-screen television hung on the wall above the desk. You speak a login keyword. The phone, linked to the global data-cloud, remains in your pocket as you begin to work, using gestures in the air while in range of the television's scanner. Windows for e-mail, a spreadsheet, and a calendar appear and you move them around with your hands while you issue voice commands. To input text you simply speak, and the voice-recognition software in the phone translates this to text. You finish just before a face-to-face meeting with colleagues, and walk down the hall. In the conference room, there's another big television, and with voice alone, you begin to talk. The notes taken in your office appear on the wall.
I will be a very late-comer to mobile computing when I get a smart phone this fall. I don't fancy my iPad all that much, finding it most useful for quick browsing to, say, check the weather or read an e-book. That may well change. For the longest time, Markoff notes, printing was one of the biggest hurdles for personal computing. When such puzzles get solved, as when we provide big screens and input devices for mobile computers carried in a pocket, progress happens rapidly.
In a world with haptic and voice interfaces, as well as a robust data-cloud, we should get ready to say farewell to both desktop and laptop in fewer years than we might imagine. Then the students' gesture of neurotically clutching their smart phones will seem as antique as clutching a magical talisman.
4 comments:
Cognitive dissonance is strong among those of us who clawed our way up the steep learning curves with WordPerfect, DOS, BBSes, Windows 1-8, & Virtual Worlds. We aging baby boomers have eagerly accepted softer Executive office chairs and larger screens.
Now those youngsters are wanting everything tinier so they can take it with them.
Technology, like our guns, will certainly get smaller and more powerful.
Has humanity been taught correctly how to deal with it properly? Or should we just accept the fact that we will eventually merge with tech and everything will just be alright?
At a certain point, a gun won't kill. If my Adventure Team GI Joe shot me right now with his 1/6 scale .45, it would burn like all getout but I'd step on his puny plastic body like Godzilla.
Cecil, I don't know why you bring such thoughts into my addled brain.
On the other hand, computing can continue to scale down. Unlike guns, it obeys Moore's Law. The limiting factor seems to be bandwidth; otherwise we'd already have something akin to the Holodeck.
We've not been taught to "deal with it properly," so we merge like stumbling students clutching their smart phones. I'd rather have it invisible and inside me, with an off switch for when I need to focus on only one task. That's the sort of convergence I see: we'll still look human, but inside we won't be human any longer; we'll be cyborgs.
I hope that office of the future has a lot of insulation so I can talk to my devices in private. Talking certainly won't work well in the new open office arrangements now prevailing, with no cubicle walls.
I'm 62 and have eagerly embraced each new technology, always having a sense that we're not there yet, so I don't think it's an age thing as much as a habit of mind thing, but I'm weird.