Showing posts with label Tim Wu. Show all posts

Tuesday, May 7, 2013

Deformed: Virtual Worlds & 1970s Computing History

Mother of All Demos
Location: Watching Douglas Engelbart's "The Mother of all Demos"

image credit: New Media Consortium at Flickr

Many of us who have dabbled with virtual worlds have wondered how they could form a constellation of networked systems, in much the same way as the Internet's servers do today. Whatever the fate of this niche technology called virtual worlds, from the failure to run on mobile devices to the inherent boom-bust cycle of Second Life's particular brand, the road not taken always beckons.

There is an historical precedent here, and it's one that has a happy ending. Could the same be true for virtual worlds?

Today's Internet: Clarke's Law in Action

For a moment, consider the complex and delicate magic that occurs whenever we use the Internet. My university server talks fine to Google, for work such as the just-completed Usher project. Whenever the Outlook mail client randomly eats student file attachments sent to me, I smile. Ah, Microsoft's wonderfully Byzantine and wonderfully doomed technology, eating even its own Word files. Cue Apple and Google, as the Ottomans on the horizon, slowly gobbling up a once-mighty empire. Good riddance.

Then, because of the lack of monopoly that Microsoft coveted and almost got, I have the students try, try again with Gmail. Excede's servers send me the results and, once I type, transmit my thoughts--profound or inane--from home via satellite to Google. When I send notice of this post to interested folks at Twitter or Facebook, the servers hosting that data all "talk" to one another.

Types of Gardens and a World-Wide Web, 1975?

Compare that to virtual worlds technology, ostensibly part of the Internet since that is how we access it. Second Life, InWorldz, and many others that share core technologies could, in theory, speak to each other. Had development not branched off as it has done, Linden Lab and a few other grids might have pioneered a system for avatars and inventory to travel from world to world.  That happens with OpenSim Hypergridding, a technology that John Lester promoted, before his work for Reaction Grid turned to Jibe-based 3D worlds.  But "interoperability" died years ago at Linden Lab, and it seems unlikely to return.

It's curious, this set of walled gardens. If today the Internet resembles Borges' Garden of Forking Paths, virtual worlds resemble something else: the road taken in the 1970s toward personal computing.

I realized that while reading John Markoff's What the Dormouse Said, a history of how early computing was influenced by the American counterculture:
When personal computing finally blossomed in Silicon Valley in the mid-seventies, it did so largely without the history and research that had gone before it. As a consequence, the personal-computer industry would be deformed for years, creating a world of isolated desktop boxes, in contrast to the communities of shared information that had been pioneered in the sixties and early seventies. (179)
The Internet did not begin with Al Gore, whatever he may have claimed. It did not begin in Jobs' family garage and with Steve Wozniak's brilliant hardware hacks. It did not begin at Xerox PARC with the Alto. The personal computer with a GUI and mouse? Yes, we can credit or blame Xerox and Apple for that.

But years before, nearly every element of the modern Internet would have been possible with the Augment system, developed under the leadership of Douglas Engelbart. Yet that development stalled and ended, a revolution stillborn.  I think we can see an analogue for what is going on, at this cultural moment, with user-generated virtual worlds.


Engelbart's Mouse

Want to see what might have been for the Internet? I am convinced that had something like Augment been made less opaque for casual users, we'd have had an academic, and perhaps consumer, Internet in 1975. Engelbart gave a show-stopper of a demo in 1968, with a mouse, text editing with clipboard and copy/paste, multiple files, on-screen graphics, electronic mail, hyperlinks tagged to graphics, and remote visitors via a network. You can see what he was doing with Augment in these videos from Stanford.

The reasons for Augment's failure are complex; Markoff's book does justice both to the creator's vision and his ultimate failure to produce a widely adopted product. What happened for consumers, however, was the emergence of walled gardens and proprietary systems from Apple, Microsoft, Digital, Tandy, and other competitors forgotten except by historians of technology.

When the Internet emerged, it came late to a culture of desktop boxes that could not, generally, talk to one another.

What if the personal computer revolution had begun with networking? And similarly...

What if Virtual Worlds Had Begun with Interoperability?

I'm writing an article about one group of USENET hobbyists who have made the jump to Facebook, because the old .alt group proved too chaotic and full of spammers, trolls, and other bottom feeders. They also made the leap because, frankly, .jpgs and text import and export well between applications. Text did in Engelbart's day.

Little aside from these, plus Collada files and some other graphic formats, can move between different virtual worlds. Standards for inventories, for avatar meshes, and for "land" templates all differ. In this technology landscape, OpenSim grids serve as today's Augment. Managing a group of avatars and a region in OpenSim is hard to master and, in my experience, stable only in a pro's hands, but it is interoperable. Running an entire campus-hosted grid would be lovely, but learning to do so is beyond my time or expertise.

Other products with potential beyond SL's technology, such as Unity 3D and Jibe, produce elegant worlds, but they don't talk to other worlds, and expert designers need to craft objects. They do offer vast potential, according to OpenSim pioneer Adam Frisby, for scaling, running on mobile devices, and improved graphics.

That sounds great until one considers faculty skills-sets and what it takes to build with Unity or Jibe. As noted  before in this blog, developing for these platforms may be within reach for architecture and engineering students, but at my university, it's challenge enough to get students to juggle multiple e-mail accounts and embed files from YouTube into their blog posts. We faculty lack time and incentives to do more with them, let alone learn 3D applications such as Maya or Blender.  Yet nearly all of us at my school have created content, mostly with text and images and sometimes digital video, and shared it on the Internet.

For all its limitations and toxicity as a brand, Second Life lets amateurs with a copy of Photoshop build easily. I'm told that Cloud Party does too, and I will soon try again with Cloud Party's latest build tools. Scripting remains something for those not faint of heart, and projects to make visual scripting tools, such as MIT's Scratch for SL, remain as stillborn as Augment.

What It Will Take to Be Disruptive

Here comes a sweeping generalization, and I'm ready to fall on this sword if some wise person can prove me wrong. Virtual worlds will never be a disruptive technology, in Tim Wu's sense of the term, until they become an interoperable and popular tool for everyday life, as the Web and e-mail have become.

Had virtual worlds begun with a series of collaborative academic ventures rooted in common standards, rather than a group of for-profit start-ups from The Valley, we might have that disruption and a 3D Web today.  Then the profits would follow, because in 1968, who could have foreseen eBay or Amazon or Facebook?

Right now, however, it's still 1968 and we've all seen the potential of a disruptive technology, as those who watched Engelbart's presentation did.

So today, who will build the 3D Web?


Thursday, March 28, 2013

Historical Precedent: Mobile Computing & Our Unease?

Location: In front of a large screen

image credit: U Penn Library Exhibit, "John W. Mauchly and the Development of the ENIAC Computer"

It's a common complaint that any mention of virtual worlds has ebbed in the popular media, and one reason given has often been the shift to mobile devices and tiny screens. Certainly that describes my students' preferences for online devices: about 90% of the e-mail I get from students comes from their phones.

I have met stiff resistance from colleagues wedded to desktop and laptop computers when suggesting that we need to make mobile computing the focus for our efforts with virtual worlds and more. For some historical precedent about this, consider an argument put forward by John Markoff in What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry.  In this, Markoff clearly realizes, as Tim Wu did a few years later in The Master Switch, that some technologies overturn entire industries and ways of communicating:
Indeed, the hallmark of each generation of computing has been that its practitioners have resisted each subsequent shift in technology. Mainframes, minicomputers, PCs, PDAs--at the outset of each innovation, the old guard has fought a pitched battle against the upstarts, only to give in to the brutal realities of cost and performance.
As Moore's Law makes our hand-held devices more powerful, I suspect this will happen again. For the latest shift, it will mean that something the size of a smart phone will be our primary computing device on the go or, when attached to  virtual keyboards and easily accessed monitors, nearly everywhere else. Here's a picture from the year 2023:

You enter your office and look at something like a large-screen television hung on the wall above the desk. You speak a login keyword. The phone, linked to the global data-cloud, remains in your pocket as you begin to work, using gestures in the air while in range of the television's scanner.  Windows for e-mail, a spreadsheet, and a calendar appear and you move them around with your hands while you issue voice commands. To input text you simply speak, and the voice-recognition software in the phone translates this to text. You finish just before a face-to-face meeting with colleagues, and walk down the hall. In the conference room, there's another big television, and with voice alone, you begin to talk. The notes taken in your office appear on the wall.

I will be a very late-comer to mobile computing, when I get a smart phone this fall. I don't fancy my iPad all that much, finding it most useful for quick browsing to, say, check the weather or read an e-book. That may well change. For the longest time, Markoff notes, printing was one of the biggest hurdles for personal computing. When such puzzles get solved, like providing big screens and input devices for a computer carried in a pocket, progress happens rapidly.

In a world with haptic and voice interfaces, as well as a robust data-cloud, we should get ready to say farewell to both desktop and laptop in fewer years than we might imagine. Then imagine students' neurotic clutching of their smart phones seeming as antique as clutching a magical talisman.

Saturday, February 2, 2013

The Phone Book's Here! The Phone Book's Here!

Location: Front Porch

Verizon's phone book landed on the porch a day or two ago, and only this morning did I get around to unwrapping the skinny little thing. It's like the Ghost of Phone Books Past, to this techno-Scrooge.

In another era, the arrival of the new edition would cause quite a flutter. Misplacing the phone book was also unthinkable. It was not easily or cheaply replaced, and my father guarded it like the Dead Sea Scrolls. In an otherwise chaotic Lebanese-American household, the phone book held court on the telephone table. Nowhere else.

Now phone books are incomplete and small. Verizon banishes residential listings entirely to a Web page, a crowning irony since most of the land-line stalwarts I know are so old that a broom provides the interface for Web access in their homes.

Once, however, the phone book provided all sorts of diversions. I've three degrees of separation from musician Frank Zappa, but only one from his phone book. A grad-school friend named Rick, when living in Los Angeles, once had the job of delivering the huge directories to the homes of the famous and crazy in that star-studded town.

At one address, he was ringing the bell when a hippie "who looked like a madman" stuck his fuzzy head out of an upstairs window. "What the hell you want?" he challenged Rick.

"Phone books!" Rick replied, and the hippie's face brightened and he came running, shouting "Oh man! The phone book's here! The phone book's here!" Rick quickly found out he'd delivered Frank Zappa's phone book.  Lord knows what chemically enhanced games were played with the directory.  I know that my friends and I played geeky games of generating random names for role-playing games by picking, in the I-Ching manner, examples from the phone book and mixing examples from white and yellow pages. Names such as "Lorenzo Plumber" or "Scrap Metal O'Malley" resulted, to our geeky delight.

Now the I Ching is online, so I asked of it "What is the fate of the phone book?" With six casts of the stones, I received this answer:
"Waters difficult to keep within the Lake's banks: The Superior Person examines the nature of virtue and makes himself a standard that can be followed. Self-discipline brings success; but restraints too binding bring self-defeat."
History of technology there? Tim Wu's The Master Switch, the text just completed in my course on The History, Culture and Future of Cyberspace, is all about those who, like Bill Gates, establish standards for an entire industry, not just for themselves. Some geniuses, such as Steve Jobs, foresee needs we don't yet have, and they push others to and past the breaking point to make the vision real and the consumer's need materialize.

In the era of mobile technology, where most of my students carry area codes from faraway lands, there's no sense in a phone book. Discipline is needed, however; my students get so lost in a web of constant texts and other inputs that they do not give sufficient priority to e-mail about classes, and they suffer as a result.
 
So as we blunder forward, without much direction or a good directory, into this connected era, I will miss the nigh-sacred tome on the "telephone table" in my, and perhaps Frank Zappa's, dining room.


Or, perhaps, another room in his hippie mansion.

Friday, January 20, 2012

Kodak, iBooks, and a Day We Should Recall

Location: Crux of history

Yesterday's technology news featured two events worthy of an annual commemoration: the Eastman Kodak bankruptcy filing and Apple's announcement of its iBooks initiative.

Both show how corporations can prepare for changing times...or not. Kodak, inventor of the first digital camera, did not market it because the impact would have been disastrous to its film-based model. It provides yet more evidence for Tim Wu's account of how "disruptive technologies" often get suppressed in the name of profits. Fuji and other companies adapted to changing times, and Kodak proved late to the game.

Apple, the champion of technological comebacks, took a different route ever since Steve Jobs' return to the firm. Every iOS device released was lambasted, at first, by mainstream reporters. Jokes about the iPad in particular were sharp and pretty darned funny, to this observer.

Unlike Kodak, however, Apple took a long view of how the devices might disrupt their sales of traditional computers, always far behind those running Microsoft's OS. Yet with less to lose, perhaps, Apple could gamble big on the future of digital content. I got angry at Apple, not long ago, over the iPad. It seemed to be Jobs' "up yours" moment to Mac loyalists.  Now, the post-Jobs Apple plays the two computers as one system: create content on the Mac, show it on the iPad. Apple still won't put Flash on the iPad, but so far I'm happy with their device.

And with the textbook announcement, they realized something I had said for years: the printed textbook is obsolete. Publishers rush to release new and expensive editions that students must lug about and then resell at a loss. These paper texts lack multimedia. My analogy for this is a botany text I own and love: the printed, $100 version can have color plates from a cloud-forest in Costa Rica or the Great Barrier Reef. The online version would have live video streams from Webcams and embedded video demos. It would cost $20, though it could not be resold.


Kodak wanted to sell you a roll of film. Apple wants to sell school systems an ecosystem: cheap iPad with publisher-vetted content that cannot be resold. Brilliant.

Why did publishers wait so long? Apple took the systemic and long view, while Eastman Kodak sat on innovation.

And thus empires rise and fall.

Personal PS: we disconnected our land-line phone yesterday, for good. Of all days! I do have a dumb phone, while my more social wife got the iPhone. Call, and I may get back to you. Eventually.

Friday, August 12, 2011

Grandpa's Box: My Thoughts on the Future of Our Computers

Location: Using Obsolescent Computing Tool

I don't know how many of my friends among Second Life's digerati are in contact with young people on a daily basis. I am, and I am stunned by how fast they are abandoning the personal computer.

The other day, I got a briefing from a Writing Consultant who works for me. She reviewed our Writing Center Web site and, while polite and praising the content, noted that the organization is all wrong for a student audience. "I no longer use my laptop," she admitted. "The iPad is my primary computer."

Much of our content, developed over many years and going through a vetting process with several bureaucratic levels, is all wrong.

It has happened with hyperdrive speed, this shift.  We are not an engineering or arts school: our students reflect typical affluent users.

Should I cheer at this funeral?

Iggy in the Confessional Booth, Before St. Steve


As a Mac-OS fanatic, I take no comfort in Apple's victory with portable devices. Windows-users, we are in the same boat, because the iPad is no Mac. Or Windows PC.

It's the anti-Mac, or better still, the final realization of Steve Jobs' dream of 1984: a sleek and closed-down platform with an elegant interface, but where one pays a price: Apple controls every damned thing. The iPhones and iPads are also hip examples of industrial design, just as the Mac of 1984 was no bulky (and sturdy) IBM PC of the sort I then owned. The IBM was for office-clones who had reluctantly given up their Selectric typewriters. The Mac was for artists and freaks.

It was a machine with a personality. Over time, it acquired a soul after Jobs' hammer-lock on hardware design was yanked away. Except for Extensions conflicts before OS X. But we won't go there...Jobs' "second coming" swept away the Old Order.

Jobs is a man with Olympian ambition and an insanely great idea or two: his ideal factory would take trucks of sand in one end and ship out computers at the other end. He wants to own the whole system. Henry Ford was smiling from plutocrat-heaven when he looked down at St. Steve.  Tim Wu understands this well in his book The Master Switch. Wu takes some well aimed swipes at Jobs.

Sorry, Steve. I really adore your OS, but I'm thinking of backing Google with an Android purchase. And--horrors--I think I'll be playing Mass Effect on an Xbox 360.

Confession #2: I'm going to buy that in several months, mostly to play the latest Mass Effect.  Let's give the Devil his due; Microsoft kept their corporate wet-blanket culture off the gaming division, and out of that we got the Kinect.

Trouble for Grandpa's Box

My "Nerd-Night" tabletop RPG group consists of one Mac-guy (me) and a bunch of Windows-based gamers who get away from their MMOs to go "old school" and roll some d20s. Some of them own console games, but not a one uses a smart phone or tablet.

They are dinosaurs as surely as I am. They don't get why the next generation of computer users are eschewing desktop systems for portable devices. I, on the other hand, get it.

Millennials want to always be in touch with their hive. They need self-assurance and confirmation of their choices, and they do their social planning on the fly. That's impossible from a tethered desktop or even a laptop. I've yet to see more than a scattering of students use a laptop outside, as university promotional photos often show. Instead, they compute as they walk across campus. Give them data glasses that look just like sunglasses, and they'd use them too. Just don't make them stop, even for a nanosecond.

I recently had an epiphany that the wider culture is also getting it when I saw this Scott Adams' cartoon.

image credit: Dilbert.com

My buddies and I are Dilbert. My students, the young fella.

I don't know how Apple's closed system will fare against the Android OS from Google. Poorly, I secretly hope. But as a Microsoft-hater, I am also pleased that Windows will be the biggest loser of all. The company, except for its Kinect, has been no innovator in recent years. Was it ever? Steve Ballmer has the cool factor of Dilbert.

Whatever our desktop OS, I think we old timers with our grandpa boxes will look back at the System Wars of the 1980s and 90s with nostalgia. Like many other hobbies I embrace, from model-building to boardgames about World War II, the rest of society has moved on and I'm in this eddy of forgotten time.

I see a future in which content creators will use powerful computers in some form. The rest of the public--the consumers--will want to be close to the Machine: machines so easy to use that they are ubiquitous, a part of our bodies.

It's what Sherry Turkle of MIT has called "always on, always on you" technology. Get ready for it.

But I still don't own a cell phone...my "dumb phone," a pay-as-you-go model, expired in April.  It won't be missed.


Saturday, January 29, 2011

Living in Gibson's Future? Close Enough to Be Scary.

Location: Midpoint of Boston-Atlanta Metropolitan Axis
image credit: Barclay Shaw

The students in my first-year seminar may not understand, for a book published before they were even born, how William Gibson's Neuromancer not only shook up the genre of science fiction but also helped spawn a revolution in how those building the Web and its applications regard information.

When I created a Second Life account in 2007, I left it an open question, in the "first life" tab for the avatar, as to whether Gibson's future would be realized in SL. Now I have the feeling that my question needed a broader scope: has Gibson's future been realized in both the worlds of, to use his dichotomy, meat and mind? If anything, SL tries hard to be Gibsonian but fails: no one is as immersed as Case is in the novel, nor, for that matter, Molly and Armitage in their own part of futurity.

Mostly, however, Gibson got things close enough to scare me. As Egypt's government shuts down Internet access and cell-phone services, I recognize an eerie parallel to Wu's idea of a Master Switch that did not exist in 1984, when Neuromancer appeared. Gibson envisioned a more open Net than we have or that Wu believes we might get, but Gibson conceded that the "spiral arms of military systems" (52) would be forever beyond the reach of loners like Case.  They still are.

Yet using free public software, a million disenfranchised loners have united to topple Egypt's government, and at the time of writing this they may get their wish.  The keepers of those spiral arms, the Egyptian government, like their counterparts in Iran a few years back, know that their foes need cell telephony and Facebook to organize.

The rest of the world can only sit back and watch the chaos and, in Saudi Arabia, the government may be patting itself on the back for shutting out Facebook not so long ago.

But if Gibson were actually prophetic, then other aspects of his future should be closer to our door than the streets of Cairo.

Zaibatsus and Arcologies

Put a Fuller Dome over many exurban gated communities in the US and Western Europe and you'd have the residential component of Neuromancer's way of life for the megacorporations: happy corporate employees going to their desks and conference rooms to plan and build a world to maximize a company's return on investment. Shopping and entertainment are only at a small remove, along the suburban sprawl, from the home cocoons. Only cheap oil has meant that workers drive to various destinations, instead of living and working and shopping together in the same place.


What would an arcology look like? There is one very Gibsonian building at the Web site of Yanko Design.  Called NOAH (New Orleans Arcology Habitat) it evokes the end-times of a next Big Storm and the sense of heroic futurism that the megacorps might project in a post-governmental world.  The comments on the proposal fascinate me: techno-dread and triumphalism characterize the remarks.

Following the logic of governmental minimalists and libertarians in the United States, whose voices are loud these days, a future where self-defining communities evolve into self-governing ones may not be so far-fetched. At the consensual level, it may already be as close as one's neighborhood covenant.

But the coming of Corporate Man (and Woman) has concurrently meant an increased stratification of wealth in US communities. Bruce Sterling once quipped that the 1980s dawned not only as the first American decade that seemed like science fiction but also a return to a neo-Victorian world with social-Darwinist conditions for citizens. How did that vision, crystallized in Neuromancer, pan out?

What follows is less critical than it might seem. In fact, as I'll note, I'm not sure it's ever been otherwise, except in sheer scale. In 2007, the top 1% of American citizens controlled 43% of financial wealth and 35% of net worth. The title of the report where I gathered this, "Who Rules America," gave an ideological slant that made me a little doubtful about Professor Domhoff's lack of bias, so I found another source.

The US Census Bureau's report, "A Brief Look at Postwar U.S. Income Inequality," paints a picture not unlike Domhoff's:
The long-run increase in income inequality is related to changes in the Nation's labor market and its household composition. The wage distribution has become considerably more unequal with more highly skilled, trained, and educated workers at the top experiencing real wage gains and those at the bottom real wage losses.
There's more at the Census Web site. I become a little less convinced, then, that Domhoff is riding a hobby horse of leftist angst. One should pause before numbers like these from Domhoff: "Of all the new financial wealth created by the American economy in that 21-year-period, [1983-2004] fully 42% of it went to the top 1%."

Even so, the distribution of wealth before 1983 was not one of which I'm proud, given my belief that a large and politically active middle class of property owners, such as all those returning GIs in 1945, can be an excellent shield against the sort of governance (and lack of it) Gibson portrays. Citizens like the GIs who built the neighborhood where I now live, of modest ambitions and demeanor, have historically been eager to protect their families, and thus they've supported strong policing and orderly neighborhoods. Except for the turbulence of 1968, America has never had Egyptian-style chaos in the streets: Egypt has no middle class any longer, by all accounts.

Sadly, after 2008 and the financial meltdown, many more Americans have been thrown out of our middle class. Each foreclosure and failed company could take us a step closer to a neo-Victorian form of income inequality that characterizes Cyberpunk's view of the future.  And I doubt that all of the millions of dispossessed will leave their bank-seized homes peacefully. 2011 may be an interesting year.

Console Cowboys

But housing, like travel to Case, is a "meat thing." On the mind front, the lone cyber-cowboys of Gibson's world have not emerged. Wikileaks' revelations took a team of dedicated "hacktivists" to accomplish. At the same time, another aspect of current hacking comes straight from Neuromancer.

The cyber-attack on Iran's nuclear installations may have set back the nation's bomb-building program for some time. In a world like Gibson's, a zaibatsu might have stopped a rogue nation, given the failure of the US and Soviet governments after Operation Screaming Fist. In our actuality, Israel and the US, and who knows who else, might have launched the attack. Thus we've moved further along than Gibson's world, technologically.

The Soviet computer centers Armitage mentions had to be attacked with Special-Forces hacker-operatives sent in on microlight gliders. In our world, Iran's systems are tied to the global Internet. So are ours, and both China and the US are quietly jockeying, behind the scenes, to develop cyberwar teams. If our nations ever go to war, Internet services and all its dependent systems--power grids, air-traffic systems, banking--will become legitimate domestic targets.

Street Samurai

On the "meat" front, our Molly might be named "Mike."


Burly men doing legitimate (and, at times, dirty) deeds for whatever Blackwater is now called, and firms like it, do not quite have the sex appeal of Gibson's razorgirl.

At the same time, for this reader it would have been impossible in the early 80s to imagine Mercs, and that is what all "private security contractors" are, Mercenaries, enjoying the cultural acceptance they now enjoy. Many of my fellow citizens get a bit queasy on this subject, but for the past decade the US government has relied on Mercs to do jobs we do not wish to task to our armed forces or that require manpower we simply do not have in uniform without drafting college-aged kids.

In the 80s, except for Soldier of Fortune readers, the use of mercenaries by our government would have stirred widespread outrage. Now, if I may risk a generalization, with a 24/7 news cycle the attention of the public appears to move faster than it did 26 years ago, to the latest tragedy, scandal, or political circus. Meanwhile, private security companies have limited accountability to US or international law. They are true Gibsonian figures of the "interzone where art wasn't quite crime, crime not quite art" (44). Put in "law" for "art" and you have our situation.

Meanwhile, poor Molly. I don't think Gibson meant her to become a meme, but she has.

I'm glad we don't have razorgirl mercs walking the night cities of our world, but Molly can be found elsewhere: she's the badass sex-object of a hundred video games and action / adventure films. Hollywood understands male lust for these tough babes with guns and blades. Molly scares and attracts the boys, and I'm sure that more than a few male readers empathize with Case when he gets to enjoy her pleasures.

Naturally, one recent attempt to cast a film based upon the book would have featured Milla Jovovich as Molly: Resident Evil's killer star would get to put on the mirror eye-inserts. It's that way in any number of films or games where the strong woman is just as much a sex object as any gatefold playmate.  As my students will see when they get further into the book, there's that dark and disgusting desire out there to have a "meat puppet."  That technology is perhaps the most frightening of the book and, like organ-markets and stim-sim sets, it remains possible but not part of our lives.

We have Microsoft as a multinational, not microsofts that one puts into a skull-plug to learn a new skill. And the best we can do for Stim-Sim is the current 3D craze. Merge it with a Wii or Kinect, and we may approach something like watching through Tally Isham's eyes in her latest episode.

Better Living Through Bioengineering?

When students on campus trade Ritalin to do better on exams, yes, we live in a cyberpunk world. My students are afraid of our "street" and, whipped up the ladder by well-meaning parents and educators like me, do their best to avoid it whatever the emotional cost.

At least, if any of us do wind up on the streets with between two and three million American homeless, we do not have to cope with illegal dealers in pituitary glands or black clinics in the worst part of town, ready to roll us for our kidneys.


In scope if not depth, Gibson was spot on about the culture of body modification, when teens have elective cosmetic surgery and their parents get Botox regularly. A single earring was a novelty on a man in 1984, and potentially a dangerous one. Now tats and piercings are common.

A dermatologist, who helped my mother with a degenerative eye disorder that required a monthly Botox injection, told me he'd given up on really iffy surgery that could land him in court. Instead, his income came mostly from vanity, that ready fix of Botox to keep the bathroom mirror lying to his patients every morning.

Verdict: Close Enough to Scare Me

Image Source: Unknown

In the end, we don't, and probably never will, have orbiting colonies of Rastas or French spas run by a maniacal and inbred family. I doubt that our planet has enough fossil fuels that are cheaply available to sustain enough wealth to make dreams like Richard Branson's, of cheap space tourism, happen. If I'm wrong, I'll be too old to get the ride to orbit I've wanted since I saw a Gemini launch in the mid 1960s.

In other regards, however, I'm happy to say that Gibson got a great deal wrong. May it continue to be so, because what he predicted correctly has been frightening enough.  Ours is a harder-edged world than it was in 1984, when the Cold War raged on.  And the technologies of interconnection empower lots of individuals. Al Qaeda, not the Panther Moderns, uses the Internet, but the violence engendered is still the stuff of nightmare.

So, Bill, thanks for all of the bad dreams. You really invented a genre and, in some ways, a world.

Work Cited:

Gibson, William. Neuromancer. New York: Ace, 1984.

Monday, January 17, 2011

Apple Without Steve?

Location: Reading Tea Leaves

Apple's die-hard faithful probably let out a collective moan today. Steve Jobs' decision to take medical leave sent shivers not only through computer users but also through the entire stock market.

I wish Mr. Jobs a speedy recovery and a long life. While the man has been accused of draconian management practices, he did help usher in the personal computing revolution. Even if Steve Wozniak was the engineering genius behind the early Apple computers, Jobs' marketing brilliance made graphical user interfaces and the Cult of Apple happen.

It was a different sort of company from the upstart PC makers of the early 80s or the stumbling and arrogant behemoth that was IBM in that decade; if the competition consisted of geeks, Apple seemed run by artists and madmen of the sort I enjoy meeting for drinks.  I could imagine Apple's people chatting up the fire-breather and magician in San Francisco's Vesuvio, one of planet Earth's best bars, after shopping for poetry books at City Lights. IBM's folks would be (massive yawn) golfing with senators. Microsoft's people would show up at the bar and try to look hip, all the while looking like khaki-and-polo-shirted  tourists.

But I digress: my preference for bohemians over boring business and I.T. types is not news, but perhaps this is exactly how Apple fooled chumps like me into thinking the firm hip rather than merely a very clever marketing idea wrapped around some elegant and useful technology. It certainly snared cultural creatives as fanboys, who then marketed the company's products to friends. I'm sure I convinced a dozen folks to try the Mac OS over the years. Not a one screamed at me later, a testimony to how well our Macs did, and do, work.


More Than Macs: How Apple Did "Think Different"

The history of the firm relates quite well to ideas from my class on the history of Cyberspace. Jobs will be remembered, when he leaves this planet to compare notes with John D. Rockefeller and Howard Hughes, in one of two ways.

First, Apple's CEO might be what Tim Wu in The Master Switch calls a Defining Mogul.  That will be the case if Apple's closed system for handheld computing comes to dominate the future of Internet use.  You will get what Verizon, AT&T, and Apple decide you get to see or do. And I agree with Wu's well-supported "central contention....in the United States, it is industrial structure that determines the limits of free speech" (121). I suspect this revelation about the First Amendment applying to Congress, as in "Congress shall make no law..." and not to employers or ISPs, surprised my students as much as it did me.

On the other hand, if an open system dominates, Jobs might be recalled as a would-be monopolist who failed, despite a miraculous comeback in the late 1990s. I'm not sure which future I want. I used to fear Microsoft. I was hoping Macs would remain boutique systems for picky people like me. I also prefer German cars or pre-1973 Detroit Muscle. They are fancy fun, but they both run on our roads with boring cars. If my Mac could work with the sea of boring Windows boxes, what harm that?

Now, however, I am beginning to fear Apple.

My ownership of a handful of Mac systems since the dark years of the mid 90s shows how well they hold up. Our household iMac G5, now six years young, will soon enjoy a long retirement as our DVD player / media box for downloading films. Some child in Alaska is probably still using the G3 iMac I shipped up there, years ago, after an eBay auction.

Inside the latest iMac, however, there's a story that goes to the heart of Jobs' vision for Apple.  Unlike its predecessors, the new iMac's CPU has no user-serviceable parts aside from upgrading the RAM. This has long been Jobs' favored tactic. If the Apple II of Wozniak's day invited tinkering, the first Mac in 1984 became its antithesis: to open the case was to void the warranty!

Apple has played this push-pull game with its hardware since that time. As a tinkerer, I found this exasperating as my friends using Windows or Linux would build computers out of spare parts and make them do cool things.  Nowadays, however, we old geezers with desktop boxes are giving way to the iEverything generation, who carry their devices around incessantly and don't tinker. It's about the app, not the device, to these kids. And Jobs--who not only could recognize but also create markets--knew this best of all.  The Mac's established niche among school teachers and artists and graphics designers and video professionals and crazy professors is, after all, somewhat limited. Apple did not even try to top Dell's game in Henrico County, where the middle schools will give up their Macs soon for PCs, just as the high schools have done. Though the Macs last longer and are more robust when dropped by little kids, Dell offered a better price on service. Apple declined, I suppose, betting that kids with iPhones and iPads will eventually want Macs, anyhow. Or that iPads might replace laptops in schools, and with the right Verizon contract Apple would best anything Dell could offer.

The educational discount, I just noticed, for the iMac I am considering comes to all of 100 bucks. It was once twice that. Second Life folks...does this sound at all familiar?

This reality suits Steve Jobs fine: his portable devices, selling in volumes we Mac zealots could only dream of in the late 1990s, are closed in ways Jobs could never manage with the Macintosh in any form.  We'd pry them open.

Try that with an iPhone. Maybe you have better eyesight than I do. Besides, even if you do that, Apple's revenue-stream will switch from selling you a $2000 iMac (high end: I like my computers pimped) to selling you a $200 phone with a long-term contract that includes a cut for Apple with each App you buy and, I'm guessing, a bit of the monthly take that goes to the ISP carrying your data.

That's a sweet deal for all the corporations involved.

Consumers might not care, as long as the service is reliable, but as with AT&T before the 1984 breakup, Steve's firm can dictate which apps appear, or do not, on iOS. The Mac OS lets you write software; its relative openness compared to Windows, based on its UNIX underpinnings, has been great for those who had the skill to develop applications.

But What if Steve Loses (Again)?

Jobs lost the desktop market to Microsoft because he could not see what Bill Gates saw: that the software was the key to market saturation, even good-enough software. Most Windows users wanted the apps, and they cared little about the OS as long as it worked. I know that described me when I used Windows 3.1, 95, and 98 before making the Mac OS my primary system.

Could Jobs make a similar mistake today? As Google becomes more and more a competitor in the iPhone's and iPad's app market, Apple could repeat its 1980s blunders. At the same time, it has some advantages it never had then.  Moreover, might Google be forced into a more closed system? As I explained to my class today, Google must use the same broadband "tubes" (thanks, Sen. Stevens) to move data to one's computer or phone.

I'd feel better if I could get an Android OS tablet (what I'm waiting for) with service from any of a dozen ISPs. Perhaps if some new and disruptive wireless service emerges, offering FIOS-level speeds without a cable, we could have 100 Verizons or Comcasts vying for our telecom dollar.

But in my region, it will be AT&T and Verizon carrying the data. As long as principles of Net Neutrality continue to hold for these big telecoms, in theory iPad / Phone competitors could run most anything and undermine Apple's closed-but-reliable model for access to content online.

If that happens, someone will have stolen Apple's message from the famous "1984" advertisement. I offer it here in case my students have never seen it:


I'm hoping for this outcome, even if it means that in a few years, I'll lose my Macs as Apple fades away (again). Maybe for my main computer I'll be running an admittedly sleek Sony Vaio with a Google OS on board and I'll have lots of options, including synching data with a tablet or smart phone. Maybe I'd even make peace with the old bogeyman, Microsoft. Windows 7 is clunky compared to the current version of the Mac OS, but it is the first version of Windows in a long time that I find compelling and rather intuitive.  Perhaps on a really tricked out machine...

Either way, if I can open the computer's case and change the hard drive out or upgrade the processor, better still.

Friday, January 14, 2011

Big Visions, Little Visions, Second Life's Failure

Gone to his Head
Location: Hollering, at the Rebel Yell

I've written, at the VWER site, a long analysis of Second Life's failure to become the sort of disruptive technology that Philip Rosedale envisioned.  I'm thinking of Tim Wu's term for technologies that create entire new industries by building a system to replace older forms of communication.

Other than a smug image of my avatar and a Philip-Rosedale parody-bot at the Burn 2.0 arts event, I think--think--I kept my snark in check.

I hope my claims about SL are not mere sour grapes over the end of Richmond Island in SL and the departure of so many educators from that technology.  It's premature for any victors to write the histories, because educators are still using SL and will for some time, while OpenSim is very much a pioneer's environment. Moreover, Linden Lab might still recapture the niche market they are losing now and make their metaverse the standard-bearer.

That I doubt. Read the article and I'll explain why.

Wednesday, January 12, 2011

Wu's "Cycle" and Founding Moguls

Location: Using Google's System to Talk About It

This post begins a series to coincide with my Spring 2011 course, Cyberspace: History, Culture, Future.

We begin the class reading Tim Wu's The Master Switch: The Rise & Fall of Information Empires. I've loved Wu's narratives about the "defining moguls" in various industries. In my title I used the term "founding," which is not exactly apt. Such men may not found an industry, but they do bring it to its apotheosis.  It's not accidental that Howard Hughes, whose company defined modern aerospace, continues to inspire Hollywood enough to get The Aviator made and released.

Such figures used to be called "Titans of Industry" in prewar America.  Either way, the metaphor is apt, for Mogul emperors and pre-Olympian gods had one thing in common with these modern men: mere money has never been the primary goal or motivation. They are, Wu insists, "a special breed" among the alpha-males and (today) alpha-females of our technocracy (29).

The mogul or titan of old desired power and domination, and their modern counterpart, "like a man who tastes combat for the first time, discovered his natural aptitude for industrial warfare" (Wu 29). Small wonder that they so interest Nietzschean me. I admire such fellows even when I dislike them: Steve Jobs made possible the amazing MacBook on which I compose this post. He also has a scorched-earth style of management that would make Kublai Khan nod in approval. Moguls are often brutes (and the titan Saturn ate his own children to avoid being deposed by one of them).

Adolph Zukor's and Mark Zuckerberg's names roll off the tongue, the similarities in vowel-sounds mimicking the same will to power that these men brought to their innovations. Both Paramount and Facebook define a certain age of media. Competitors quake.

Then, of course, there is the fall of the Titans. In surveying the history of technology, it is common to find founders with Shakespearean flaws and, quite often, to trace how what began as heroic competition ended in complacency, decadence, even madness. Aviator and Aerospace-industry legend Howard Hughes stands before me, an obsessive old man so terrified of germs, some urban legends say, that he put empty Kleenex boxes on his feet as he shuffled around his penthouses in various Las Vegas hotels. Such is the fate of many with imperial visions.

If any Second Life readers are still with me, I think you know who I'm talking about.  For my students, the question becomes this: what happens when a founder tries to build a system out of a disruptive technology?

Zukor managed to corral and cow the independent filmmakers and theater-owners who successfully battled The Film Trust. Out of that victory, Zukor and the other winners built their system: we still use the word "Hollywood" to mean not just a place but a content-creator that is monolithic and quite often stodgy. Consider how every formulaic superhero film resembles the previous formulaic superhero film. We can thank Zukor and his peers for this.  But we can also thank them for some unforgettable moments in cinema: Ben Hur's chariot race, Dorothy's trip to Oz, A Streetcar Named Desire's heart-rending portrayal of Blanche DuBois by Vivien Leigh.

As for Facebook's Zuckerberg, it remains to be seen if his social network can become a fully fledged operating system for users, in the way that Google seems to be evolving. If so, we could have a ringside seat for a battle of systems, one that will influence our lives as much as the ascendancy of the Hollywood Studios, Microsoft's Windows OS, or Detroit's Big Three automakers.

Update Jan. 13: I've clarified Wu's term for these men and added a properly MLA-cited direct quotation from Wu to add emphasis to my point (and to demonstrate proper integration of sources to students). Note to students: the first citation of Wu does not need his name with the page number, since I had already made the source's name clear. The second citation does take his name with the page number because the source is not otherwise clear.  The same rules apply for paraphrase.

Work Cited:

Wu, Tim. The Master Switch: The Rise and Fall of Information Empires. New York: Knopf, 2010.