Showing posts with label Microsoft. Show all posts

Tuesday, May 7, 2013

Deformed: Virtual Worlds & 1970s Computing History

Mother of All Demos
Location: Watching Douglas Engelbart's "The Mother of all Demos"

image credit: New Media Consortium at Flickr

Many of us who have dabbled with virtual worlds have wondered how they could form a constellation of networked systems, in much the same way as the Internet's servers do today. Whatever the fate of this niche technology called virtual worlds, from the failure to run on mobile devices to the inherent boom-bust cycle of Second Life's particular brand, the road not taken always beckons.

There is an historical precedent here, and it's one that has a happy ending. Could the same be true for virtual worlds?

Today's Internet: Clarke's Law in Action

For a moment, consider the complex and delicate magic that occurs whenever we use the Internet. My university server talks fine to Google, for work such as the just-completed Usher project. Whenever the Outlook mail client randomly eats student file attachments sent to me, I smile. Ah, Microsoft's wonderfully Byzantine and wonderfully doomed technology, eating even its own Word files. Cue Apple and Google, as the Ottomans on the horizon, slowly gobbling up a once-mighty empire. Good riddance.

Then, because of the lack of monopoly that Microsoft coveted and almost got, I have the students try, try again with Gmail. Excede's servers send me the results and, once I type, transmit my thoughts--profound or inane--from home via satellite to Google. When I send notice of this post to interested folks at Twitter or Facebook, the servers hosting that data all "talk" to one another.

Types of Gardens and a World-Wide Web, 1975?

Compare that to virtual worlds technology, ostensibly part of the Internet since that is how we access it. Second Life, InWorldz, and many others that share core technologies could, in theory, speak to each other. Had development not branched off as it has done, Linden Lab and a few other grids might have pioneered a system for avatars and inventory to travel from world to world.  That happens with OpenSim Hypergridding, a technology that John Lester promoted, before his work for Reaction Grid turned to Jibe-based 3D worlds.  But "interoperability" died years ago at Linden Lab, and it seems unlikely to return.

It's curious, this set of walled gardens. If today's Internet resembles Borges' Garden of Forking Paths, virtual worlds resemble something else: the road taken in the 1970s toward personal computing.

I realized that while reading John Markoff's What the Dormouse Said, a history of how early computing was influenced by the American Counterculture:
When personal computing finally blossomed in Silicon Valley in the mid-seventies, it did so largely without the history and research that had gone before it. As a consequence, the personal-computer industry would be deformed for years, creating a world of isolated desktop boxes, in contrast to the communities of shared information that had been pioneered in the sixties and early seventies. (179)
The Internet did not begin with Al Gore, whatever he may have claimed. It did not begin in Jobs' family garage and with Steve Wozniak's brilliant hardware hacks. It did not begin at Xerox PARC with the Alto. The personal computer with a GUI and mouse? Yes, we can credit or blame Xerox and Apple for that.

But years before, nearly every element of the modern Internet would have been possible with the Augment system, developed under the leadership of Douglas Engelbart. Yet that development stalled and ended, a revolution stillborn.  I think we can see an analogue for what is going on, at this cultural moment, with user-generated virtual worlds.


Engelbart's Mouse

Want to see what might have been for the Internet? I am convinced that had something like Augment  been made less opaque for casual users, we'd have had an academic, and perhaps consumer Internet in 1975. Engelbart gave a show-stopper of a demo in 1968, with mouse, text-editing with clipboard and copy/paste, multiple files, graphics on screen, electronic mail, hyperlinks tagged to graphics, and remote visitors via a network.  You can see what he was doing with Augment at these videos from Stanford.

The reasons for Augment's failure are complex; Markoff's book does justice both to the creator's vision and his ultimate failure to produce a widely adopted product. What happened for consumers, however, was the emergence of walled gardens and proprietary systems from Apple, Microsoft, Digital, Tandy, and other competitors forgotten except by historians of technology.

When the Internet emerged, it came late to a culture of desktop boxes that could not, generally, talk to one another.

What if the personal computer revolution had begun with networking? And similarly...

What if Virtual Worlds Had Begun with Interoperability?

I'm writing an article about one group of USENET hobbyists who have made the jump to Facebook, because the old alt.* group proved too chaotic and full of spammers, trolls, and other bottom feeders. They also made the leap because, frankly, .jpgs and text import and export well between applications. Text did in Engelbart's day.

Little aside from these, plus Collada files and some other graphic formats, can move between different virtual worlds. Standards for inventories, for avatar meshes, and for "land" templates all differ. In this technology landscape, OpenSim grids serve as today's Augment. Managing a bunch of avatars and a region in OpenSim is hard to master, and in my experience not stable except in the hands of a pro, but it is interoperable. Running an entire campus-hosted grid would be lovely, but it's beyond my time or expertise to learn.

Other products with potential beyond SL's technology, such as Unity 3D and Jibe, produce elegant worlds, but they don't talk to other worlds, and expert designers need to craft the objects. They do offer vast potential, according to OpenSim pioneer Adam Frisby, for scaling, running on mobile devices, and improved graphics.

That sounds great until one considers faculty skills-sets and what it takes to build with Unity or Jibe. As noted  before in this blog, developing for these platforms may be within reach for architecture and engineering students, but at my university, it's challenge enough to get students to juggle multiple e-mail accounts and embed files from YouTube into their blog posts. We faculty lack time and incentives to do more with them, let alone learn 3D applications such as Maya or Blender.  Yet nearly all of us at my school have created content, mostly with text and images and sometimes digital video, and shared it on the Internet.

For all its limitations and toxicity as a brand, Second Life lets amateurs with a copy of Photoshop build easily. I'm told that Cloud Party does too, and I will soon try again with its latest build tools. Scripting remains something for those not faint of heart, and projects to make visual scripting tools, such as MIT's Scratch for SL, remain as stillborn as Augment.

What It Will Take to Be Disruptive

Here comes a sweeping generalization, and I'm ready to fall on this sword if some wise person can prove me wrong. Virtual worlds will never be a disruptive technology, in Tim Wu's sense of the term, until they become an interoperable and popular tool for everyday life, as the Web and e-mail have become.

Had virtual worlds begun with a series of collaborative academic ventures rooted in common standards, rather than a group of for-profit start-ups from The Valley, we might have that disruption and a 3D Web today.  Then the profits would follow, because in 1968, who could have foreseen eBay or Amazon or Facebook?

Right now, however, it's still 1968 and we've all seen the potential of a disruptive technology, as those who watched Engelbart's presentation did.

So today, who will build the 3D Web?


Saturday, February 2, 2013

The Phone Book's Here! The Phone Book's Here!

Location: Front Porch

Verizon's phone book landed on the porch a day or two ago, and only this morning did I get around to unwrapping the skinny little thing. It's like the Ghost of Phone Books Past, to this techno-Scrooge.

In another era, the arrival of the new edition would cause quite a flutter. Misplacing the phone book was also unthinkable. They were not easily or cheaply replaced, and my father guarded it like the Dead Sea Scrolls. In an otherwise chaotic Lebanese-American household, the phone book held court on the telephone table. Nowhere else.

Now phone books are incomplete and small. Verizon banishes residential listings entirely to a Web page, a crowning irony since most of the land-line stalwarts I know are so old that a broom provides the interface for web access in their homes.

Once, however, the phone book provided all sorts of diversions. I've three degrees of separation from musician Frank Zappa, but only one from his phone book. A grad-school friend named Rick, when living in Los Angeles, once had the job of delivering the huge directories to the homes of the famous and crazy in that star-studded town.

At one address, he was ringing the bell when a hippie "who looked like a madman" stuck his fuzzy head out of an upstairs window. "What the hell you want?" he challenged Rick.

"Phone books!" Rick replied, and the hippie's face brightened and he came running, shouting "Oh man! The phone book's here! The phone book's here!" Rick quickly found out he'd delivered Frank Zappa's phone book.  Lord knows what chemically enhanced games were played with the directory.  I know that my friends and I played geeky games of generating random names for role-playing games by picking, in the I-Ching manner, examples from the phone book and mixing examples from white and yellow pages. Names such as "Lorenzo Plumber" or "Scrap Metal O'Malley" resulted, to our geeky delight.

Now the I Ching is online, so I asked of it "What is the fate of the phone book?" With six casts of the stones, I received this answer:
"Waters difficult to keep within the Lake's banks: The Superior Person examines the nature of virtue and makes himself a standard that can be followed. Self-discipline brings success; but restraints too binding bring self-defeat."
History of technology there? Tim Wu's The Master Switch, the text just completed in my course on The History, Culture and Future of Cyberspace, is all about those who, like Bill Gates, establish standards for an entire industry, not just for themselves. Some geniuses such as Steve Jobs foresee needs we don't yet have, and they push others to and past the breaking point to make the vision real and the consumer's need materialize.

In the era of mobile technology, where most of my students carry area codes from faraway lands, there's no sense in a phone book. Discipline is needed, however; my students get so lost in a web of constant texts and other inputs that they do not give sufficient priority to e-mail about classes, and they suffer as a result.
 
So as we blunder forward, without much direction or a good directory, into this connected era, I will miss the nigh-sacred tome on the "telephone table" in my, and perhaps Frank Zappa's, dining room.


Or, perhaps, another room in his hippie mansion.

Wednesday, August 29, 2012

A Killer App That Killed iPhone for Me

Location: Verizon Store

image credit: screen capture from Google Streetview, on my campus near Westhampton Lake.


I'm not a gadget boy, but soon I'll make the shift from a flip-phone to a smart one.

And it won't be an Apple product. Why? Google Street View.

When Apple, for perfectly logical reasons, dropped Google Maps from their iOS 6 plans, they dropped me. I tend to be loyal--fanatically--to the Mac OS, and I'm a long-time hater of stodgy, backward Microsoft (Kinect and their fabulous two-button USB mice excepted).

As a tech user, there are some features that I refuse to abandon, and Street View is one of them. I have used it when traveling abroad to find a car-rental place or hotel that I'd never have found in the maze of York, England's medieval streets. Before Street View, I once walked to the wrong town looking for my car, just outside Bath. After Street View, I led a bunch of friends directly to the right restaurant in Istanbul, London, New York City, and San Francisco. My e-mail to Apple reads:
I am about to get my first-ever smart phone. My wife loves her iPhone 4, but without Street View being integrated into the OS tightly, Apple has lost me as a customer. I'm a 20+ year loyalist to the Mac OS, but I'm drifting away with iOS.

Just a word of advice to the geniuses at the Genius Bar: give me back Street View...now.
So yeah, it matters, Apple. I don't want to have to leap through three hoops to get Street View. I want it instantly, when I click on an address. The Phone app on the iPhone does that so well, with a "call" button in Google searches on Safari.

Instead of working with a competitor or launching a bunch of Apple vans to canvass and photograph the planet (how DID Google manage that?), Apple gives us a "flyover" view that won't help me find a restaurant or business from a human's-eye perspective.

Make my new phone an Android, please, Mr. Verizon Man. And if you show me a Windows Phone, my next post here will be written from jail after a headline reads "Professor arrested after stomping on crappy phone from stodgy company late to party where they never innovated anyway."

But Google? Bring  it on.


Friday, August 12, 2011

Grandpa's Box: My Thoughts on the Future of Our Computers

Location: Using Obsolescent Computing Tool

I don't know how many of my friends among Second Life's digerati are in contact with young people on a daily basis. I am, and I am stunned by how fast they are abandoning the personal computer.

The other day, I got a briefing from a Writing Consultant who works for me. She reviewed our Writing Center Web site and, while polite and praising the content, noted that the organization is all wrong for a student audience. "I no longer use my laptop," she admitted. "The iPad is my primary computer."

Much of our content, developed over many years and going through a vetting process with several bureaucratic levels, is all wrong.

It has happened with hyperdrive speed, this shift.  We are not an engineering or arts school: our students reflect typical affluent users.

Should I cheer at this funeral?

Iggy in the Confessional Booth, Before St. Steve


As a Mac-OS fanatic, I take no comfort in Apple's victory with portable devices. Windows-users, we are in the same boat, because the iPad is no Mac. Or Windows PC.

It's the anti-Mac, or better still, the final realization of Steve Jobs' dream of 1984: a sleek and closed-down platform with an elegant interface, but where one pays a price: Apple controls every damned thing. The iPhones and iPads are also hip examples of industrial design, just as the Mac of 1984 was no bulky (and sturdy) IBM PC of the sort I then owned. The IBM was for office-clones who had reluctantly given up their Selectric typewriters. The Mac was for artists and freaks.

It was a machine with a personality. Over time, it acquired a soul after Jobs' hammer-lock on hardware design was yanked away. Except for Extensions conflicts before OS X. But we won't go there...Jobs' "second coming" swept away the Old Order.

Jobs is a man with Olympian ambition and an insanely great idea or two: his ideal factory would take trucks of sand in one end and ship out computers at the other end. He wants to own the whole system. Henry Ford was smiling from plutocrat-heaven when he looked down at St. Steve.  Tim Wu understands this well in his book The Master Switch. Wu takes some well aimed swipes at Jobs.

Sorry, Steve. I really adore your OS, but I'm thinking of backing Google with an Android purchase. And--horrors--I think I'll be playing Mass Effect on an Xbox 360.

Confession #2: I'm going to buy that in several months, mostly to play the latest Mass Effect.  Let's give the Devil his due; Microsoft kept their corporate wet-blanket culture off the gaming division, and out of that we got the Kinect.

Trouble for Grandpa's Box

My "Nerd-Night" tabletop RPG group consists of one Mac-guy (me) and a bunch of Windows-based gamers who get away from their MMOs to go "old school" and roll some d20s. Some of them own console games, but not a one uses a smart phone or tablet.

They are dinosaurs as surely as I am. They don't get why the next generation of computer users are eschewing desktop systems for portable devices. I, on the other hand, get it.

Millennials want to always be in touch with their hive. They need self-assurance and confirmation of their choices, and they do their social planning on the fly. That's impossible from a tethered desktop or even a laptop. I've yet to see more than a scattering of students use a laptop outside, as university promotional photos often show. Instead, they compute as they walk across campus. Give them data glasses that look just like sunglasses, and they'd use them too. Just don't make them stop, even for a nanosecond.

I recently had an epiphany that the wider culture is also getting it when I saw this Scott Adams cartoon.

My buddies and I are Dilbert. My students, the young fella.

I don't know how Apple's closed system will fare against the Android OS from Google. Poorly, I secretly hope. But as a Microsoft-hater, I am also pleased that Windows will be the biggest loser of all. The company, except for its Kinect, has been no innovator in recent years. Was it ever? Steve Ballmer has the cool-factor of Dilbert.

Whatever our desktop OS, I think we old timers with our grandpa boxes will look back at the System Wars of the 1980s and 90s with nostalgia. Like many other hobbies I embrace, from model-building to boardgames about World War II, the rest of society has moved on and I'm in this eddy of forgotten time.

I see a future in which content creators will use powerful computers in some form. The rest of the public--the consumers--will want to be close to the Machine. Machines so easy to use that they are ubiquitous, a part of our bodies.

It's what Sherry Turkle of MIT has called "always on, always on you" technology. Get ready for it.

But I still don't own a cell phone...my "dumb phone," a pay-as-you-go model, expired in April.  It won't be missed.