Showing posts with label cyberculture. Show all posts

Thursday, March 28, 2013

Historical Precedent: Mobile Computing & Our Unease?

Location: In front of a large screen

image credit: U Penn Library Exhibit, "John W. Mauchly and the Development of the ENIAC Computer"

It's a common complaint that any mention of virtual worlds has ebbed in the popular media, and one reason given has often been the shift to mobile devices and tiny screens. Certainly that describes my students' preferences for online devices: about 90% of the e-mail I get from students comes from their phones.

I have met stiff resistance from colleagues wedded to desktop and laptop computers when suggesting that we need to make mobile computing the focus for our efforts with virtual worlds and more. For some historical precedent about this, consider an argument put forward by John Markoff in What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry.  In this, Markoff clearly realizes, as Tim Wu did a few years later in The Master Switch, that some technologies overturn entire industries and ways of communicating:
Indeed, the hallmark of each generation of computing has been that its practitioners have resisted each subsequent shift in technology. Mainframes, minicomputers, PCs, PDAs--at the outset of each innovation, the old guard has fought a pitched battle against the upstarts, only to give in to the brutal realities of cost and performance.
As Moore's Law makes our hand-held devices more powerful, I suspect this will happen again. For the latest shift, it will mean that something the size of a smart phone will be our primary computing device on the go or, when attached to  virtual keyboards and easily accessed monitors, nearly everywhere else. Here's a picture from the year 2023:

You enter your office and look at something like a large-screen television hung on the wall above the desk. You speak a login keyword. The phone, linked to the global data-cloud, remains in your pocket as you begin to work, using gestures in the air while in range of the television's scanner.  Windows for e-mail, a spreadsheet, and a calendar appear and you move them around with your hands while you issue voice commands. To input text you simply speak, and the voice-recognition software in the phone translates this to text. You finish just before a face-to-face meeting with colleagues, and walk down the hall. In the conference room, there's another big television, and with voice alone, you begin to talk. The notes taken in your office appear on the wall.

I will be a very late-comer to mobile computing when I get a smart phone this fall. I don't fancy my iPad all that much, finding it most useful for quick browsing to, say, check the weather or read an e-book.  That may well change. For the longest time, Markoff notes, printing was one of the biggest hurdles for personal computing. When these puzzles get solved, such as providing big screens and input devices for mobile computing carried in a pocket, progress happens rapidly.

In a world with haptic and voice interfaces, as well as a robust data-cloud, we should get ready to say farewell to both desktop and laptop in fewer years than we might imagine. Then the students' gesture of neurotically clutching their smart phones will seem as antique as clutching a magical talisman.

Saturday, February 2, 2013

The Phone Book's Here! The Phone Book's Here!

Location: Front Porch

Verizon's phone book landed on the porch a day or two ago, and only this morning did I get around to unwrapping the skinny little thing. It's like the Ghost of Phone Books Past, to this techno-Scrooge.

In another era, the arrival of the new edition would cause quite a flutter. Misplacing the phone book was also unthinkable. They were not easily or cheaply replaced, and my father guarded it like the Dead Sea Scrolls. In an otherwise chaotic Lebanese-American household, the phone book held court on the telephone table. Nowhere else.

Now phone books are incomplete and small. Verizon banishes residential listings entirely to a Web page, a crowning irony since most of the land-line stalwarts I know are so old that a broom provides the interface for web access in their homes.

Once, however, the phone book provided all sorts of diversions. I've three degrees of separation from musician Frank Zappa, but only one from his phone book. A grad-school friend named Rick, when living in Los Angeles, once had the job of delivering the huge directories to the homes of the famous and crazy in that star-studded town.

At one address, he was ringing the bell when a hippie "who looked like a madman" stuck his fuzzy head out of an upstairs window. "What the hell you want?" he challenged Rick.

"Phone books!" Rick replied, and the hippie's face brightened and he came running, shouting "Oh man! The phone book's here! The phone book's here!" Rick quickly found out he'd delivered Frank Zappa's phone book.  Lord knows what chemically enhanced games were played with the directory.  I know that my friends and I played geeky games of generating random names for role-playing games by picking, in the I-Ching manner, examples from the phone book and mixing examples from white and yellow pages. Names such as "Lorenzo Plumber" or "Scrap Metal O'Malley" resulted, to our geeky delight.
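For the curious, our phone-book name game is easy to sketch in a few lines of Python. The directory entries below are invented stand-ins for real white- and yellow-pages listings, not anything from an actual directory:

```python
import random

# A playful sketch of the phone-book name game described above; the
# sample "white pages" and "yellow pages" entries are invented stand-ins.
WHITE_PAGES = ["Lorenzo", "Esther", "O'Malley", "Kowalski"]
YELLOW_PAGES = ["Scrap Metal", "Plumber", "Locksmith", "Taxidermy"]

def random_rpg_name(rng=random):
    """Mix a residential-listing name with a trade heading, I-Ching style."""
    if rng.random() < 0.5:
        # "Lorenzo Plumber": a first name plus a yellow-pages trade
        return f"{rng.choice(WHITE_PAGES)} {rng.choice(YELLOW_PAGES)}"
    # "Scrap Metal O'Malley": a trade plus a white-pages surname
    return f"{rng.choice(YELLOW_PAGES)} {rng.choice(WHITE_PAGES)}"

print(random_rpg_name())
```

Run it a few times and the geeky delight returns quickly.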

Now the I Ching is online, so I asked of it "What is the fate of the phone book?" With six casts of the stones, I received this answer:
"Waters difficult to keep within the Lake's banks: The Superior Person examines the nature of virtue and makes himself a standard that can be followed. Self-discipline brings success; but restraints too binding bring self-defeat."
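I can't say which online oracle produced that reading, but the traditional three-coin casting it imitates is simple enough to sketch in Python, purely as a curiosity:

```python
import random

def cast_line(rng=random):
    """One hexagram line by the traditional three-coin method:
    heads counts 3, tails counts 2, so each cast totals 6, 7, 8, or 9."""
    return sum(rng.choice((2, 3)) for _ in range(3))

def cast_hexagram(rng=random):
    """Six casts build a hexagram, bottom line first."""
    return [cast_line(rng) for _ in range(6)]

lines = cast_hexagram()
for total in reversed(lines):  # hexagrams are read from the bottom up
    if total in (7, 9):        # odd totals are solid (yang) lines
        print("-------")
    else:                      # even totals are broken (yin) lines
        print("--- ---")
```

Interpreting the resulting hexagram, of course, the computer leaves to you.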
History of technology there? Tim Wu's The Master Switch, the text just completed in my course on The History, Culture and Future of Cyberspace, is all about those, like Bill Gates, who establish standards for an entire industry, not just for themselves. Some geniuses such as Steve Jobs foresee needs we don't yet have, and they push others to and past the breaking point to make the vision real and the consumer's need materialize.

In the era of mobile technology, where most of my students carry area codes from faraway lands, there's no sense in a phone book. Discipline is needed, however; my students get so lost in a web of constant texts and other inputs that they do not give sufficient priority to e-mail about classes, and they suffer as a result.
 
So as we blunder forward, without much direction or a good directory, into this connected era, I will miss the nigh-sacred tome on the "telephone table" in my, and perhaps Frank Zappa's, dining room. 


Or, perhaps, another room in his hippie mansion.

Thursday, June 14, 2012

Why We Keep Inventing the Apocalypse

Location: Perfect June Day, Not a Zombie in Sight

What follows are a few reflections culled from a longer non-fiction piece I hope to publish in Richmond's alternative weekly, Style. Lately, I've been thinking of our nearly pornographic interest in the End Times. I've noticed over the years in Second Life how popular post-apocalyptic settings have been. They are also very popular in games. And while I've not read scholarship on this subject, I wonder about the continuing popularity among college kids of Cormac McCarthy's The Road in both book and film (see image above) versions.

I tend to prefer stories of rebuilding and survival, such as James Howard Kunstler's "World Made By Hand" novels or the second Mad Max film The Road Warrior. These sorts of invented worlds are a minority: of contemporary doomsday TV series, it seems that only "Revolution" is about the urge to remake the world after it falls apart.

So what makes these bleak futures the current staple of Hollywood, computer gaming, and so much of printed science fiction?  With series such as "The Walking Dead," a sub-genre of SF has gone mainstream. It's a recent phenomenon, too. In my copy of Brave New Worlds: The Oxford Dictionary of Science Fiction, editor Jeff Prucher traces the earliest coinages of the terms “post-apocalyptic,” “post-catastrophe,” and “post-holocaust.” Not by accident do those neologisms parallel perceptions that America had entered a gradual decline. As the Rust Belt shed jobs until Detroit very much resembled a set from “The Walking Dead,” I find it curious that instead of the positive escapism that something like “Star Trek” offers, we went dark and largely have stayed there. Even George Lucas got bleak in his trilogy of prequels, a story bleaker than even the acting of Hayden Christensen as the young Darth Vader.

That's as far as I've gotten. Do post-apocalyptic settings give us freedom we lack in day-to-day life? Let us imagine a clean slate and a new start? Or are they just fun as hell, McCarthy's jet-black work excepted?

Wednesday, April 18, 2012

How Virtual Worlds Changed a Bunch o' Educators


Location: VWER Meeting

I want to pull out a few choice remarks from a March Roundtable, where moderator Chris Robinson posed this provocative question by VWER member Birdie Newcomb:

How have you changed since using virtual worlds?

Beyond how some of us get called our SL names in real life, here are some reflections:
  • AJ Brooks: well – for me, I have a career because of it. I’m a published author, a seasoned presenter
  • Birdie Newcomb: I’m much more sociable here than in RL, though one influences the other
  • Frankie Antonelli: in my case, vws make me feel more connected as I don’t have to [travel] for annual conferences, better yet, I can continue conversations after the conference
  • Liana Hubbenfluff: well, I was exposed to other cultures I’d never thought I’d be hanging out with
  • Merlin Moonshadow: My experience in SL has made me more outgoing and more confident, brought out leadership skills that I never knew I had, stimulated my thinking and my creativity, and was a big factor in deciding to go back to graduate school.
  • Samantha Chester (sam55.chester): well vws have allowed me to collaborate with educators from all over the world and be exposed to ideas and practices I would not have in [real life] 
  • Pathfinder Lester: in my experience, I think everyone who spends a good amount of time in virtual worlds is fundamentally *changed* in how they perceive the world and people (including themselves). . . .I think that’s why the community of people using virtual worlds has a lot of binding energy…..not just because we share experiences…but because we all share a changed perspective on things.
Readers can surf over to the entire transcript here.

I think that Dan Holt puts one aspect of these changes very well, and he is supported by research into online writing communities from scholars like Cindy Selfe, Gail Hawisher, and many others who publish in Computers and Composition. Cynthia Haynes and Jan Rune Holmevik explored the expansion of self enabled by MOOs, MUDs, and similar environments in their book High Wired.

Namely, Dan notes "Much of what you all say about expanding your community takes place as well with older tech like listservs, but the sense of being with other people is magnified tenfold when taking place within a 3D environment."

That's where I hope we'll have more discussions. There's something different about embodiment that is hard to pin down. Dan gets at it with his remark that "To me, [Facebook] really is only valuable for keeping loosely in touch with people I’ve known in the past. A VW is much more amenable to meeting with others you may never see in RL." Some VWERs disagreed, but for me that has been the Facebook vs. VW experience.

I'll also stand by what I said in response to Dan and a few others, "But all identity is constructed, says the lit-crit boy with the dreads and tophat."  Pathfinder Lester pointed us to the always savvy Peter Miller's "The Affective Context in Immersive Learning," and such good academic work may point the way forward to other studies of 3D applications and how they change us.

The pragmatic and snarky Claudia Rossini wins the commentary-contest award for her remark about SL and appearance, "the sexier cooler version is a desire formed by ads and tv, making fools think they need to be something they are not in order to be worthwhile."

Fools and their Linden Dollars are soon parted? I suppose that's why I have so many of those mustaches and beards now! But then as I observed, in a moment right out of James Howard Kunstler, "SL cultivates a sense of the surreality of RL…I see more and more RL people as being like avatars…no more than inventory, good hair, and a dream home."

We are more than our hair, car, and house in either life. At least I hope so.

Tuesday, February 28, 2012

A Literary Close-Reading Of Oz Linden's Post on Third-Party Viewers

Location: Faculty Office, Already Nodding Off

Hamlet Au provoked his readers to crowdsource the meaning of Oz Linden's post on Third-Party Viewers. I am paid, in part, to teach students the art of literary exegesis. I have a spare 30 minutes on my hands...so here we go!

If you find this soporific, so much the better: we all need a good nap from time to time.
  • Oz begins with a moment of uncertainty, purportedly with his recording equipment: This resembles the existential ambiguity faced by Vladimir and Estragon, at the start of Beckett's Waiting for Godot. At the same time, this builds a dramatic tension essential to any literary work.

    Thus, even thirty seconds into the recording, we have an existentialist text, though we lack Camus' dead Arab on a beach or Bowles' tongueless madmen, wandering the Sahara.
  • "Four new clauses" added to the policy: this is classic theater. We have a play in four acts, but it remains to be seen if the drama will be tragic or comic in nature. Farce may be likely, given the history of earlier texts from Linden Lab.
  • That Linden Lab no longer distributes the Snowglobe viewer:  clearly a reference to Welles' Citizen Kane. In this landmark film, at a crucial moment, namely, his death, Charles Foster Kane drops a snow globe, redolent with memories of his childhood home that he lost, along with his innocence. He then went on to become a titan of his era's information industry.  Could the loss of this Snowglobe be a parallel moment for the utopian and youthful Lab, as new products take it into fame? Or portend a titanic fall?
  • LSL will only return true presence data: This marks a curious turn to New Critical hermeneutics, as popularized by the Well-Wrought-Urn school of scholars such as the Nashville Fugitives. One must not seek outside the object of art to find meaning. One must only consider the art itself.  While such a stance is consonant with Second Life as a walled garden, this reference could, ironically, foreground the inherent contradictions of Oz's message.

    Postmodern hermeneutics teaches us, however, that all works self-destruct. In Oz's case, ambiguity has mingled with "true presence." This, as I am about to show, descends into the realm of horror.

  • "It is a different bug" and "if it is not fixed, we will deal with it as a bug": We are back to Existentialism again, but with a surreal turn as we venture into Kafka's "The Metamorphosis." But who is to be the hapless Gregor Samsa? Perhaps SL educators, since we have been neglected or squashed, like bugs.
  • "The User can say anything they want...the viewer cannot do it for them." This is classic Reader-Response Theory at work. Oz notes that the user, like the reader, can make interpretations and statements not contingent on the text, or in this case, the viewer. Each reader must struggle on his or her lonely road to finding meaning, if any can be found, in the text. 
Thus, and in conclusion, Oz's text contains no hidden message identical for each of us.
At this point, I began to drool and fell heavily on my keyboard.

Rosebud.

Friday, January 13, 2012

US Citizens: Time to Act on SOPA and PIPA Legislation

Location: Real State of Concern

If you think the Digital Millennium Copyright Act hurt the Internet as we know it, I urge you to look at what the Electronic Frontier Foundation has to say about two bills currently before Congress: PIPA and SOPA.

Read more and write your legislators through the EFF here. This is sad and serious business: user-generated content, as we all know it, could just vanish.

Sunday, June 26, 2011

Treasures of an Arcane and Monastic Craft: Scholarly Editing

image credit: Encyclopedia Virginia site, University of Virginia. Fredson Bowers, standing, works with Matthew Bruccoli  at a Hinman Collator.

This entry began as a reply to a recent post by Tateru Nino about the preference for physical or e-texts.

Somehow, I stupidly fumbled my posting and it vanished into pixeldust. That is, frankly, an apt beginning for this post, where I state my preference for reading physical text. Now, with a few days behind me, I've had more time to think about why.

For a brief time in graduate school, I considered heeding the siren-song of textual editing, a small but valued part of academic publishing. My mentor and PhD advisor at Indiana, Professor Don Cook, was a noted scholar in this field. Along with a band of like-minded editors from universities, Don worked on editions of the works of William Dean Howells and other American authors. Granted by the Modern Language Association's Committee on Scholarly Editions, the CSE imprimatur of "an approved edition" meant that these books were felt to represent the best possible edition available.  By comparing a proposed edition to an author's corrected manuscript or perhaps a first edition, editors would make the new text embody, as closely as possible, the words and arrangement the author wanted.

Debates raged about what to do when someone like Scott Fitzgerald rearranged Tender is the Night completely, making a new version that many readers detest. Which text do we follow? Scribner's first edition or Fitzgerald's rewriting? When new editions appeared and an author assisted, how to tell which changes reflect an author's work, and which that of a lazy typesetter?

A Life Defined by Books and Bookishness

In the end, I chose another path. I am an awful and easily bored proofreader, racing on to the next paragraph and, only later, coming back to tidy up the mess I have made. Yet the crystalline purity of a CSE edition has always been a strong lure as I've built my personal library over the years. I'm also the sort to hunt down a good hardback copy of a book I love and then donate the paperback I used for teaching or that was my first encounter with a life-altering text.   Let's see what those might be...pressed to answer right now, I'd cite Toole's A Confederacy of Dunces, Adams' The Education of Henry Adams, Abbey's Desert Solitaire, Cather's Death Comes for the Archbishop, Eco's The Name of the Rose, Wharton's The House of Mirth, Bowles' The Sheltering Sky, O'Connor's Wise Blood, Faulkner's The Hamlet, and anything by Dutch writer Cees Nooteboom.

Not a one, except Nooteboom's shorter travel essays, would feel right on a screen. I like my "reader's" copies with my marginalia, as well as my second copies which I return to reread for pure pleasure. Most of the latter are hardbound. Blame this fetish on my work with scholarly editions.

Perhaps we can enter a world in which a fine printed edition can coexist, as a luxury item, alongside a reasonably correct e-text. Don Cook predicted that book collecting would become ever more of a niche activity but it would be enriched by computers. He envisioned luxury editions printed to order, for under $100, on great paper and perhaps with a choice of illustrations.

Monks in The Alderman Library

The work of scholarly editors continues electronically and without the bulky collator, but there was still a wonderfully Medieval feeling to the craft in the 1980s. Before tablets and dedicated e-readers became mass-market consumer items, I had the sense that this arcane craft was already waning, though its practitioners remained tenacious, rather like the monks in Miller's A Canticle for Leibowitz. Once in the Alderman I peered over the shoulders of a foreign ambassador and his wife as a curator of rare books showed them Faulkner's carefully typed and corrected (his handwriting was very precise) manuscript for The Sound and The Fury. I love Faulkner's work for two reasons: first, because he invented an entire fictional world; hence the relevance to this blog--Faulkner was a consummate world-maker. Second, he was the first writer who challenged me to read and read again, each encounter taking me deeper into a landscape that, while Southern, was nothing I knew. Yoknapatawpha County might as well have been Burroughs' Barsoom.

So seeing a Faulkner manuscript was like being shown the Shroud of Turin.

There's a romantic idea that projects involving such treasures transcend the "and now this!" culture of online communities. Neil Postman coined the term for the culture of television, but in fact it's truer than ever for the torrent of Tweets or Facebook status updates. In fact, the meditative work of comparing editions and manuscripts to remove the errors that creep in is an act of passive aggression against the very spaces, such as the one I use to compose this text.

It's with a bit of a shiver that I look at the photo of Bowers, a senior colleague of Don's, and Matthew Bruccoli, a contemporary of Don's for whom, many years ago, I wrote a few short bibliographic entries for a scholarly reference work. I know exactly the section in the Alderman where they are working.

I studied there as an undergrad, more than 30 years ago, and in 2002 went back to that part of the library to work on my own editing project of Antebellum Southern Humor, The Spirit of the Southern Frontier (have a look, but it's a bit of a ghost town). I was able to teach with the site twice, so I think the grant money was well spent, and the student who helped went on to her own PhD. There is a great feeling of "standing on the shoulders" of, if not giants, one's intellectual ancestors when working in such a place.

For a while at least, this aspect of monastic life continues. As our culture, the part that cares about serious reading anyway, passes from difficulty to difficulty, I think that we'll end up thankful to men and women who spent many quiet hours preparing good editions.

Monday, May 2, 2011

Digital Story: Stop Cyberbullying

Maddie's story was the people's choice in last week's competition.

I'd not followed this story and, despite a lifetime of being jaded, am amazed at human cruelty. There's some cold comfort that the perpetrators could get prison time for doing what they did, though even there the penalty may be less if a proposed settlement of the case becomes reality.

Whatever happens, on a day when some very different justice was done in Pakistan, I'm reminded that most of the time, evil gets its comeuppance.

Friday, February 18, 2011

A Look Ahead: Virtual Worlds in Aerospace & Defense

Virtual Worlds educators Roundtable 3 Feb 2011 
Location: VWER Meeting

image courtesy of Sheila Webber's flickr photostream

Back on Feb. 3, we were pleased to host Greg Moxness and Charles O’Connell, two technologists from a major US defense contractor, who spoke at some length about their predictions for virtual worlds entering the mainstream. They were not speaking in their roles as company employees, but they spoke knowledgeably about how technological advances might reshape 3D immersive environments.

I'll summarize some of their points below. You can read the entire transcript here.
  • Charles, on convincing coworkers of the value of virtual worlds, “Seed the young with ideas, soon become the decision makers or at least influencers–took about 4 years.”
  • Charles on developments to come “not sure military or defense is leading in this case. [Advances] more from commercial spaces, gaming and entertainment.”
  • Greg, on near-term advances: “the whole idea of gesture recognition and 3d worlds this could be this year or next”
  • Greg suspects we’ll see “full body haptics,” and Charles notes “Haptics–likely to be involved because it has such high value. [It's] never all or nothing. 2D and 3D will exist together….documents and spreadsheets along with 3D objects”
  • Greg on neural interfaces like those in Gibson’s Neuromancer: “[M]aybe a step too far. . .maybe 20-30 years but will the human become less and will the machines evolve?” Charles: “a key thing that might happen, if it can be done noninvasively, something outside the body that can monitor brain waves, nerve impulses.” (Iggy’s note to any student readers: from Anderson’s novel Feed, that is the early version of the Feed interface).
  • Greg agreed with the following remark by Charles, about the relative merits of 2D and 3D environments for training: “3D has immense possibilities, not an either/or question. Use 2D when better suited, or good enough. 3D [is for] experimentation or experiencing things not possible for some reason in RL.”
  • Greg on an advantage of virtual worlds, the need online for something approximating face-to-face contact. Charles notes his belief that “relationships are much stronger in VW.”
  • Charles also came out in favor of transparency in avatar identities (if not appearance) noting, “Treat people with respect, it’s a real place. One life, not two. It’s probably best to be yourself when dealing with others in VW.”
I look forward to their returning to the Roundtable in 2012.

Monday, February 14, 2011

Do My Students Need a 3D Web?

Raph Koster and old UO headline, Sony Online Entertainment, San Diego 
Location: Certainty

image credit: Raph Koster, of Ultima Online, Metaplace, and more, via Cory Doctorow's Flickr Photostream

"No" seems to be the uniform answer. The reasons say a great deal about the directions in which virtual worlds may not evolve. I put the question of "why haven't we gotten something like Gibson's immersive Matrix?" to my first-year seminar class.

I'll paraphrase the answers that came back:
  • Immersive engagement is best saved for when it is worth the extra work / software / time
  • Students prefer easy applications done "on the fly." In other words, they don't need an avatar to check the weather or send a short text to a friend or a relative
  • The less hardware needed, the better. Any rig like Case's would be tedious to use and hard to carry. An iPhone or similar fits into a pocket.
Would my students use a 3D experience? The answers here are complex. Yes, this group argued, for immersive gaming.  I don't know that current levels of virtual-world technology, with so much user-generated content, will ever enable that level of immersion. At best, they might make work for a class more fun.

We should look to other types of game-environments if we want something akin to Simstim or Case's rig. As I'll report soon, two technologists from a major defense contractor who spoke to VWER recently argued exactly that.

Will those emergent forms of 3D engagement replace our 2D Web? If my students are correct, no. It would, however, open worlds for gaming and for meetings, an ironic realization of Castronova's thesis that work and play will merge in the decades ahead. 

Tuesday, February 8, 2011

Communications Self-Analysis: A Blogger's Day

Location: Doing That Blog Thing

My students in the first-year seminar are spending one day tracking how they use a form of collaborative communications technology: something like Facebook, texting, telephony, or even...blogs. I told them that "fair is fair, so I'll track one of my days, too." The interesting aspect of this post is that I don't know what I'll find. How much time do I spend reading and writing blogs?


This is not me by any stretch. I'm reminded of the American version of The Office, where boss Michael Scott tries his best to be relevant. And he fails.

Yes, sometimes blogging does seem like a massive waste of time, doesn't it? Yet the technology also offers students an easy-to-master way to create multimedia projects.  It's a shift I'd argue that faculty must make in higher education, in order to remain relevant (or merely employed) in a time when state employees, including faculty, will increasingly be called to task to justify their work. I doubt that those of us working for private institutions will fare better, if we get a reputation for not using the literacy tools that are common beyond the gates of our cloistered campuses.

So how does someone who blogs spend the day? Here's my timeline for a day when I have some free time.
  • 8 am: checked the following virtual-world blogs: New World Notes, Dwell on It, and (non-guilty lowbrow pleasure) The Alphaville Herald.  No comments made.
  • 8:15 am: Checked my blog list here at Blogspot's dashboard. Read Dio's new post at The Ephemeral Frontier. I noted that she uses the term "meatspace" to describe the real-life profession of a boat-builder in virtual worlds. More echoes of Gibson.  Left two short comments for her. Read a much-deserved pan for the awful "Spider Man" musical at the NY Times. Left a sneer of my own about how stupid popular entertainment has gotten. It seems that many stories at the Times are merging with their blogs. Will there even be a difference in a few years? Blog-review done by 8:30 and I began this post. Time for my daily writing for me (not on a blog) and then, off to work!
  • 11:00 am: While riding the bus, I finish reading an article in Cees Nooteboom's wonderful anthology about travel, Nomad's Hotel. How on earth could Nooteboom do that in a blog? I decide he could not. He is such a talented and subtle writer. I hope he wins a Nobel Prize before he leaves this world.
  • 1:30 pm: After checking e-mail (and answering some blog-related questions from class!) I return to student blogs after a couple of days' absence. The rewrites of the first semi-formal project look promising, and I find myself reading all the other posts, since the students expect an estimate of participation grades.
  • 2:00 pm: I avoid the temptation of posting a comment to New World Notes, since I don't think I have the facts straight on a copyright issue from the early 1980s and don't want to look misinformed. Back to student blogs!
  • 5:30 pm: Getting ready for a night class, I could not resist leaving another remark at New World Notes.
What have I learned from tracking a day of blogging? The technology has kept me in a web of contacts, be they students, writers of other blogs, or readers who comment here. That was not easily possible for a writer before this technology blossomed.

I'm mindful now of Hannah Arendt's quotation, “For excellence, the presence of others is always required.” My work as a blogger or teacher is not necessarily excellent, but working with and in response to others has sharpened my skills and blunted any delusions--deserved or not--of excellence.

These technologies should put the lie to anyone who claims that writers work in garrets these days.

Monday, February 7, 2011

Battlestar Galactica in Second Life: Fair Use Permitted

Hearbreaker's New CO
Location: Viper Cockpit (I wish)

hat tip to Hiro Pendragon, for announcing this on the SLED list

image credit: Syr Villota's Flickr Photostream


Some firms understand that an enthusiastic group of customers can extend their brand, if those customers are permitted to play with intellectual property.

That's the case today with NBC/Universal, who had previously sought cease-and-desist orders against Battlestar Galactica roleplayers in Second Life. Now, according to the story in ArchVirtual, fans can again share BSG materials for non-commercial use.  I wish the company had done so earlier. A few Galactica-themed sims have now closed.

There's a long-standing precedent for letting fans just be fans and play: Paramount long permitted Star Trek fans to create derivative works. With the board game Star Fleet Battles, an old chestnut that is still around (I gamed it 30 years ago!), Paramount even granted permission for a commercial product based upon elements in the original Star Trek series.

One could easily argue that these fans kept the ST franchise alive long after the cancellation of the original series. Now that BSG's run on TV has ended, for a time in any case, perhaps that franchise too will continue to thrive through the work of fans.

I do hope that Frank Herbert's estate and others who have slapped around fan communities might actually talk to those fans, before crying havoc and letting slip the dogs of law.

Wednesday, February 2, 2011

Neuromancer Thoughts: Both More and Less Than People

Hangars Liquides
Location: Scarlet Tiers of The Eastern Seaboard Fission Authority


Image credit: "Hangars Liquides" by Ka Rasmuson at Flickr

Finishing Neuromancer for perhaps the fifth or sixth time, I am still stuck with a question that I put to my class earlier today:

What would motivate you to merge with a machine? To put in a neural implant so you could interface with data as surely as Case does?

Gibson projected doing that with electrodes glued to our scalps, something that seems as quaint to me today as all of his mentions of magnetic tape in what may be the year 2030.  We won't need electrodes if we ever do develop a brain-hardware interface: we know a great deal more today about neuroscience than we did in 1984. Over at New World Notes, Wagner James Au occasionally reports on interfaces that permit the blind or paralyzed to manipulate data.  In a silly way, his recent post on a novel use of Xbox Kinect shows that the drive to merge meat and mind online hasn't abated.

I suspect we will make the technological, neurological, and moral leap one day to do far more. On a bad day, when I'm very tired, I think "well, I'm glad I won't live that long." On better days, I hope to try something like that, if only as a "tourist."

On the other hand, there's a danger with any sufficiently advanced technology: it might make us think we are gods who work magic. That's the dark corollary of the third of Clarke's Laws for you.

I don't know if my students, many of them having had their heads spun round by this important and very confusing book, understand that this novel reaches for a big theme. Gibson wants us to ponder a few things, it seems:
  • What is "human"? 
  • What do we lose as we gain power through cybernetic prostheses?
  • Would we take the chance to become immortal if we could? Would we dare NOT take it?
Near the end of the novel, Case refuses Neuromancer's offer of immortality, a space where Linda Lee still lives, or thinks she lives, in what amounts to an event horizon inside the AI's self.

We may all live to know if such an offer awaits us. Before writing this post, I never realized that Arthur C. Clarke postulated three laws. The second is worth noting here:

"The only way of discovering the limits of the possible is to venture a little way past them into the impossible."

Gibson's fiction, always venturing past those limits, will retain its cultural significance as the rest of us follow in his wake.

Monday, January 31, 2011

Four Years in Second Life. What Now?

Location: Montclair State University Virtual Campus

I've been setting out virtual furniture for a group of technologists from a major US aerospace & defense contractor who will be speaking at the next VWER meeting. Their firm is doing some amazing things with virtual worlds, mostly in OpenSim but also with technology that William Gibson might have predicted in Neuromancer: exoskeletons, VR rigs for antiterrorist training, and more.

Meanwhile, consumer-level VR interfaces make halting but continued progress. The Wii and Kinect are first steps toward a future in which such technology could meet changing norms about being in virtual spaces. Even partial immersion "creeps out" too many of my peers and students. But what if, as Edward Castronova claims in Exodus to the Virtual World, norms slowly begin to change? Then we might experience something like Gibson's Matrix or Stephenson's Metaverse.

I don't plan to be in-world 24/7, but I'd like the option to be a tourist in such spaces, from time to time.

Linden Lab has been an early pioneer here, and though I've said intemperate things about them in the past couple of years, it's been more out of disappointment for what might have been. Perhaps we'd have all been better off without the media-storm in 2006 and 2007. It made us dream too big too soon. Now, however, that early optimism has changed to a balance of weariness and hard work. My colleagues are divided on the future of 3D immersive worlds. Some suspect they'll remain a niche forever. Others, citing trends among younger Millennials, claim they will make the revolution that builds a 3D Web happen.

I hope so. Though I like the older Millennials I teach, they are too serious and career-driven to start a revolution, and they lack enough experience with open-ended play to imagine one. They don't turn off their hive of social networking long enough to look deeply inside. They are always in a hurry. But they are nice kids. I worry about them when they get older and life throws them some curve-balls.

I hope their younger siblings show us all a few new tricks.

Whatever happens, as I look back over four years, my colleagues, friends, and I have been pioneers on the edge of what is possible with our computers. I don't feel like an SL newbie any more.

Are SL years like dog years? I blew through my rezzday without realizing it. I'm sorta over that.

Am I feeling charitable, or could LL still be masters of this new reality? If not, they sure as hell gave us a good ride at times. I hope the new CEO can keep the torch lit and the ride will continue.

Saturday, January 29, 2011

Living in Gibson's Future? Close Enough to Be Scary.

Location: Midpoint of Boston-Atlanta Metropolitan Axis
image credit: Barclay Shaw

The students in my first-year seminar may not understand how William Gibson's Neuromancer, published before they were even born, not only shook up the genre of science fiction but also helped spawn a revolution in how those building the Web and its applications regard information.

When I created a Second Life account in 2007, I left it an open question, in the "first life" tab for the avatar, as to whether Gibson's future would be realized in SL. Now I have the feeling that my question needed a broader scope: has Gibson's future been realized in both the worlds of, to use his dichotomy, meat and mind? If anything, SL tries hard to be Gibsonian but fails: no one is as immersed as Case is in the novel, nor, for that matter, are Molly and Armitage in their own part of futurity.

Mostly, however, Gibson got things close enough to scare me. As Egypt's government shuts down Internet access and cell-phone services, I recognize an eerie parallel to Wu's idea of a Master Switch that did not exist in 1984, when Neuromancer appeared. Gibson envisioned a more open Net than we have or that Wu believes we might get, but Gibson conceded that the "spiral arms of military systems" (52) would be forever beyond the reach of loners like Case.  They still are.

Yet using free public software, a million disenfranchised loners have united to topple Egypt's government, and at the time of writing this they may get their wish.  The keepers of those spiral arms, the Egyptian government, like their counterparts in Iran a few years back, know that their foes need cell telephony and Facebook to organize.

The rest of the world can only sit back and watch the chaos and, in Saudi Arabia, the government may be patting itself on the back for shutting out Facebook not so long ago.

But if Gibson were actually prophetic, then other aspects of his future should be closer to our door than the streets of Cairo.

Zaibatsus and Arcologies

Put a Fuller Dome over many exurban gated communities in the US and Western Europe and you'd have the residential component of Neuromancer's way of life for the megacorporations: happy corporate employees going to their desks and conference rooms to plan and build a world to maximize a company's return on investment. Shopping and entertainment are only at a small remove, along the suburban sprawl, from the home cocoons. Only cheap oil has meant that workers drive to various destinations, instead of living and working and shopping together in the same place.


What would an arcology look like? There is one very Gibsonian building at the Web site of Yanko Design. Called NOAH (New Orleans Arcology Habitat), it evokes the end-times of a next Big Storm and the sense of heroic futurism that the megacorps might project in a post-governmental world. The comments on the proposal fascinate me: both techno-dread and triumphalism characterize the remarks.

Following the logic of governmental minimalists and libertarians in the United States, whose voices are loud these days, a future where self-defining communities evolve into self-governing ones may not be so far-fetched. At the consensual level, it may already be as close as one's neighborhood covenant.

But the coming of Corporate Man (and Woman) has concurrently meant an increased stratification of wealth in US communities. Bruce Sterling once quipped that the 1980s dawned not only as the first American decade that seemed like science fiction but also a return to a neo-Victorian world with social-Darwinist conditions for citizens. How did that vision, crystallized in Neuromancer, pan out?

What follows is less critical than it might seem. In fact, as I'll note, I'm not sure it's ever been otherwise, except for its sheer scale. In 2007, the top 1% of American citizens controlled 43% of financial wealth and 35% of net worth. The title of the report where I gathered this, "Who Rules America," suggests an ideological slant that made me a little doubtful of Professor Domhoff's objectivity, so I found another source.

The US Census Bureau's report, "A Brief Look at Postwar U.S. Income Inequality," paints a picture not unlike Domhoff's:
The long-run increase in income inequality is related to changes in the Nation’s labor market and its household composition. The wage distribution has become considerably more unequal with more highly skilled, trained, and educated workers at the top experiencing real wage gains and those at the bottom real wage losses.
There's more at the Census Web site. I become a little less convinced, then, that Domhoff is riding a hobby horse of leftist angst. One should pause before numbers like these from Domhoff: "Of all the new financial wealth created by the American economy in that 21-year-period, [1983-2004] fully 42% of it went to the top 1%."

Even so, the distribution of wealth before 1983 was not one of which I'm proud, given my belief that a large and politically active middle class of property owners, such as all those returning GIs in 1945, can be an excellent shield against the sort of governance (and lack of it) Gibson portrays. Citizens like the GIs who built the neighborhood where I now live, of modest ambitions and demeanor, have historically been eager to protect their families and thus have supported strong policing and orderly neighborhoods. Except for the turbulence of 1968, America has never had Egyptian-style chaos in the streets: Egypt has no middle class any longer, by all accounts.

Sadly, after 2008 and the financial meltdown, many more Americans have been thrown out of our middle class. Each foreclosure and failed company could take us a step closer to a neo-Victorian form of income inequality that characterizes Cyberpunk's view of the future.  And I doubt that all of the millions of dispossessed will leave their bank-seized homes peacefully. 2011 may be an interesting year.

Console Cowboys

But housing, like travel to Case, is a "meat thing." On the mind front, the lone cyber-cowboys of Gibson's world have not emerged. Wikileaks' revelations took a team of dedicated "hacktivists" to accomplish. At the same time, another aspect of current hacking comes straight from Neuromancer.

The cyber-attack on Iran's nuclear installations may have set back the nation's bomb-building program for some time. In a world like Gibson's, a zaibatsu might have stopped a rogue nation, given the failure of the US and Soviet governments after Operation Screaming Fist. In our actuality, Israel and the US and who knows who else might have launched the attack. Thus we've moved further along than Gibson's world, technologically.

The Soviet computer centers Armitage mentions had to be attacked with Special-Forces hacker-operatives sent in on microlight gliders. In our world, Iran's systems are tied to the global Internet. So are ours, and both China and the US are quietly jockeying, behind the scenes, to develop cyberwar teams. If our nations ever go to war, Internet services and all its dependent systems--power grids, air-traffic systems, banking--will become legitimate domestic targets.

Street Samurai

On the "meat" front, our Molly might be named "Mike."


Burly men doing legitimate (and, at times, dirty) deeds for whatever Blackwater is now called, and for firms like it, do not quite have the sex appeal of Gibson's razorgirl.

At the same time, for this reader it would have been impossible in the early 80s to imagine mercs (and that is what all "private security contractors" are: mercenaries) enjoying the cultural acceptance they now do. Many of my fellow citizens get a bit queasy on this subject, but for the past decade the US government has relied on mercs to do jobs we do not wish to task to our armed forces or that require manpower we simply do not have in uniform without drafting college-aged kids.

In the 80s, except for Soldier of Fortune readers, the use of mercenaries by our government would have stirred widespread outrage. Now, if I may risk a generalization, with a 24/7 news cycle the attention of the public appears to move faster than it did 26 years ago, to the latest tragedy, scandal, or political circus. Meanwhile, private security companies have limited accountability to US or international law. They are true Gibsonian figures of the "interzone where art wasn't quite crime, crime not quite art" (44). Put in "law" for "art" and you have our situation.

Meanwhile, poor Molly. I don't think Gibson meant her to become a meme, but she has.

I'm glad we don't have razorgirl mercs walking the night cities of our world, but Molly can be found elsewhere: she's the badass sex-object of a hundred video games and action / adventure films. Hollywood understands male lust for these tough babes with guns and blades. Molly scares and attracts the boys, and I'm sure that more than a few male readers empathize with Case when he gets to enjoy her pleasures.

Naturally, one recent attempt to cast a film based upon the book would have featured Milla Jovovich as Molly: Resident Evil's killer star would get to put on the mirror eye-inserts. It's that way in any number of films or games where the strong woman is just as much a sex object as any gatefold playmate. As my students will see when they get further into the book, there's that dark and disgusting desire out there to have a "meat puppet." That technology is perhaps the most frightening of the book and, like organ-markets and simstim sets, it remains possible but not part of our lives.

We have Microsoft as a multinational, not microsofts that one puts into a skull-plug to learn a new skill. And the best we can do for simstim is the current 3D craze. Merge it with a Wii or Kinect, and we may approach something like watching through Tally Isham's eyes in her latest episode.

Better Living Through Bioengineering?

When students on campus trade Ritalin to do better on exams, yes, we live in a cyberpunk world. My students are afraid of our "street" and, whipped up the ladder by well-meaning parents and educators like me, do their best to avoid it, whatever the emotional cost.

At least, if any of us do wind up on the streets with between two and three million American homeless, we do not have to cope with illegal dealers in pituitary glands or black clinics in the worst part of town, ready to roll us for our kidneys.


In scope if not depth, Gibson was spot on about the culture of body modification, when teens have elective cosmetic surgery and their parents get Botox regularly. A single earring was a novelty on a man in 1984, and potentially a dangerous one. Now tats and piercings are common.

A dermatologist, who helped my mother with a degenerative eye disorder that required a monthly Botox injection, told me he'd given up on really iffy surgery that could land him in court. Instead, his income came mostly from vanity, that ready fix of Botox to keep the bathroom mirror lying to his patients every morning.

Verdict: Close Enough to Scare Me

Image Source: Unknown

In the end, we don't, and probably never will, have orbiting colonies of Rastas or French spas run by a maniacal and inbred family. I doubt that our planet has enough fossil fuels that are cheaply available to sustain enough wealth to make dreams like Richard Branson's, of cheap space tourism, happen. If I'm wrong, I'll be too old to get the ride to orbit I've wanted since I saw a Gemini launch in the mid 1960s.

In other regards, however, I'm happy to say that Gibson got a great deal wrong. May it continue to be so, because what he predicted correctly has been frightening enough. Ours is a harder-edged world than it was in 1984, when the Cold War raged on. And the technologies of interconnection empower lots of individuals. Al Qaeda, not the Panther Moderns, uses the Internet, but the violence engendered is still the stuff of nightmare.

So, Bill, thanks for all of the bad dreams. You really invented a genre and, in some ways, a world.

Work Cited:

Gibson, William. Neuromancer. New York: Ace, 1984.

Monday, January 17, 2011

Apple Without Steve?

Location: Reading Tea Leaves

Apple's die-hard faithful probably let out a collective moan today. Steve Jobs' decision to take medical leave sent shivers through not only computer users but also the entire stock market.

I wish Mr. Jobs a speedy recovery and a long life. While the man has been accused of draconian management practices, he did help usher in the personal computing revolution. Even if Steve Wozniak was the engineering genius behind the early Apple computers, Jobs' marketing brilliance made graphical user interfaces and the Cult of Apple happen.

It was a different sort of company from the upstart PC makers of the early 80s or the stumbling and arrogant behemoth that was IBM in that decade; if the competition consisted of geeks, Apple seemed run by artists and madmen of the sort I enjoy meeting for drinks.  I could imagine Apple's people chatting up the fire-breather and magician in San Francisco's Vesuvio, one of planet Earth's best bars, after shopping for poetry books at City Lights. IBM's folks would be (massive yawn) golfing with senators. Microsoft's people would show up at the bar and try to look hip, all the while looking like khaki-and-polo-shirted  tourists.

But I digress: my preference for bohemians over boring business and I.T. types is not news, but perhaps this is exactly how Apple fooled chumps like me into thinking the firm hip and not merely a very clever marketing idea wrapped around some elegant and useful technology. It sure snared cultural creatives as fanboys, to market the company's products to friends. I'm sure I convinced a dozen folks to try the Mac OS over the years. Not a one screamed at me later, a testimony to how well our Macs did, and do, work.


More Than Macs: How Apple Did "Think Different"

The history of the firm relates quite well to ideas from my class on the history of Cyberspace. Jobs will be remembered, when he leaves this planet to compare notes with John D. Rockefeller and Howard Hughes, in one of two ways.

First, Apple's CEO might be what Tim Wu in The Master Switch calls a Defining Mogul. That will be the case if Apple's closed system for handheld computing comes to dominate the future of Internet use. You will get what Verizon, AT&T, and Apple decide you get to see or do. And I agree with Wu's well-supported "central contention...in the United States, it is industrial structure that determines the limits of free speech" (121). I suspect this revelation, that the First Amendment applies to Congress, as in "Congress shall make no law...," and not to employers or ISPs, surprised my students as much as it did me.

On the other hand, if an open system dominates, Jobs might be recalled as a would-be monopolist who failed, despite a miraculous comeback in the late 1990s. I'm not sure which future I want. I used to fear Microsoft. I was hoping Macs would remain boutique systems for picky people like me. I also prefer German cars or pre-1973 Detroit Muscle. They are fancy fun, but they both run on our roads with boring cars. If my Mac could work with the sea of boring Windows boxes, what harm that?

Now, however, I am beginning to fear Apple.

My ownership of a handful of Mac systems since the dark years of the mid 90s shows how well they hold up. Our household iMac G5, now six years young, will soon enjoy a long retirement as our DVD player / media box for downloading films. Some child in Alaska is probably still using the G3 iMac I shipped up there, years ago, after an eBay auction.

Inside the latest iMac, however, there's a story that goes to the heart of Jobs' vision for Apple. Unlike its predecessors, the new iMac has no user-serviceable parts aside from the RAM. This has long been Jobs' favored tactic. If the Apple II of Wozniak's day invited tinkering, the first Mac in 1984 became its antithesis: to open the case was to void the warranty!

Apple has played this push-pull game with its hardware since that time. As a tinkerer, I found this exasperating as my friends using Windows or Linux would build computers out of spare parts and make them do cool things. Nowadays, however, we old geezers with desktop boxes are giving way to the iEverything generation, who carry their devices around incessantly and don't tinker. It's about the app, not the device, to these kids. And Jobs--who not only could recognize but also create markets--knew this best of all. The Mac's established niche among school teachers and artists and graphic designers and video professionals and crazy professors is, after all, somewhat limited. Apple did not even try to top Dell's game in Henrico County, where the middle schools will give up their Macs soon for PCs, just as the high schools have done. Though the Macs last longer and are more robust when dropped by little kids, Dell offered a better price on service. Apple declined, I suppose, betting that kids with iPhones and iPads will eventually want Macs, anyhow. Or that iPads might replace laptops in schools, and with the right Verizon contract Apple would best anything Dell could offer.

The educational discount, I just noticed, for the iMac I am considering comes to all of 100 bucks. It was once twice that. Second Life folks...does this sound at all familiar?

This reality suits Steve Jobs fine: his portable devices, selling in volumes we Mac zealots could only dream of in the late 1990s, are closed in ways Jobs could never manage with the Macintosh in any form. We'd pry them open.

Try that with an iPhone. Maybe you have better eyesight than I do. Besides, even if you do manage it, Apple's revenue stream will switch from selling you a $2000 iMac (high end: I like my computers pimped) to selling you a $200 phone with a long-term contract that includes a cut for Apple with each app you buy and, I'm guessing, a bit of the monthly take that goes to the ISP carrying your data.

That's a sweet deal for all the corporations involved.

Consumers might not care, as long as the service is reliable, but as with AT&T before the 1984 breakup, with iOS Steve's firm can dictate which apps appear, or do not, on the system. The Mac OS lets you write software; its relative openness compared to Windows, based on its UNIX underpinnings, has been great for those who had the skill to develop applications.

But What if Steve Loses (Again)?

Jobs lost the desktop market to Microsoft because he could not see what Bill Gates saw: that software was the key to market saturation, even good-enough software. Most Windows users wanted the apps, and they cared little about the OS as long as it worked. I know that described me when I used Windows 3.1, 95, and 98 before making the Mac OS my primary system.

Could Jobs make a similar mistake today? As Google becomes more and more a competitor in the iPhone's and iPad's app market, Apple could repeat its 1980s blunders. At the same time, it has some advantages it never had then. Moreover, might Google be forced into a more closed system? As I explained to my class today, Google must use the same broadband "tubes" (thanks, Sen. Stevens) to move data to one's computer or phone.

I'd feel better if I could get an Android OS tablet (what I'm waiting for) with service from any of a dozen ISPs. Perhaps if some new and disruptive wireless service emerges, offering FIOS-level speeds without a cable, we could have 100 Verizons or Comcasts vying for our telecom dollar.

But in my region, it will be AT&T and Verizon carrying the data. As long as principles of Net Neutrality continue to hold for these big telecoms, in theory iPad / Phone competitors could run most anything and undermine Apple's closed-but-reliable model for access to content online.

If that happens, someone will have stolen Apple's message from the famous "1984" advertisement. I offer it here in case my students have never seen it:


I'm hoping for this outcome, even if it means that in a few years, I'll lose my Macs as Apple fades away (again). Maybe for my main computer I'll be running an admittedly sleek Sony Vaio with a Google OS on board and I'll have lots of options, including synching data with a tablet or smart phone. Maybe I'd even make peace with the old bogeyman, Microsoft. Windows 7 is clunky compared to the current version of the Mac OS, but it is the first version of Windows in a long time that I find compelling and rather intuitive. Perhaps on a really tricked-out machine...

Either way, if I can open the computer's case and change the hard drive out or upgrade the processor, better still.

Wednesday, January 12, 2011

Wu's "Cycle" and Founding Moguls

Location: Using Google's System to Talk About It

This post begins a series to coincide with my Spring 2011 course, Cyberspace: History, Culture, Future.

We begin the class reading Tim Wu's The Master Switch: The Rise & Fall of Information Empires. I've loved Wu's narratives about the "defining moguls" in various industries. In my title I used the term "founding," which is not exactly apt: such men may not found an industry, but they do bring it to its apotheosis. It's not accidental that Howard Hughes, whose company defined modern aerospace, continues to inspire Hollywood enough to get The Aviator made and released.

Such figures used to be called "Titans of Industry" in prewar America. Either way, the metaphor is apt: Mogul emperors and pre-Olympian gods had one thing in common with these modern men; mere money has never been the primary goal or motivation. They are, Wu insists, "a special breed" among the alpha-males and (today) alpha-females of our technocracy (29).

The mogul or titan of old desired power and domination, and their modern counterpart, "like a man who tastes combat for the first time, discovered his natural aptitude for industrial warfare" (Wu 29). No small wonder that they so interest Nietzschean me. I admire such fellows even when I dislike them: Steve Jobs made possible the amazing MacBook on which I compose this post. He also has a scorched-earth style of management that would make Kublai Khan nod in approval. Moguls are often brutes (and the titan Saturn ate his own children to avoid being deposed by one of them).

Adolph Zukor's and Mark Zuckerberg's names roll off the tongue, the similarities in vowel-sounds mimicking the same will to power that these men  brought to their innovations. Both Paramount and Facebook define a certain age of media. Competitors quake.

Then, of course, there is the fall of the Titans. In surveying the history of technology, it is common to find founders with Shakespearean flaws and, quite often, to trace what began as heroic competition to its end in complacency, decadence, even madness. Aviator and aerospace-industry legend Howard Hughes stands before me, an obsessive old man so terrified of germs, some urban legends say, that he put empty Kleenex boxes on his feet as he shuffled around his penthouses in various Las Vegas hotels. Such is the fate of many with imperial visions.

If any Second Life readers are still with me, I think you know who I'm talking about.  For my students, the question becomes this: what happens when a founder tries to build a system out of a disruptive technology?

Zukor managed to corral and cow the independent filmmakers and theater-owners who had successfully battled The Film Trust. Out of that victory, Zukor and other winners made his system: we still use the word "Hollywood" to mean not just a place but a content-creator that is monolithic and quite often stodgy. Consider how every formulaic superhero film resembles the previous formulaic superhero film. We can thank Zukor and his peers for this. But we can also thank them for some unforgettable moments in cinema: Ben-Hur's chariot race, Dorothy's trip to Oz, A Streetcar Named Desire's heart-rending portrayal of Blanche DuBois by Vivien Leigh.

As for Facebook's Zuckerberg, it remains to be seen if his social network can become a fully fledged operating system for users, in the way that Google seems to be evolving. If so, we could have a ringside seat for a battle of systems, one that will influence our lives as much as the ascendancy of the Hollywood Studios, Microsoft's Windows OS, or Detroit's Big Three automakers.

Update, Jan. 13: I've clarified Wu's term for these men and added a properly MLA-cited direct quotation from Wu to add emphasis to my point (and to demonstrate proper integration of sources to students). Note to students: the first citation of Wu does not need his name with the page number, since I had already made the source's name clear. The second citation does take the author's name with the page number because the source is not otherwise clear. The same rules apply for paraphrase.

Work Cited:

Wu, Tim. The Master Switch: The Rise and Fall of Information Empires. New York: Knopf, 2010.

Saturday, December 11, 2010

Avoiding Boredom and Needing Solitude: Why We Build?

Mr. Roman Nose Does the Roof
Location: Fixing a Virtual Roof

When I am social, in either my physical or virtual lives, it tends to be with folks as smart and, believe it or not, broadly learned as my colleagues in the sciences. I'll party with those wild-men from our Physics Department this weekend. Woo hoo! Pass the tortilla chips and don't ask me to do any integrals!

I eat lunch with them, and with a sampling of mathematicians and chemists, every day. Otherwise, I don't "hang out" too much, go to virtual or brick-and-mortar clubs, or get too involved in in-person or virtual forums beyond the handful of blogs I follow and the events at my UU Church.

So Robert Hooker's post about identity in Second Life, and his feeling that it does not capture one tenet of Postmodernism, Donna Haraway's "Cyborg" from her famous manifesto, got me thinking about what virtual worlds do mean for me. Sometimes I'm happiest when there's not another avatar in sight, as has been the case for my SL road-trips (I'm considering continuing that little series, since I rather miss it).

I discovered a connection that I've been trying to articulate for a while in two articles, one in press and one about to go to the editors, about collaboration. Here's what I said in reply to Robert at his blog, as Hiro Pendragon and I both sought clarity about some of Robert's claims:
I'm not in Hiro's league as a builder, and I cannot script, but whenever I get bored with virtual worlds, I build. If you want social-constructivist epistemology that's at the core of Postmodern pedagogy in my field, writing, do a collaborative build with others and then make an immersive simulation.

That's a pretty Postmodern move, appropriating the tools a corporation provides to make something new, even subversive, and ephemeral, one of Hakim Bey's TAZs.

So I've begun to focus less on the avatar and more on what avatars (and the users driving them) create.
I had been bored in SL for a long time before I moved my educational activities to Jokaydia Grid, where I'm building everything almost from scratch.

At the Virtual House of Usher, I've done little with the avatar of Roderick Usher, other than making his hair messy and giving him a nicely "Roman" nose. He is bone-stock. Instead, the build has been the identity I'm crafting. Poe's House appeared to be sentient, and its "leaden" presence influences the fate of all three characters in the tale.

I've never been bored, even once, while making things. Next week I'll be helping to build a cedar closet in the physical world and sticking the final bits on the exterior of Usher. Both will be nearly solitary pursuits and, like writing itself, rewarding only insofar as they lead to more creations.

Friday, December 3, 2010

Copyright Sniffing at Second Life Marketplace?

Fallingwater in SL
Location: Frank Lloyd Wright Virtual Museum in Second Life

Cry, "Infringement!" and let slip the dogs of copyright!

This is quite the black Friday for those fearing over-zealous IP enforcement.  Tateru Nino and Hamlet Au have covered the snafu of the Frank Lloyd Wright Foundation's decision to not renew permission for The Frank Lloyd Wright Virtual Museum, an homage to the architect's work built in Second Life. I won't cover that, since both bloggers have done such a good job already.

Neither of them notes, however, the red meat Linden Lab threw to prowling law-dogs when it pushed commerce to its online marketplace.

Hamlet notes that one reason for the reversal of an earlier decision to endorse the SL build was the Foundation's outrage over Wright-themed items on Second Life Marketplace. Never mind that the items were not made, sold by, or endorsed by the group in Second Life.

A concurrent event could make this situation far worse for, say, anyone in SL who makes a sneaker that resembles, not that I've ever seen such, a Converse All-Star. Tateru also reports, in another post, on a bill in the US Senate, "The Innovative Design Protection and Piracy Prevention Act," that would permit "prosecution of similar designs for clothing, which need not be limited to physical clothing, but also of virtual items. Formerly, only the trademark text, logos or other iconography on clothing was protected – but now the whole design" would be.

By making changes to search in SL that led many merchants to close their in-world stores and migrate to the online marketplace, Linden Lab began to reap revenue for each transaction in a way it could not from in-world locations. Of course, it lost tier payments from those merchants who closed their shops. What the Lab probably did not anticipate is how easy it can be to comb through a Marketplace search for "sneaker," even without an SL account, then flag copyright violations. Many interns and junior partners at law firms will be busy for years on this.

I suspect that the quarry will simply find a new home. Such a waste of legal talent, when copyright holders, such as Wright's Foundation, might instead encourage homage in fan-created items. After all, no one is making a replica of Fallingwater in the world outside my window, then putting it up for sale as "just like Wright's masterpiece!"

It's ironic that Wright chose the term "Taliesin" for his studio.  In Welsh myth, the trickster/demigod/bard stole wisdom from the Goddess of the Underworld, Ceridwen. The goddess pursued Taliesin, who kept changing forms to evade her and her wrath. She finally got him, changing herself into a hen who ate the trickster, who had become a grain of wheat.  Ceridwen became pregnant and birthed a beautiful child, and she was unable to slay the reborn Taliesin.

Ultimately, Taliesin got away with his affront to a deity. I suspect that the sleek and greedy hounds of Copyright Law will make a lot of money chasing the protean figures who follow Taliesin's example. And yet, after a long chase, the hounds will lose their quarry in the wilds of the Internet.

Will Linden Lab lose business? No doubt. More commerce may come back in-world, but some will just shimmer and vanish, like Taliesin's becoming a salmon and swimming away.

Monday, November 22, 2010

Texter or Gamer: Which Are You?

Outside Plato's Cave
Location: Solitary Pursuit Called Writing

I'm a hermit by inclination. Whether it's real life or the shimmering and consensual hallucination called a virtual world, I like my quiet. I don't appreciate the random IM, the unsolicited chat-request when I'm replying to my electronic mail. Lots of folks who fancy themselves writers seem to be that way.

This reaction is a long-term one that built over many years, but until I read "Growing Up Digital, Wired for Distraction" in the New York Times, I did not fully understand why. Writer Matt Richtel notes that what Sherry Turkle has called "always on, always on you" technology has created new social types on campus, "not the thespian and the jock but the texter and gamer, Facebook addict and YouTube potato."

My colleagues who gather to discuss education in Second Life, and who post to blogs about it, often confuse these types. There's a conflation of ideas that runs like this: our students live virtual lives already and virtual worlds are an inevitability for them.

This is a mistake. Mediation and virtuality are separable to them, if not to us. There's a huge difference between the augmentationist who is always on Facebook or texting people known already and the loner who chooses to get immersed as an alter-ego, then connect with a guild or a few distant gamer-pals through an MMORPG. Most Richmond students, who tend to be socially adept and careerist, are not loners by inclination, and they've been scared by stories of gamers who end up where they began: mom and dad's basement.

Ironically, they hurt their grades either way, as the NYT story shows.

In my case, I escaped the basement, though I never had my bedroom down there. We had a nice dry cellar with 1970s wood paneling, and, yes, the D&D group met there in the 70s and 80s. I even escaped the lure of online gaming because of the massive amounts of time needed to be good at a game and the inability to make one's own game; I've long been the game-master type rather than the player. Academics and folks who write a lot tend to tilt that way, too.

But I'm neither traditional texter nor gamer. Neither are many of those who made SL what it is today.

It's possible that Linden Lab's recent move to stress the social aspects of Second Life over its creative ones is wise: there are more texters than gamers out there. This leads me to wonder which social networkers the Lab wants. If my students are any indication, they already have all of the social network they need; it consumes enough of their time to hurt their intellectual work. They have no patience for a non-intuitive interface such as SL's.

Will my colleagues who buy into triumphalist narratives about the course of networked technology "get this"? Not until they come to understand the shaping power of various technologies and the habits of use of various generations. There will be exceptions; one colleague is just as wrong in claiming that the Linden-Lab ideal customer is a bored housewife.

Finding a sweet-spot demographic is Linden Lab's problem to solve. Since I'll likely not be teaching in SL again, but only in focused-and-directed simulations in OpenSim, I have time for other worries. I ask a neo-luddite's questions about our networked lives and worry more about the next generation of young people. They are even more addicted to portable devices and easy connectivity than the ones I now teach.

How on earth can they be taught to listen to what silence can say?