(Not) The Same As It Ever Was
Many years (!) ago, I wrote here about why the virtual world Second Life had followed, fairly well, the Gartner Group's well-known "hype cycle" for new technology. Follow the link here to read about the history of the concept.
I'm reminded by Wagner James Au of New World Notes that other virtual worlds such as Fortnite and Minecraft have thrived and, indeed, have seen some good use cases in education.
At the same time, I don't hear of anyone at Capital One's hive using virtual worlds. We don't see links from Office 365 or Google Workspace to a virtual world. I have a sense that AI is a different animal for these worker bees and their daily applications, as well as for my own students.
I do know a freelance graphic designer who lost her primary client to AI; the client now outsources graphics work to AI run by semi-skilled staff in-house. My friend has moved on to ramp up her greeting-card business. That experience may color my skepticism about claims that AI is a "fad," as Second Life clearly was.
SL and Gartner
I joined the virtual world in 2007, and yes, I was foolishly an enthusiast. Since I believe in being forthcoming about personal mistakes, I recall telling my students investigating this Strange Land that "you will all have avatars in a few years, for work!" I let slip the same to some colleagues, one of whom said, "you never met a technology you didn't love!"
Cringe.
I'll own being horribly wrong. Prognostication is a dicey thing! A silly image I cobbled together years ago shows the evolution of Second Life avatars (namely, my own, Iggy). At the same time, my update to the old image points toward a new direction in desktop and mobile computing that we all felt, a decade ago, to be far over the horizon.
For many reasons, including the difficulty of building simulations, a clunky interface, heavy system requirements, and even poor management by Linden Lab, Second Life began a descent from what Gartner calls a "Peak of Inflated Expectations" in 2007.
By 2012 Second Life was well into a "Trough of Disillusionment." As early as 2008 I sensed that the media party was over. A year later, I cited four factors working against academic use in a post here: a lack of rewards and incentives for faculty, the time needed to master the interface and content, Linden Lab's vacillating leadership and policies, and the presence of what I and my students called "the creepy treehouse" of adult users doing sometimes naughty things.
In the for-profit sector, businesses that set up virtual storefronts in 2007 soon fled; a professor of Marketing I helped get started found SL fascinating, but after interviewing in-world merchants, she put it plainly: for brick-and-mortar firms, there's little return on investment in Second Life.
SL continues to be a large virtual world, with better graphics, promises of a new physics engine, and a dedicated base of users, but it did not become a daily experience on the job or at school. The Hype Cycle proved accurate. This has happened before: stand-alone creative hypertexts such as Michael Joyce's afternoon also map onto the cycle, though they never regained the popularity they enjoyed in the 1990s, or even a steady base of users. Creative hypertexts appear to have faded utterly.
When a colleague in our campus digital pedagogy cohort claimed that generative AI would follow the same path traced by Gartner, I bristled because this time I feel that Gartner's model does not fit what I am seeing, daily, among students.
I could be wrong, of course, if a series of issues comes to afflict AI. So let's compare the Second Life experience to that of generative AI:
The Browser
SL was never browser-based, even as a plugin, unlike the easy-to-master Unity 3D interface; AIs are already built into tools we use daily or can be queried from any Web browser.
Client v. Server Side
SL has high client-side requirements, whereas AI puts the burden on server farms; these consume enormous amounts of energy, something my students and colleagues have only recently started to notice, with all the attendant dangers for sustainability.
The Interface
SL has a difficult interface, employing menus worthy of an entire operating system and arcane coding options, whereas Gen AI's query features resemble (for now) those of a familiar search engine but with natural-language input.
Platform Independence
SL and AI both work on multiple operating systems, but SL has never worked well with mobile devices; AI is baked into many apps now, with some Google searches automatically invoking Gemini results and Apple promising Apple Intelligence in its next phones.
Creepy Treehouse
SL has adult content that troubled my students; the guardrails we increasingly see with AI tend to keep users (sometimes clumsily or incompletely) from making napalm or pornography.
Investment Sources
Investment in SL comes from individual users at this point; AI attracts venture capital, user subscriptions, and big-tech investment (Apple, Microsoft, Meta, Google).
Source of Growth in Academia
Faculty and campus-technologist curiosity spurred interest in SL; student use of AI erupted spontaneously in the fall of 2022 and remains very strong.
A Shaky Prognosis
To be fair to Second Life, system load has largely remained steady in the last decade. In my recent experience, I can run SL and a third-party client called Firestorm on my MacBook using no more than my phone's hotspot to connect to the world. That's astounding progress.
Save for energy use, I don't foresee any of these differences posing a stumbling block for AI or leading it into a Trough of Disillusionment. We have a climate crisis; the last thing we can afford as a civilization is to ramp up emissions of carbon dioxide.
Yet if Moore's Law continues apace, we'll see AI reap the rewards of lower power use as the processors in the servers that run it grow more powerful and efficient. Any disillusionment is likely to be temporary, and its "trough" quite shallow.
For these reasons, I don't see the Hype Cycle applying to generative AI in the same way it has to several earlier technologies. I may be wrong, but for now Gartner's model does not fit it well.
Image Source: Wikimedia Commons for the Hype Cycle.