Tuesday, November 11, 2025

Ten Arguments Against Hyper Data Centers

 

Data Center in Ruins, interior view

We lost the vote in Goochland County; 4 of 5 Supervisors voted to approve a Technology Overlay District (TOD) and Technology Zone that could include data centers. We learned that a likely center would be two million square feet, or 3000' x 200' in size. That's three times the length of the supercarrier USS Nimitz, or twice the footprint of the Short Pump Town Center, a huge outdoor mall about 20 minutes from where I live. I've always hated Short Pump and that mall in particular; it was once prime farmland where a friend stabled her horse. It's now a suburban asteroid belt.

The proposed data center is worse. And soon it may be utterly obsolete. I commend to you articles by Bryan Alexander and Noah Smith about why this entire industry could soon find itself in deep trouble and why that matters to the larger economy. 

Other localities will be fighting data-center behemoths that use enormous amounts of energy and water. Their parking lots create heat islands, and they lower property values. It's likely we citizens will be taking the county to court, as plans were modified at the final moment to include 900 additional acres without public discussion. The entire process seemed rushed, perhaps spurred on by industry money, to finish before a new Governor takes office in January. She has promised to develop a statewide strategy; I hope she does, and perhaps it can preempt the sort of hasty decisions made here. My state has more data centers than any other.

So what are arguments that money-hungry county officials will heed? Quality-of-life issues may help, but I fear environmental concerns will, as usual, fall on deaf ears. Our nation's brand of capitalism, one I despise, values short-term thinking and profiteering. I'm a Distributist, not a Socialist; I want capital held and decisions made by the largest number of citizens possible. I want data centers small and as rare as possible.

In any case, here are 10 technical and economic arguments to make with your local officials as you organize against this bonanza. They are based on my remarks at our recent county meeting. You can share these in 3 minutes at a meeting, if you practice! 

  1. AI drives this rapid growth in data centers.
  2. The AI industry is not turning a profit and has shown only limited ROI for hundreds of billions in venture capital and, now, risky loans. Subscriptions make up a tiny portion of its revenue.
  3. The current model of AI relies on “brute force” computing to simulate human reasoning. This method requires huge data centers that drive up electric bills for ratepayers, and it uses large amounts of groundwater. The microprocessors in data centers need to be replaced every few years, meaning short life spans for the hardware.
  4. Real “Artificial General Intelligence” (aka AGI or “Superintelligence”) is unlikely with silicon-based semiconductor technology, yet AGI is the stated goal of OpenAI, Anthropic, Meta, and Google.
  5. Without more powerful new models, investment may well leave this sector, as 80% of firms that have already deployed current AI models have shown no gains in productivity. 
  6. While Meta and Google have varied sources of income, other big AI firms are one-trick ponies. If the gold rush turns into a bubble that bursts, it will harm the entire economy. What would happen in your locality if a data center goes dark, its owner bankrupt?
  7. NVIDIA just announced its new Spark workstation, which puts AI hardware on developers’ desktops. Coders no longer need to send every job to a data center for processing.
  8. Soon consumer devices will have this capacity. When we have AI on a laptop or phone, which is Apple’s goal for Apple Intelligence, the data-center boom will likely go bust.
  9. Does your locality want to build power-hungry, thirsty data centers that may be obsolete in a few years? 
  10. Quality-of-life issues are a form of return on investment. Start with that, not data centers.

Image source: first try with ChatGPT 5 for prompt "Generate an image of a two million square foot hyper data center in ruins." 


Tuesday, October 14, 2025

AI Bubbles and My County: Saying "Not Yet" to Big Data Centers

Data Center in Rural Area


Note to readers: A version of this post will be submitted to our Board of Supervisors as a formal letter, as well as to a local news outlet. We'll see if it even makes a ripple. 

Even those wary of AI agree that it can do clever work. I use it regularly with students to shape their writing ethically and to design assignments. But can the industry make enough money to continue its progress? A great deal hinges on that answer, including development on the western border of the Richmond Metro Area.

A proposal before Goochland’s Board of Supervisors would allow data centers, perhaps powered by small nuclear reactors, to be built along the Route 288 corridor. Even setting aside the environmental hazards, as someone who researches the industry, I have found economic risks in rushing to join this AI gold rush.

CEOs constantly make utopian claims about AI's future, but the numbers cloud those predictions. Subscriptions such as mine brought ChatGPT maker OpenAI under $2.5 billion in revenue in the second half of 2024, according to sources that include The Wall Street Journal and The New York Times. The rest of its funding came from venture capital or circular investing, such as Nvidia's and Microsoft's massive stakes in OpenAI.

The firm estimates that building out its data centers would cost $400 billion. For such ambitious plans, companies need a lot more revenue to balance their books. An "AI Bubble" may be inflating, as several economic journalists have warned. Writing for The Atlantic, Rogé Karma notes that “In the first half of this year, business spending on AI added more to GDP growth than all consumer spending combined.” A crash in the AI sector would deal devastating blows to the economy, with local effects when superfluous data centers go dark.
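To make that revenue gap concrete, here is a rough, illustrative calculation using only the figures above; it assumes the half-year subscription revenue simply doubles for a full year and ignores growth, other revenue lines, and financing terms:

\[
\frac{\$400\ \text{billion estimated buildout}}{\$2.5\ \text{billion} \times 2\ \text{per year in subscriptions}} \approx 80\ \text{years of subscription revenue}
\]

Even if that annual figure grew several times over, the buildout estimate would still dwarf it, which is why the bubble warnings matter.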

For his part, OpenAI’s CEO Sam Altman focuses on big ideas that seem straight from science fiction. He writes that humans and computers will soon become one form of life, which he calls “The Merge.” I'll quote from his blog: "unless we destroy ourselves first, superhuman AI is going to happen, genetic enhancement is going to happen, and brain-machine interfaces are going to happen." Not all of us would call such things progress, but history shows us that technological breakthroughs take time and a lot of money. Quiet supersonic airliners and nuclear fusion plants have inched closer to reality over decades.

It may well take that long before AI data centers stop driving up ratepayers’ electric bills; we have seen those increases already from existing data centers in the Commonwealth. Before we consign so much open land to data centers locally, let’s consider what experts in the field who are not CEOs or marketers have to say.

A robotics researcher, hired away from my university by Google, insisted to me that human-level thinking cannot be engineered; at best we might imperfectly simulate it, at great expense, via a "brute force" computing method run in huge data centers. I recently noted in class how we might improve on that, given faster computing using fewer chips, following what’s called Moore's Law. A student challenged me, claiming that silicon-based chips have not enjoyed the annual leaps in power and speed we came to expect up through the early 2000s.
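For readers who want that chip claim spelled out, here is the textbook formulation of Moore's Law, my own gloss rather than anything the researcher or the student said: transistor counts, and loosely performance, were expected to double roughly every two years, so

\[
N(t) \approx N_0 \cdot 2^{(t - t_0)/2},
\]

where N_0 is the transistor count in a baseline year t_0 and t is measured in years. The student's point was that real silicon chips fell off that curve after the early 2000s.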

Several academic papers I looked up support that idea. Without new types of microprocessors or exotic quantum computing, we are stuck with today’s data centers. Meanwhile, our AI models remain far from perfect. Alex Reisner of The Atlantic, as well as academic researchers, found developers fudging benchmark tests that measure AI's smarts: they give the software test questions in advance, a tactic beloved by college students for many years. When AI faces novel tasks, however, it scores far lower. Rogé Karma’s investigative work bears this out; the productivity of coders dropped by 20% when working with AI, as they spent more time correcting its mistakes.

Other bubbles have popped with local consequences. A few years ago, I addressed the Board of Supervisors, whose agenda that night included motions to increase suburban-style growth in the east end of the county. I opposed this for many reasons, but one stood out: we had made zoning changes before based on empty promises, when we developed an office park called West Creek for an anchor client, Motorola. Their semiconductor plant would, in theory, have provided many hundreds of jobs.

My late father-in-law, Edward Nuckols, had addressed the Supervisors in the 90s. He was a respected master mechanic and businessman with deep ties to the community; his auto shop, unlike Motorola, had been in operation here for decades. Ed warned that big companies change their minds.

So it came to pass: Motorola backed out of their promises and Goochland was left with over $100 million in debt. One Supervisor reminded me that the county had little choice but to search for other revenue streams to repay that sum. While a fair counterargument, it does not seem we have learned much. Even if Altman and other CEOs are right, such men have no ties to any locality save the posthuman utopias they dream of inhabiting. $100 million of debt means nothing to them.

I’d love to see cheaper, more sustainable AI as a partner in our work. Today's models such as ChatGPT 5 and Anthropic's Claude, as well as lesser-known but powerful tools such as NotebookLM and Research Rabbit, greatly help my students. We engineer prompts to find credible sources that are not hallucinated, check human-created work for accuracy and voice, and write scripts for podcasts that AI converts to broadcast-quality audio. Plagiarism, the bugaboo of academic AI, concerns me less than does helping my students gain skills for workplaces slowly adopting AI.

My classes emphasize critical thinking when working with AI. So I'll ask Goochland's Board of Supervisors to do some critical thinking and research not influenced by the hype of Silicon Valley's billionaires. Altman claims intelligence will soon be "too cheap to meter." Older folks will recall that same promise, made about electricity from nuclear power in the 1950s and '60s.

Otherwise, we could be left with more debt and large, empty buildings crumbling in the rain along 288. 

Creative Commons Image: Bioethics.com 

Monday, September 15, 2025

Are Humanists All Like This? Tyler Austin Harper's Strange Ideas About AI on Campus

Cambridge University in Ruins

I'm fond of The Atlantic; I've subscribed for more than four decades. I also enjoy Tyler Austin Harper's articles; his 2024 piece on extreme fishing really moved me with its stories of combat veterans risking their lives at night, swimming out in deep water to find offshore rocks, all to catch striped bass. Harper tried that sport himself; his participation added vividness to his journalism.

As a professor, he is likewise a participant when writing about education. Thus he should know enough about students to avoid the impracticable, even ludicrous, ideas in his newest article, "The Question All Colleges Should Ask Themselves About AI." The title drew me right in, as I too am concerned that institutions of higher education do not comprehend the scope of AI usage by students or what that may portend for their learning. 

So much damage has already been done, before LLMs arrived to thwart the critical-thinking skills we faculty hold so dear. We live in a time when phones and social media have sapped the attention spans and reading comprehension of all ages, and when students emerging from taught-to-the-test K-12 US education lack the ability to construct narratives around the content they do encounter in class. Instead, they often ask me for a rubric or a checklist. When I ask them to tell me what most interests them about our topic, I usually get blank or anxious looks.

Harper's piece begins in this territory, with worries we share. Soon, however, he turns to a "solution" proposed by Megan Fritts of the University of Arkansas at Little Rock: a ban on AI use in the Humanities classroom.

That stance seems not too different from the CCCC Chair's address I rebutted here some time back. I canceled my NCTE membership this year in response to that extremist stance by the organization's leadership.

Yet even that philosophy, which Fritts embraces, proves not radical enough for Harper:

Shunning AI use in classrooms is a good start, but schools need to think bigger than that. All institutions of higher education in the United States should be animated by the same basic question: What are the most effective things—even if they sound extreme—that we can do to limit, and ideally abolish, the unauthorized use of AI on campus?

Harper's drive to "abolish" includes: 

  • Getting rid of WiFi, laptops, and phones on campus
  • Requiring all writing assignments to be completed in labs without Internet access
  • Requiring all students to live on campus
  • Ending accommodations for neurodiverse students if they include AI (human helpers would fill the gap, as they did before)
  • Penalizing heavily any students caught using AI for academic work, up to and including expulsion.

I began to check whether he was writing for The Onion and would end with a call for wax tablets for all written work. I can imagine such a policy at my employer leading to a 95% drop in applications. I can also imagine new AI-friendly schools, many for-profit, rushing to fill the gap. I can envision ADA-based lawsuits by neurodiverse learners and their parents.

I won't go on more except to say that Harper's draconian and neo-Luddite solution is not a realistic answer.

Instead I'll end with a question that Fritts asks near the start of her article, as she and others on an AI Committee wrestle with the signature pedagogical challenge of our time, "If this robot can write these books better than we can, then what are we doing here?" Well asked. 

Unless we faculty can add value to AI-written content, maybe we don't deserve jobs anymore. I've wondered that since the "Theory Wars" in English and related fields of the late 1980s and early 1990s. Poorly aping French Post-Structuralist Theory, faculty in PhD programs ruined the lives of many a graduate student and likely turned off legions of potential undergrads who might have studied literature. I saw this in dozens of cases at Indiana University's PhD program.

At that time, we lost the thread of meaning, and damn the Deconstructionists who managed to deconstruct meaning out of our field. But the meaning destroyed was not only in texts that lacked narrative cohesion under the scrutiny of theory; the meaning lost included the joy (or jouissance, if you read Barthes) of studying literary work and finding connections to improve one's own life.

Fritts takes us back to before my field's epistemological disaster when she notes that "the aim of our disciplines is the formation of human persons." If we start there, enshrining that principle as the most important one a college or university (in any discipline) can uphold, we have made a good start.

Image: OpenArt, Cambridge University in Ruins 

Wednesday, September 3, 2025

ChatGPT 5 Passes "The Big Log Test"

Image of Robert Plant

You may recall that some time back, I had a hilarious exchange with an earlier version of ChatGPT about Robert Plant's song "Big Log." Now that version 5 has appeared, I wanted to see if the AI would still insist that the song had anything to do with logs made of wood.

No rolling logs this time. The AI did some speculating, but not hallucinating. 

I find the exercise playful, something lacking in anti-AI resistance from one important organization in my field. At the same time, The Big Log Test shows how quickly the AI has reduced its hallucinations. I'd invite readers to try something similar with a favorite song/book/film/TV series that may have an aspect that could confuse an AI. Here we go: 

Prompt: This is a routine check I do as your model improves. I'm curious about any meanings in Robert Plant's song "Big Log." Why do you think he chose that title for his song? Do you see any clues in the lyrics? 

Reply:

Good check-in question — "Big Log" is one of Robert Plant’s more mysterious titles. The lyrics themselves are impressionistic, dreamlike, and full of wandering imagery: travel, desire, distance, loneliness, fate. The phrase “Big Log” never appears in the song, so the title works more like an oblique symbol or signpost. A few possibilities that critics and listeners have suggested:

1. Truckers and Highways – In trucking slang, a “log” is a driver’s logbook, and a “big log” could mean a long haul or a long road. Since the song is steeped in imagery of roads, headlights, and endless travel, the title could gesture toward the journey itself.

2. Nature and Obstacle – A literal “big log” might be something blocking a path, a metaphor for distance, fate, or emotional barriers. The lyrics’ recurring sense of inevitability (“My love is in league with the freeway…”) could fit this interpretation.

3. Wordplay and Surrealism – Plant often uses surreal or playful titles (e.g., “In the Mood,” “Heaven Knows”). “Big Log” may simply be an evocative, earthy phrase chosen to contrast with the ethereal quality of the lyrics and music.

4. Personal/Private Reference – Sometimes Plant titles songs with private jokes, fragments of conversation, or studio nicknames. It’s possible “Big Log” had an offhand origin unrelated to the lyrics, and he leaned into its strangeness.

Clues in the lyrics point strongest to the travel/journey theme: headlights, the freeway, endless searching, being pulled by forces outside oneself. “Big Log” might stand in as the symbolic “record” of that journey — the unending trace of desire and distance.

Follow-Up Prompt: So there are no references to logs in the song?

Reply:

That’s right — there are no actual references to logs (trees, wood, or otherwise) in the lyrics of “Big Log.”

The imagery in the song is all about roads, headlights, distance, fate, longing — not forests or lumber. That’s part of what makes the title puzzling: it seems deliberately disconnected from the lyric surface.

If anything, the best textual “bridge” is the idea of a logbook (the record of a journey), which fits the recurring highway and travel motifs. But Plant never makes that link explicit in the words of the song.

Would you like me to give you a quick “lyrics-only” close reading to show how the imagery develops (and where a title like Big Log might feel thematically tethered, even if indirectly)?

Follow-Up Prompt:

You have clearly evolved. Earlier chats referenced rolling logs, fallen logs, and so forth. Good work.

Reply:

I’m glad you noticed that difference! Earlier, I sometimes filled the “gap” in the title by speculating more literally — things like fallen trees or rolling logs. With another pass at the song, though, it’s clear Plant never brings that imagery in, so it’s more faithful to stick with the journey / logbook / metaphorical record angle.

Plant’s titles often have that teasing quality — they look literal but carry some private or surreal weight (a kind of wink at the listener). “Big Log” works almost like a signpost planted alongside the road of the song, rather than a description of what’s in it.