Funny thing about academic conferences: they always go smoothly, but when I'm not at them, I frequently have bad dreams about attending these meetings. These involve getting lost in long, dark hallways, finding that my room in the conference hotel no longer exists or is flooded, or taking the wrong Metro train and ending up in another town at the time I'm supposed to be speaking. In one nightmare, I ended up driving my rental car on an ever-worsening road. Soon the car and I were pinned in on all sides by tall trees as the sun set.
After CCCC 2025 this month, however, I am having a bad dream of the waking sort.
I was gravely disappointed by an address given by the current chair at the Conference on College Composition and Communication. The chair's remarks at the conference echoed her and her co-authors' sentiments in "Refusing GenAI in Writing Studies: A Quickstart Guide," a thoughtful document but one that, in its own way, could prove as dangerous to our work as educators as could Ethan Mollick's overly enthusiastic book Co-Intelligence.
17 April Update: A transcript of the chair's talk at the conference has been published. It's worth a read but remains, on my first reading, aligned with the ideas in the document linked above.
Of many disagreements I have with the authors' stance, this section struck me as one of its most misinformed moments:
we must be careful of uncritically accepting the notion that GenAI in its current form will inevitably be widely taken up in the corporate sector and we must therefore prepare students for that time now.
That's pretty much an insult to those of us in the field who are trying to grapple with what AI may mean for our schools and students. We have been anything but uncritical, yet as the adult professionals taking my current class tell me, the inevitable has already occurred. Companies are racing to implement AI at many levels; the authors' statement smacks of Ivory-Tower isolationism.
In future posts, I hope to critique other aspects of the refusal guide. The statements about marginalized students, for instance, ignore how powerfully AI can assist neurodivergent writers as well as those from disadvantaged backgrounds. For now, however, I want to focus on why we writing professionals must be at the table as AI becomes an inevitable part of our curricula.
We don't know the pace of that change; AI may hit a reverse salient in its development. One potential setback appears in Matteo Wong's article in The Atlantic: an industry expert claims that AI firms are misleading the press and public about how rapidly their models are improving. Essentially, companies may be fudging data on the benchmark tests used to assess AI performance against human test-takers.
If true, we could have Ethan Mollick's first model of an AI future: today it is "as good as it gets."
I'd welcome that pause, so we in education could catch up.
Lead or Be Led?
Whatever the trajectory of AI's evolution, we writing professionals have always engaged in a service enterprise. Or is that simply the voice of a (semi-)retired and non-tenured writing-center director? All three of the authors are tenure-stream faculty. Yet as I'll explain, that privilege does not protect them or their programs from institutional changes.
Writing programs are not owned by writing directors or even faculty; they belong to an institution and can be shifted around or simply cut much more easily than can an academic department. I saw this happen at a state university nearby: first-year writing was taken from English and placed in a new unit that answered to the Provost. The two tenured writing faculty stayed in English, teaching other things, until they retired. Not long after, the school's writing center moved out of English as well.
How we hire and promote administrators in higher education varies by institution, but in my experience, many newcomers have advanced degrees in fields such as Higher-Education Management and lack the classroom experience of my colleagues. We cannot expect them to appreciate the rarefied culture of the professional scholar, the culture that informs the CCCC's nay-saying. That said, these same administrators are not necessarily flinty-hearted villains.
Many I meet are very concerned, and rightly so, about how AI changes our work as institutions or may threaten higher education as we understand it. At the same time, students and their future employers expect us to provide training in effective communication, and today that includes using AI. Of course, to me, best use means employing AI wisely, reflectively, and ethically.
In consequence, I fear that if we in writing do not lead on AI, we will be led. It is therefore imperative to get ahead of institutional or governmental fiat and be leaders on our campuses as we shape policy about AI usage. I also fear more than ever, seeing the Quickstart Guide, that senior scholars might, from good intentions, usher in what I have called "The Dark Warehouse University," a dystopian, outsourced future for all but the most elite institutions of higher ed.
Toward a Rebuttal: A Quickstart Guide For The Wary Adopter
- Faculty must experiment with AI to learn its affordances and pitfalls. They must test new models and advise administration about their potential for good or ill, regarding how students learn and acquire critical-thinking, research, and writing skills.
- Students must learn to use AI in reflective, ethical, and critical ways, if they wish to add value to its output. If they cannot add value, many of them will not have jobs when they graduate into AI-centric workplaces.
- Resistance by the tenure-stream faculty may be principled but it also further erodes the position of general education, particularly the Humanities, at a time of rising autocracy and attacks on higher education in the US.
- Corporate capitalism drives the US economy and though this author does not like that fact, most of our institutions of higher education could not exist without that economy. We need to understand what drives it, reveal and resist its excesses where feasible, yet acknowledge that our students need to actually find work, a good deal of it in corporate settings.
- We in Writing Studies should lead as champions of ethical, pedagogically effective AI usage. Such an approach would include making cases for when we do not wish to employ the technology, as well as learning how and when it hampers learning, such as the development of critical-thinking skills or the ability to detect misinformation, whether produced by humans or AI.
- We must, as the Quickstart Guide states, teach students the environmental costs, labor practices, and biases involved in building, training, and using AI. At the same time, with colleagues from Computer Science, we should partner to build better AI, just as our campuses pioneered Web applications and technologies such as synchronous conferencing in writing classes.
A former CCCC chair, Cindy Selfe, prudently called upon us to study the technologies that others want us to use in the classroom (2008). Cindy can get after me if I have this wrong, but the current CCCC approach to AI strikes me as a "peril of not paying attention" to a technology far more influential than our writing classrooms and centers. Isabella Buck, in her keynote speech at the 2024 Conference of The European Writing Centers Association, noted how AI creates new content, for good or ill; our prior networked technologies merely shared existing information. Dr. Buck called for us to "future proof" our centers.
I began exploring new tech in the 90s; Cindy and her partner Dickie were mentors to me at a critical point in my development as a teacher and writer. I learned from them to test technologies warily, sometimes playfully too, before bringing them into the classroom.
That spirit of wary, serious play is sorely lacking from the 2025 CCCC leadership's call to refuse AI.
References:
Essid, Joe (2023). "Writing Centers & the Dark Warehouse University: Generative AI, Three Human Advantages," Interdisciplinary Journal of Leadership Studies: Vol. 2, Article 3.
Available at: https://scholarship.richmond.edu/ijls/vol2/iss2/3
Selfe, Cindy (2008). “Technology and Literacy: A Story about the Perils of Not Paying Attention.” Eds. Michelle Sidler, Richard Morris, and Elizabeth Overman Smith. Computers in the Composition Classroom: A Critical Source Book. Boston: Bedford/St. Martin’s, 93-115.
image source: Creative-Commons image from freepix