Source of book: Borrowed from the
library
I discovered John Warner (who has a column with the Chicago
Tribune as well as an excellent blog) through Peter Greene, who writes about
education and education policy. You should check out Warner’s blog as well, which
covers a lot of territory from a thoughtful, progressive viewpoint.
Warner is a busy man, teaching creative writing at the College of Charleston and editing the delightful internet humor site McSweeney’s Internet Tendency.
More Than Words is all
about writing and AI, and is as good as anything I have read. Warner keeps an
even tone, a fair and open mind, but also refuses to back down on his core
points.
I personally believe that AI is a huge bubble right now, and will crash and burn pretty spectacularly before too much longer. It is mostly hype - as bubbles are - and it makes wild promises that it cannot deliver and will never deliver in its current form.
It’s not a matter of adding
ever-increasing processing power. The whole premise that LLMs are
“intelligence” in any meaningful sense is just straight up bullshit.
For Warner, who teaches writing,
anything that an AI can do isn’t actual writing - it’s an automation, just like
a robot assembling widgets.
Writing requires thinking - a distinctly human process - not mere grammatical assembly. AI doesn’t think - it literally has no idea of meaning. It instead predicts how actual humans would string together words. Which is not only why it is prone to “hallucinations” - GIGO applies here - but also why it cannot actually evaluate truth at all. It has no concept of truth, because it does not understand what it is saying.
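To make that point concrete, here is a deliberately tiny sketch of my own - nothing from Warner’s book, and nothing like the architecture of a real LLM - of what “predicting how humans string words together” looks like at toy scale. The corpus and the word-picking scheme are made-up assumptions for illustration only. A simple bigram chain will happily “write” that the moon is made of cheese, because the frequency of word pairings, not truth, is all it has to go on.

```python
# Toy illustration only: a bigram "next word" predictor over a made-up corpus.
# Real LLMs are vastly more sophisticated, but like this toy, they generate
# text from statistical patterns with no internal notion of truth.
import random
from collections import defaultdict

# Hypothetical training text (deliberately contains a falsehood).
corpus = (
    "the moon is made of rock . "
    "the moon is made of cheese . "
    "the moon orbits the earth ."
).split()

# Count which words follow which.
follows = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current].append(nxt)

def generate(start: str, length: int = 8) -> str:
    """Chain together plausible-looking next words; truth never enters into it."""
    words = [start]
    for _ in range(length):
        options = follows.get(words[-1])
        if not options:
            break
        words.append(random.choice(options))
    return " ".join(words)

print(generate("the"))  # e.g. "the moon is made of cheese ." - fluent syntax, no judgment
```

Scale that basic idea up by many orders of magnitude and you have the gist of the problem Warner is pointing at: fluent output, zero capacity to judge whether any of it is true.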
Warner correctly notes that “AI”
is false advertising. It isn’t “intelligence” at all - it is automation.
That can’t be fixed by more
computing power. It would require a completely different approach and level of
technology that isn’t even being pursued right now.
In case it wasn’t obvious, this
blog is not written by AI. It is written by me, the Autodidact, personally. I
also do not use AI to summarize the books I read - I personally read them, take
notes, and write about my thoughts.
Yes, I use the internet to look
stuff up - I try to support things with evidence when needed. But that is a
tool, not the source of my writing itself.
Like Warner, I believe that the
fundamental purpose of writing - indeed language itself - is to communicate
with other humans. Sure, language is imperfect, and communication can never be
complete. But the very process of communication is key to human society,
empathy, and so much else.
For Warner, this goes both ways. Students should not be expected to take things more seriously than teachers do. If a teacher is going to use a computer program to evaluate a paper, for example, the student will correctly understand that using a computer program to write the paper is the same thing. It’s just machines talking to machines at that point. Which is why, in my own Wills and Trusts class, I tell my students that I will personally read everything, so they should write with the intent of communicating with me. I will respect them enough to read what they write.
This leads into the issue of what
“writing” actually matters. Too much of writing (and I include the law school
essays we all have to assign in preparation for the bar exam) is fake. It is
more about reproducing a format than actually thinking. Which is why AI can
actually do it. The same applies to many school assignments, which are a
temptation for cheating because they are essentially meaningless.
My students had been incentivized not to write but instead to
produce writing-related simulations, formulaic responses for the purpose of
passing standardized assessments. This happens not because teachers are bad or
students lack ability but because these simulations have been privileged in a
system where “schooling” is divorced from “learning.”
I could not agree more. Warner
goes on to note that what he can learn from AI is which assignments actually
mean something. If AI can do it, it is meaningless.
[Note here: I want to call out my
longtime musical colleague and my kids’ history teacher Ernie for his approach
to essays - students pick the topic themselves from the study topics of that
month, and write what they want about it. My kids confirm that they learned so
much about actual writing and thinking from the freedom - and the feedback from
their teacher.]
In my ongoing quest to make the experience of writing
meaningful for students, for teachers, for those at work, and for those at
play, I see ChatGPT as an ally. If ChatGPT can do something, then that thing
probably doesn’t need to be done by a human being. It quite possibly doesn’t
need to be done period.
The challenge is to figure out where humans are
necessary.
What writing should be is an
expression of our humanity, not mechanical assembly of words.
It is frankly bizarre to me that many people find the
outsourcing of their own humanity to AI attractive. It is akin to promising to
automate our most intimate and meaningful experiences, like outsourcing the
love you have for your family because going through the hassle of the times
your loved ones try your spirit isn’t worth the trouble.
And further:
Generating syntax is not the same thing as writing. Writing
is an embodied act of thinking and feeling. Writing is communicating with
intention.
This next passage captures a lot of my own experience blogging - having written more than 1,500 posts over a 15-year period - where my thinking is transformed by the acts of reading and writing.
Writing is thinking. Writing involves both the
expression and exploration of an idea, meaning that even as we’re trying to
capture the idea on the page, the idea may change based on our attempts to
capture it. Removing thinking from writing renders an act not writing.
Writing is also feeling, a way for us to be invested and
involved not only in our own lives but the lives of others and the world around
us.
Reading and writing are inextricable, and outsourcing our
reading to AI is essentially a choice to give up on being human.
Warner does an excellent job of
debunking the hype about what AI is. It does not think. It does not evaluate.
It does not consider. It has no memory. It has no intention. It is automation,
nothing more, and nothing less.
Large language models do not “write.” They generate syntax.
They do not think, feel, or experience anything. They are fundamentally
incapable of judging truth, accuracy, or veracity. Any actions that look like
the exercise of judgment are illusory. While the term hallucination has
come to mean outputs from LLMs that are incorrect or untrue, it is arguably
more accurate to say that from the point of view of the LLM, everything
is a hallucination, as it has no reference points from which to judge its own
production. ChatGPT is fundamentally a “bullshitter” as defined by Harry Frankfurt in his classic treatise on the term (On Bullshit), something “unconnected to concern for the truth.” It’s not
that ChatGPT makes stuff up. It has no capacity for discerning something true
from something not true. Truth is irrelevant to its operations.
Totally recommend reading the Frankfurt book, by the way.
One of the weirdest passages in
the book looks at the parallel between people who consult psychics and those
who consult AI. Both require a belief in the underlying illusion, that psychics
really can see the future, and that AI is intelligent.
Neither is true. Rather, as Warner
suggests, “The intelligence illusion is in the mind of the user and not in the
LLM itself.”
Having established that AI is just another automation, Warner cites Emily Bender (a computational linguist), who notes that “AI” is a misnomer - that we should use the correct description, “automation,” and ask the hard questions: what is being automated, why, and who benefits. And also whether it actually does the job expected (usually no), who is harmed, who is legally and financially responsible for the harm, and how we will regulate it.
These are the real questions we
need to be asking. And also how to mitigate the environmental destruction AI is
causing through its ludicrously high consumption of water and power.
Throughout the book, Warner talks
about his own experiences, and he does tell a good story. One that I
particularly loved was his description of kindergarten, which largely matches
my own.
Thanks to my ability to get through a Dr. Seuss book on my
own, I started kindergarten with my age cohort, knowing my ABCs and even my
XYZs upon entry, while struggling mightily to learn how to tie my shoes and zip
my coat, facts made apparent by being the last to receive his gold stars on the
class accomplishment poster board kept by my teacher.
Did anyone else have that poster
board? Yep, I struggled with physical coordination, yet I was reading at the
chapter book level by first grade. I was also the shortest kid in my class, and
it wasn’t close. Sigh.
Part of the point of the story is
that Warner is not opposed to automation or technology in writing, per se. Like
me, he hated cursive, and struggled with it. Discovering typing was a game
changer for both of us.
For the first time, I experienced what it was like to capture
my thoughts at close to the speed in which they occurred.
YES!!
I’m also on board with Warner’s evaluation of cursive, which I haven’t used since Jr. High. (I type for work daily, though…)
Those who argue that cursive is a route to teaching fine motor skills - not for me, but okay - don’t similarly argue for, in [Anne] Trubek’s words, “more useful” skills “such as cooking, sewing, and carpentry.”
The calls for the return to cursive appear to be wrapped up more in a kind of
cultural anxiety, weirdly attached to a feeling of tradition-rooted patriotism
more than any practical, demonstrable benefit to students. One of the common
laments of the pro-cursive crowd is that students can no longer read the
Declaration of Independence in its original documentation, suggesting the power
of the document is in the penmanship rather than the ideas.
So, the problem isn’t automation -
Warner also notes spell check and the delete key in a word processor as key to
his writing process - but the automation of the thinking needed for
writing.
We tend to think of writing as the act of assembling words,
but it’s a deeper experience than this. Words may be symbols, but they are not
abstractions; they are the method by which we express our ideas. Lots of the
writing students produce in school contexts is untethered from ideas, which is
one of the reasons writing in school has become so alienating. Without an
underlying idea, the words have no importance and very little genuine
meaning.
In my own writing, I find Warner’s description of ideas and thoughts coming long before words and sentences to be accurate. Each of my posts starts there, before the words go on the page - er, screen.
I have yet to meet a writer who thinks in sentences. First,
there is thought - be that an image, an idea, a notion, or whatever - and only
then are there words. Often in writing, the final specifics of the words used
to express the ideas and capture the thinking are the last part of the
process.
The chapter on writing as feeling
is particularly excellent. I am an emotional person, as I have increasingly
come to understand as I have grown older, and a lot of my writing isn’t
primarily about intellect, but about processing my emotions, putting down in
words my experience of being human.
Warner recounts the scandal around
the AI condolence statement put out after a school mass shooting. As he
correctly notes, our focus on “thoughts and prayers” rather than substantive
responses leads to a situation where boilerplate is all that can be said.
Maybe because outsourcing expression following tragedy to
tools of automation is the kind of thing that happens in a faceless dystopia.
I also have to talk about the
chapter on writing as a practice, because of a great story. Warner signed up
for Hello Fresh at one point, thinking that it would teach him to cook.
It didn’t.
He soon found out that it was a “meal prep” service, but that the art (and practice) of cooking can’t be put into simple instructions. It takes time, practice, and “feel.” I’m a pretty decent
cook, because I started learning as a little kid and cook regularly. This
constant practice over years has given me a comfort level in a kitchen - or on
the trail - with the art of making delicious food.
Ditto for writing.
The best line in the story is, “I
am half-convinced that there is some kind of cooking industry-wide conspiracy
about how long it really takes to brown onions because not once in my
life has it happened according to the prescribed time.”
Warner is correct. Nearly all
cookbooks are bullshit about this. It legitimately takes 45 minutes to properly
brown onions. The two honest writers are Jeff Smith (The Frugal Gourmet) and
Julia Child. That’s literally the list. Plan accordingly. It is worth it for
that sweet stickiness of properly caramelized onions. Trust me on this.
Also great in this chapter are the
takedowns of two cultural myths. The first is the “10,000 hour rule.” As much
as I love Malcolm Gladwell, I agree that this is a myth. The number of hours
isn’t nearly as important as how you spend them. As a violinist, I have put in
those hours. Sometimes they were productive, other times not. Learning how to
be productive is also a practice and an art, which is why a good teacher is so
necessary.
The other is “Grit.” All my kids had to read this, and they found it tedious. Warner notes that in many cases, “grit” can cause you to waste time on something you hate rather than following the better path for your talents.
His analysis of the problem is
interesting.
The 10,000 Hour Rule and Duckworth’s grit theory are manifestations of a particularly American attitude toward self-improvement that a better life is right around the corner if you can simply identify and embrace “one true thing.”
Warner applies this to educational fads - which are definitely a thing. Because there is not, in fact, “one true thing” that solves problems.
I am reminded of one of Bill Gothard’s false teachings here. After starting with pop-psych “self-acceptance” that really wasn’t that at all, and going through the core of his system, which was authoritarianism of parents over children and the powerful over the weak, he ended with his principle of “success.” His “one true thing.”
What was it? Well, just apply his
method of meditating on scripture and God will make you a success in everything
you do.
Yep, a lazy proof-text, a “one
true thing,” and really utter bullshit. There is nothing about contemplating an
ancient holy book that is magic and leads to success. You still need to get off
your ass, learn useful knowledge and skills, and do the work. This is why too
many of the “graduates” from Gothard’s system have zero employable skills, zero
social skills, and zero ability to function in an actual human society. (And
the ones that did acquire those skills did so in spite of Gothard’s
useless curriculum, not because of it.)
Warner closes the chapter with a
solid argument that it isn’t genius that matters - it is skills acquired
through practice - in his case, his ability to write by thinking and expressing
those thoughts in words. I resonated with his description of himself too.
I know that in terms of intellectual firepower, I’m
reasonably armed, but not tremendously gifted. In my various travels, I have
intersected with genuinely brilliant and uncommonly creative people, and I know
I am not them…I am, happily, entirely ordinary in just about every way.
But I have my writing practice, and that matters.
Yep, that’s me. I’m pretty
ordinary, no genius by any definition. Reasonably armed is all, with the
practice of using words to communicate.
The chapter on the problems with
how we teach reading and writing at the primary school level is good as well. I
too have been frustrated with how little my kids have been expected to read.
It’s almost all excerpts, not whole books. My kids will be fine - they have
been readers since they were young, and devour books. But I do not think this
focus on “teaching to the test” is a good idea.
This kind of relationship to reading is unfortunately foreign
to increasing numbers of young people who have been subjected to a school
curriculum in which they are primarily exposed to short texts or excerpts of
longer ones and then asked the kind of surface-level questions that are
appropriate to multiple choice standardized assessments. Deep reading is
largely absent from the student reading diet because it is harder to assess
against the standards that have come to dominate the curriculum.
Another chapter is on the endless
attempts (dating back a surprisingly long time) to replace teachers with
machines. And yes, B. F. Skinner is mentioned. (I found his utopian
novel to be fascinating, but not a little creepy.)
In the 1950s, B. F. Skinner, the godfather of behaviorism,
was similarly obsessed with the creation of a teaching machine, convinced that
children could better learn if they were simply treated like the pigeons he had
used to test his theories on the importance of immediate feedback and
reward…Despite decades of attempts, Skinner’s machine never caught on. Skinner
blamed schools, teachers, even manufacturers for this failure, never
considering that perhaps children are not the same as pigeons.
A perhaps related concept is the
way that Skinner’s ideas were borrowed by Religious
Authoritarian Parenting gurus, with similar failures to accomplish the
goal. Children are not pigeons. Humans learn socially, not just by instruction.
And teaching is a process of adapting to the individual students and their
learning styles and needs.
Warner makes another good point,
even more relevant in an era when teachers are increasingly devalued by the
American Right:
It is not coincidental that teaching was (and still is) a
female-dominated profession, while the engineering boom of the 1950s and 1960s
was almost exclusively the province of men. This disrespect for teaching rooted
in mid-twentieth-century sexism continues to be manifested today as teachers
are subjected to an ever-changing list of demands without being given the time
and resources necessary to do the job.
But clearly, AI designed by
misogynistic tech-bros can replace those expendable female teachers,
right?
Warner goes further when it comes
to teaching and education. The problem is long-standing, and it is a
misunderstanding - often willful - of the purpose of education. Like so many
horrible things, this one dates to the Reagan administration and a report on
education.
The report established an ethos suggesting the underlying
purpose of an education is to secure material advantage in the competition
against others, be they individuals in the marketplace or foreign nations on
the world stage. The dominant purpose of school would be to rank and sort
students against standards and one another. These rankings would be used to
determine not only which students were worthy but which schools and teachers
were operating effectively as well.
This has led to endless testing
and standards and paperwork and teaching-to-the-test. Warner notes Campbell’s
Law: when a quantitative measure is used for social decision-making, it will
itself distort and corrupt the processes it is intended to monitor. The testing
ruins the teaching.
Campbell’s law manifests itself in schools through the use
and abuse of standardized tests, where the scores on those tests come to stand
in for learning, no matter what methods have been deployed in the service of
raising those scores. Rather than being a tool to gauge students’ cognitive
abilities, tests have become an exercise in seeing how well you do on the test.
By the way, I am saying this as
someone who is pretty good at taking tests. It isn’t the same as knowing things
- which I also aim for, of course.
Warner returns to how this fits
with reading and writing.
Unlike the featureless texts that ChatGPT churns out, human
writing is spiky, weird, and messy. This is particularly true when we are in
the midst of trying to figure stuff out through writing, which is always going
to be the case with students. If I wanted my students to become confident
writers, I had to let them write, and if I was going to let them write, I had
to value something other than the ability to BS proficiently.
As I noted at the beginning,
Warner isn’t a reactionary. He consciously avoids the “kids can’t read these
days” narrative, for example. He also tries not to get too involved in the
specifics of teaching techniques. He trusts teachers.
I am on his side with the so-called “reading wars,” however. So much of the last, well, 50+ years has been spent on the phonics jihadists waging scorched-earth war on everyone else. My poor mom was disabused of the phonics-only notion early, because not only was I a quick reader, I memorized words. Sure, I can sound words out. But I didn’t always need to. (Also, I am dyslexic, and in practice read fairly fast by going with word shapes rather than sounds. It’s how I read.) As for the recent fad of the “science of reading,” which has become more of a brand name than an evidence-based approach - note the “one true thing” belief rearing its ugly head once again…
I am a conscientious objector to this war, which has taken on
a bizarre cultural-conflict flavor, where people genuinely interested in
exploring how to best help students learn to read have been infiltrated by
political forces who never miss a chance to undermine the public’s faith toward
public schools. When a group both champions the science of reading and
banning books, it seems clear they are not acting out of a passion for
phonics.
And, of course, there is a
shit-load of money to be made selling new curriculum.
These canned curriculums are extremely profitable for the
educational publishers who provide them, but one-size-fits-all mandates ignore
that different individuals learn to read differently. Yes, phonics are key for
lots of readers, but not for every reader. Some students arrive in
school already having surpassed what basic phonics instruction can do for them,
while others need to build knowledge from scratch.
Let me note here that this is one
thing I did love about being homeschooled, and why we did that for our kids
when they were young. Everyone learns differently, and school can tend to be
lock-step, particularly in those formative years when kids learn at different
rates. I’ll mention here that my brother was a delayed reader - he didn’t learn
until age 7. But these days, he is one of my sources for book recommendations
because he reads widely and thoughtfully. A regular school probably would have
labeled him as “special needs” rather than wait for his brain to develop at its
own pace.
I also have to mention the excellent chapter on “content” versus “writing.” My blog aspires to be writing, not mere content. Which is why it has to be written by a human. AI can - and increasingly does - create “content.”
One of the most immediate and potentially damaging
consequences of generative AI is its potential to drown us in content whose
only purpose is to capture clicks to generate revenue through online
advertising. If this sounds like your current experience of the internet, get
ready for it to become significantly worse.
To fight that, well, subscribe to
real writers, such as Warner. (And perhaps Yours Truly as well.)
I will mention the chapter on the challenges of compensation for writing, which Warner notes is nothing new - it actually dates back to the invention of writing. But this issue of “content” is a challenge for real writers who want to, you know, make a living and all. Warner has some genuinely good suggestions for this, and optimism that writers and readers will always be in demand. I think (and hope) he is right.
Quite a bit of fun for me was the
chapter on how AI writes. Warner asks ChatGPT to write an article on a topic
“in the style of John Warner.” The AI has plenty to work off of - Warner is a
prolific writer.
But the result is… weird. Warner analyzes why it has some of the surface characteristics of his writing but none of the substance. There are also odd errors - words he never uses, an over-emotionality - that clearly show the difference from his actual style.
As he puts it, there is a serious
“uncanny valley”
effect. I think that is absolutely correct - I find I can identify AI writing
fairly quickly, and it is for that reason. It shows human features, but is
clearly also not human.
Because thought is the most
important part of writing, Warner notes that he cannot really teach anything
meaningful to a student who does a draft using AI.
If a student comes to me with a text that has been generated
by an AI, we have nothing to talk about, because we cannot discuss what it is
they want to say, because they have yet to say anything.
Also in this chapter is a hilarious example of AI trying (and failing) to write in an author’s style. Warner quotes a brilliant description from David Foster Wallace’s tour-de-force “A Supposedly Fun Thing I’ll Never Do Again.” It is amazing, and one reason why I loved the original.
The GPT-4 imitation is so
laughably bad. Beyond bad. It is oddly flat and uninspiring. It has no life.
The words are close, in a horseshoes sort of way, but they aren’t quite right.
I won’t quote the passages, but if
you read the book, you will totally get it.
The last part of the book is about how Warner thinks we can fight back against AI and its creeping dehumanization of writing and reading. As he notes, this risks being dated, since the technology will have changed even by the time the book is published.
But actually, I think that his
prescriptions hold up well, and apply not just to AI, but to so much of what is
horrible and dehumanizing about late-stage, corporate capitalism in general. It
is all dehumanizing, the endless monetization, the homogenization, the lack of
actual human soul.
Warner notes throughout the book that the only reason AI is able to find a niche is that we have already abandoned our humanity in so many ways. He doesn’t mention it, but since he wrote the book, AI-generated songs have become hits in both Country Music and CCM. The reason these two genres are the first to go this way is that both have been formulaic for decades. The same cliches, the same sounds, the same pablum. Sure, there are gems to be found, but they are the exception.
To reclaim this, we need to focus
on our humanity, and use our imagination and ability to connect with others.
To figure this out, I realized I
had to stop thinking about AI and start thinking about humanity.
The fact is, we are embodied. We live our lives
through a series of experiences rooted in a community of fellow humans. If we
are machines, the way we are machines is not meaningful to the joys and sorrows
of what it means to exist as sentient creatures.
I’ll close with one of Warner’s
thoughts that to me seems profound. It’s not just about AI. It is about the way
too many of us outsource our humanity to others. In the context of the
Fundamentalist subculture I escaped from, it is an outsourcing of even morality
itself. But it is more than just an ethics thing; it is all about true
humanity, which cannot exist outside of community and empathy and messiness.
Warner points out that while guides can be helpful, ultimately, we all have to
do the difficult and messy work of becoming human ourselves.
It is important not to mistake a guide for an all-knowing
sage. While it is tempting to wholly outsource the difficult work of
continuously re-forming our own worldviews, letting weirdos like Joe Rogan or
Jordan Peterson, or even non-weirdos like Brené Brown, substitute for your own
judgment weighed against your values is a recipe for confusion and
disappointment.
This is the risk, not just of
turning human communication over to automation in the form of AI, but of
outsourcing the things that make us human to “experts,” be they digital or
other humans. To truly live, to truly be, to truly experience what it means to
be the social animal we call human, we have to do the messy work of continuously
adjusting, learning, growing, connecting.
Our human superpower truly is
language, and to turn that power over to a non-human automaton is to lose
something important. Warner’s book is all about that: an encouragement to
remain human, and refuse to give away what makes us what we are.