Writing with ChatGPT
Let's talk about games.
Do you know the difference between purposes and goals?
The goal of a game is the target we aim at during the game... Our purpose with a game, on the other hand, is our reason for playing the game in the first place.
--C. Thi Nguyen
So we play basketball for a variety of purposes: to be with friends, to exercise, to compete, to have fun, to hone our skills chasing excellence, to humiliate the playground bully…
But there’s just one goal in basketball: to outscore your opponent.
We bring our own purposes to the game. But the game supplies us with one or more goals, and challenges us to pursue them within the constraints of the rules.
Basketball is a well-designed game, so going for the goal tends to fulfill the purposes. You have fun and exercise by trying to outscore your opponent.
But when we confuse our purposes and our goals, we can lose sight of what we’re doing.
I might have more fun or humiliate the playground bully more by pantsing him. But arguably, that wouldn’t count as playing basketball anymore. As we lose sight of the goal of the game, we can start to question whether we’re still hooping or just schmooping now.
Now that we get the difference between purposes and goals:
Let’s think about writing with ChatGPT as a possible move in the game of philosophy.
A lot of professors worry that students are using ChatGPT to write increasingly good essays for them: paste in the essay prompt, hit enter, and thereby avoid having to learn anything or develop the skills of self-reflection and self-articulation they need to become fully actualized adults.
These professors are getting something right.
The two main purposes of assigning academic papers are to build student understanding and skills, and to assess where students are. And it’s true—plagiarizing basically short-circuits all of this. It’s like bringing a stepladder to a basketball game. The stepladder is illegal precisely because its presence would greatly undercut our deepest purposes in playing basketball, most notably, deploying and enjoying skill.
So we give students the goal of writing ‘original’ work, which is defined as whatever plagiarism isn’t. That means we need to teach them what plagiarism is, and we try. We show students how to quote and cite the substantial contributions of others to their work.
There’s a lot to be said about how plagiarism harms the plagiarized and breaks trust. But for now, I want to emphasize that plagiarism undercuts the purposes of assigning academic writing by bypassing the relevant skills and understandings. To varying degrees, plagiarizers just find an answer and turn it in without actually practicing philosophy. Worst of all, the plagiarizer tries to pass off their submission as their own and make it seem like they did the philosophical work. (That's a clear form of proceeding without appropriate concern for truth, i.e., bullshitting.)
But my primary goal as a college instructor is not really to teach my students how to avoid plagiarizing. I’m here to teach them philosophy. And the purpose of teaching them philosophy is to get them to think for themselves.
With ChatGPT-4, I think we’re already at a point where what that means has shifted.
What does it mean to teach students to think for themselves in a world where they can produce increasingly sophisticated texts with a few lines of input? I don't think anyone knows yet.
ChatGPT is the biggest technological advance in writing since the word processor, because the gap between initial idea and first draft has shrunk to seconds. You can try weird things out and see where they go. And you can begin critically rewriting much sooner, first in dialogue with ChatGPT, and eventually (the ultimate betrayal) on your own.
It's gotten way easier to produce texts. Now what?
This undeniably changes our relationship with the texts we produce, even those we write from scratch, in ways I need to think more about.
But why assume that ChatGPT compromises students’ ability to think for themselves, or keeps them from developing their skills of self-reflection and self-articulation?
Why can’t students learn to think for themselves in dialogue with ChatGPT?
After all, it can be tricky to coax exactly what you want out of ChatGPT.
Socrates ran into that problem, too. His interlocutors usually fail to answer his questions to his satisfaction, and exit the dialogues perturbed and upset with him. Being spurred on by a conversation partner to think for yourself can be difficult and uncomfortable.
Teaching students to write from scratch is a useful skill, but I’m here to teach philosophy, not composition. (And why shouldn’t composition students try writing with ChatGPT too? I’m not saying it will give better or more interesting results. But why bar experimentation?)
By teaching students to write with ChatGPT, we can go some way towards reviving the ancient practice of philosophical dialogue—the iconic form of Western philosophy we still revisit thousands of years later. Academic journals today are full of monologues and they’re mostly boring. But entering into philosophical dialogue with a sophisticated chatbot can bring us right into the practice of negotiating conflicting values that philosophy basically is.
If I didn’t have an opponent, I’d have to invent my own, at least to personify the other half of my own divided common sense. That’s why Derek Parfit invents the Self-interest Theorist on page 3 of Reasons and Persons…
Okay, before I get derailed, here’s the takeaway:
Maybe teaching students to write freehand wasn't our deepest purpose in assigning academic papers. Maybe at the college level, that's like insisting that students do long division by hand instead of using calculators. You should know how to do both, but calculators are really useful tools. And ChatGPT is, too. So students should learn how to use both.
If you want, you can assign written in-class exams too, and try to keep students off their phones or Google Glass. (Is that still a thing?) But maybe students can also learn to think for themselves while writing with ChatGPT.
They might have to, just to keep up.